caching games list locally

captainclam

Hi. I was planning on caching the id/name/image fields for the entire games list (or at least, say, the 1,000 most popular or recent titles) and then making individual requests to the API as I need additional information. I figured this way I could implement things like autocomplete without constantly spamming out API requests.

However, I'm a bit of a n00b and I'm having trouble figuring out how to query the games resource in such a way that I can get all the records. The limit seems to be 100 items per request, and although I know how to use the sort and filter params, I can't figure out how to combine them to programmatically retrieve every record.

So for example, if I get the top 100 most reviewed games like so:

giantbomb.fetch('games', {limit: 100, field_list:'id,name,deck,image', sort:'number_of_user_reviews:desc'})

How can I then get the "next 100" most reviewed games?

Trying things like {filter: 'number_of_user_reviews:10|20'} doesn't seem to work; it only returns records where number_of_user_reviews == 10.

Any advice? Not that it really matters, but I'm using Node.js and looking to cache in Redis.
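For what it's worth, here's a minimal in-memory sketch of the plan described above: cache id/name/image per game, then answer autocomplete lookups from the cache. All the names here are illustrative, and a real version would back the Map with Redis rather than process memory.

```javascript
// In-memory stand-in for the Redis cache: game id -> { id, name, image }.
const cache = new Map();

function cacheGames(games) {
  // games: array of { id, name, image } records pulled from the API.
  for (const g of games) {
    cache.set(g.id, { id: g.id, name: g.name, image: g.image });
  }
}

function autocomplete(prefix) {
  // Case-insensitive prefix match against cached names.
  const p = prefix.toLowerCase();
  return [...cache.values()]
    .filter((g) => g.name.toLowerCase().startsWith(p))
    .map((g) => g.name);
}
```

With the cache warmed from periodic bulk fetches, every keystroke of the autocomplete hits local storage instead of the API, and only a click-through on a result triggers a follow-up request for the full record.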

#2 chaser324 (Moderator)

I'm pretty sure you need to use the 'offset' parameter.

I'm not sure whether it uses zero- or one-based indexing, but let's say it's zero-based for this example. Assuming you go with the max limit of 100, your first query would use offset=0 and return results 0-99. The next query would use offset=100 and return results 100-199, and so on.
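The loop described above could be sketched like this. The offset arithmetic is the real content; the fetchAll function is just a hedged illustration reusing the giantbomb.fetch wrapper from the original post (assumed here to resolve to an object with a results array).

```javascript
// Compute the offsets needed to page through `total` records
// `limit` at a time: pageOffsets(250, 100) -> [0, 100, 200].
function pageOffsets(total, limit) {
  const out = [];
  for (let offset = 0; offset < total; offset += limit) {
    out.push(offset);
  }
  return out;
}

// Illustrative only: assumes giantbomb.fetch('games', opts) returns a
// promise of { results: [...] }, as in the snippet earlier in the thread.
async function fetchAll(total, limit = 100) {
  const pages = [];
  for (const offset of pageOffsets(total, limit)) {
    pages.push(await giantbomb.fetch('games', {
      limit,
      offset,
      field_list: 'id,name,deck,image',
      sort: 'number_of_user_reviews:desc',
    }));
  }
  return pages.flatMap((p) => p.results);
}
```

Keeping the sort fixed across requests matters here: the offsets only partition the collection cleanly if every page is cut from the same ordering.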

captainclam

EXACTLY what I was looking for, thanks! Can't believe I didn't spot that earlier.