I just read the post by @edgework on rate limiting that indicates some more lenient limits are now in place:
"There's a limit of 450 requests within a 15 minute window."
That covers most of my use cases quite well, and once there's a caching layer in place it shouldn't be an issue at all. I do have a couple of specific scenarios, though, that require me to periodically (daily? weekly? haven't decided yet) pull large dumps of data from the API. For instance, I want to periodically grab a list of all the game names in the Giant Bomb wiki. A quick hit of the /games API with no filters indicates that there are currently 48,552 games in the database. So with a maximum of 100 results per response and 450 requests per 15 minutes, that means:
- Fetching all 48,552 games at 100 per request takes 486 requests (48,552 / 100, rounded up). A single 15-minute window allows 450 requests, so the full fetch spills into a second window: roughly 30 minutes in the worst case.
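For reference, here's the arithmetic as a quick sketch, assuming the 450-requests-per-15-minutes limit and the 100-results-per-request cap quoted above:

```python
import math

REQUESTS_PER_WINDOW = 450   # rate limit quoted in the post
WINDOW_MINUTES = 15
PAGE_SIZE = 100             # max results per request
TOTAL_GAMES = 48_552        # count reported by the /games endpoint

# Number of paginated requests needed to cover every game.
requests_needed = math.ceil(TOTAL_GAMES / PAGE_SIZE)            # 486

# Number of 15-minute windows those requests span.
windows_needed = math.ceil(requests_needed / REQUESTS_PER_WINDOW)  # 2

# Worst-case wall-clock time if each window must fully elapse.
worst_case_minutes = windows_needed * WINDOW_MINUTES            # 30

print(requests_needed, windows_needed, worst_case_minutes)      # 486 2 30
```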
Am I missing something? Is there a non-abusive way to hit the API for a complete list of games without making that many requests spread out over that much time?
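In case it helps frame the question, this is the kind of throttled pagination loop I have in mind. This is only a sketch: the endpoint URL and the query parameters (api_key, format, field_list, limit, offset) are my assumptions about the API, and the loop simply sleeps out a full window after every 450 requests rather than doing anything clever:

```python
import json
import time
import urllib.parse
import urllib.request

API_ROOT = "https://www.giantbomb.com/api/games/"  # assumed endpoint
PAGE_SIZE = 100
REQUESTS_PER_WINDOW = 450
WINDOW_SECONDS = 15 * 60


def page_offsets(total, page_size=PAGE_SIZE):
    """Offsets for each paginated request covering `total` records."""
    return list(range(0, total, page_size))


def fetch_all_names(api_key, total):
    """Pull every game name, sleeping out the rate-limit window as needed."""
    names = []
    for i, offset in enumerate(page_offsets(total)):
        if i and i % REQUESTS_PER_WINDOW == 0:
            time.sleep(WINDOW_SECONDS)  # crude: wait out a full window
        params = urllib.parse.urlencode({
            "api_key": api_key,
            "format": "json",
            "field_list": "name",   # only ask for the field we want
            "limit": PAGE_SIZE,
            "offset": offset,
        })
        with urllib.request.urlopen(f"{API_ROOT}?{params}") as resp:
            data = json.load(resp)
        names.extend(g["name"] for g in data["results"])
    return names
```

With ~48,552 games that works out to 486 requests, so only the last 36 fall into a second window.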
Thanks!