Caching Data


#1  Edited By hexxagonal

Does anyone know if there are any rules on caching the data retrieved from Giant Bomb? I was thinking of implementing a cache of a few hours on data so I don't have to pull it from the web service every time.
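
For what it's worth, the kind of thing I had in mind is a plain time-based cache, something like this rough Python sketch (the few-hour TTL is just a placeholder, not anything the API requires):

```python
import time

class TimedCache:
    """Very small in-memory cache with a fixed time-to-live."""

    def __init__(self, ttl_seconds=3 * 60 * 60):  # "a few hours" as a placeholder
        self.ttl = ttl_seconds
        self._store = {}  # key -> (timestamp, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        stored_at, value = entry
        if time.time() - stored_at > self.ttl:
            # Entry is stale; drop it so the caller re-fetches from the API.
            del self._store[key]
            return None
        return value

    def put(self, key, value):
        self._store[key] = (time.time(), value)
```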


#2  Edited By Washclof

As far as I know, there are no hard-set rules (available to the public) about the number of requests you can send per hour and such.

That said, caching is always nice, especially if the information on the page will not change very often, so I would highly recommend it. Another small thing that may help lower the load on the servers when pulling a very large page is using the field_list filter and limiting the response to just the information you need (see the sketch below). I use this on a few things and it also helps the load time on requests.
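
For example, a field_list request could look roughly like this Python sketch (the endpoint, fields, and API key here are only illustrative placeholders, so check the API docs for the real details):

```python
import requests

API_KEY = "your_api_key_here"  # placeholder, use your own key

# Ask only for the fields we actually need instead of the full record.
params = {
    "api_key": API_KEY,
    "format": "json",
    "field_list": "id,name,deck",  # comma-separated list of wanted fields
}

# Resource path shown for illustration; see the API docs for the real endpoints.
response = requests.get("https://www.giantbomb.com/api/games/", params=params)
data = response.json()
```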


#3  Edited By LordAndrew

Caching for a few hours seems pretty short. Not much is likely to change in that time. You'd probably want to cache data for longer than that, unless you know it changes often and will be used often.

And maybe implement some sort of auto-adjusting cache, so that items that are updated more often are re-retrieved more often, while stuff that rarely changes is retrieved more rarely? Hm. I'd better write that down. It would be more difficult to implement, but it would certainly provide some additional benefit. A rough sketch of the idea is below.
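
Something along these lines in Python, with the TTL bounds and the doubling/halving factors picked arbitrarily:

```python
import time

class AdaptiveCache:
    """Cache whose per-item TTL grows while an item keeps coming back unchanged
    and shrinks again when a refresh shows the item actually changed."""

    MIN_TTL = 60 * 60            # one hour
    MAX_TTL = 7 * 24 * 60 * 60   # one week

    def __init__(self):
        self._store = {}  # key -> (timestamp, ttl, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        stored_at, ttl, value = entry
        if time.time() - stored_at > ttl:
            return None  # stale; caller should re-fetch and call put()
        return value

    def put(self, key, value):
        old = self._store.get(key)
        if old is None:
            ttl = self.MIN_TTL
        else:
            _, old_ttl, old_value = old
            if value == old_value:
                # Unchanged since the last fetch: back off and check less often.
                ttl = min(old_ttl * 2, self.MAX_TTL)
            else:
                # It did change: check more often for a while.
                ttl = max(old_ttl // 2, self.MIN_TTL)
        self._store[key] = (time.time(), ttl, value)
```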