subkamran's forum posts

  • 16 results
#1 Posted by subkamran (18 posts) -

@jslack: Awesome! I sent you a DM too with the details. I'm using NoSQL storage for my app, which makes it easy to cache results from GiantBomb; that is exactly how I store cached searches (by hashing the request).
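The request-hash caching I mention can be sketched roughly like this (names and the in-memory dict are illustrative stand-ins for my actual NoSQL store, not real code from my app):

```python
import hashlib
import json

# In-memory stand-in for a NoSQL cache; the key is a hash of the request.
_cache = {}

def cache_key(url, params):
    """Derive a stable cache key by hashing the URL plus sorted params."""
    canonical = url + "?" + json.dumps(params, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def get_cached(url, params, fetch):
    """Return a cached response if present, otherwise fetch and store it."""
    key = cache_key(url, params)
    if key not in _cache:
        _cache[key] = fetch(url, params)
    return _cache[key]
```

Sorting the params before hashing means the same search hits the same cache entry regardless of parameter order.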

Great to hear there'll be enhancements soon, one of the big asks is the ability to filter by ranges or gt/lt (i.e. /games/filter=date_last_modified:(gt=xxx&lt=xxx) or something).

#2 Edited by subkamran (18 posts) -

@jslack: Thanks for the support. I recently released the new version of my site (Keep Track of My Games), which greatly reduces the number of requests I make to the GB API daily. Previously I was doing a request per game to refresh, but I've since switched to using the filter mechanism to query in batches of 100, which cut the number of requests down drastically.
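The batching idea looks something like this (a sketch, assuming the filter mechanism accepts a pipe-separated list of ids, which is how I understand it; the function names are made up for illustration):

```python
def chunked(items, size=100):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def refresh_requests(game_ids, api_key):
    """Build one filtered request per batch of 100 games
    instead of one request per game."""
    base = "https://www.giantbomb.com/api/games/"
    for batch in chunked(game_ids, 100):
        id_filter = "|".join(str(g) for g in batch)
        yield f"{base}?api_key={api_key}&format=json&filter=id:{id_filter}&limit=100"
```

So refreshing 250 cached games costs 3 requests instead of 250.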

I've also contemplated just cloning the database so I can handle all the data myself. For me this would be ideal because I've had complaints about the search (searching for "Shadow of Mordor" gives you the proper result way down in the list, with Call of Duty games showing up before it) that I think I can handle myself using my backing store. This would benefit you guys too, since right now I still have to do live searches against the API unless the search was cached. The only real downside is storage, and refreshing all of that data without accidentally DDOS'ing the API. Since I pay all the costs myself and can't make money using the API, that's a barrier, but at this point I'm willing to take the hit. I think with some clever techniques this could be done fairly efficiently without running into the rate limiting (and to make it easier on you).

If you made the data available as a dump (FTP is fine), I could have a very simple scheduled job that downloads the data and shoves it into my database. That would solve all my needs. Sunlight Labs API does this with its data and is super helpful if you don't want to do one-off queries; you can just connect to their GitHub account and download the raw dumps.
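The scheduled job really would be that simple. A sketch of what I have in mind, assuming a hypothetical JSON dump URL and a minimal local table (neither exists today; the schema here is illustrative):

```python
import json
import sqlite3
import urllib.request

def import_dump(dump_url, db_path):
    """Download a (hypothetical) JSON dump of games and upsert it locally."""
    with urllib.request.urlopen(dump_url) as resp:
        games = json.load(resp)  # expected: a list of {"id": ..., "name": ...}
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS games (id INTEGER PRIMARY KEY, name TEXT)")
    # INSERT OR REPLACE makes the job safely re-runnable on a schedule.
    con.executemany(
        "INSERT OR REPLACE INTO games (id, name) VALUES (:id, :name)", games
    )
    con.commit()
    con.close()
```

Run nightly, that replaces every per-game refresh request I currently make.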

@penguinpowered: They must have my email associated with the API key, because I have to log in for it to show me the API key. I think they could email all of us if they really wanted to... that's typical for a public API with users. I don't exactly have time to frequent the forums with a full-time job, and it falls off my mind unless I remember to check or something goes wrong. My point is, it shouldn't be reactive, it should be proactive--if there's a change that would affect people, tell them; don't let them find out when their stuff breaks :) It's just courteous.

#3 Edited by subkamran (18 posts) -

FWIW my entire app (and the new version coming out shortly) doesn't (and shouldn't) require user authentication from GiantBomb--that would essentially kill my app. It's based around just being able to search and track games, and I use GB behind the scenes to sync and serve up the games. I need to use the API key on the server to do all that, and I don't want to require people to have a GB account.

I would also appreciate communication from GB when things like rate limiting go into effect. I just came here looking for something else and saw this thread--how would I normally know about this until it affected my users? 400 requests every 15 minutes might sound generous, but it isn't when I need to refresh my game cache each night. Now, I can improve the way I do that (i.e. batch it, don't refresh really old games as often, etc.), but I would have liked a warning that you guys intended to limit, so I had time to do that.
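Staying under a cap like 400 requests per 15 minutes mostly comes down to pacing the refresh job. A minimal sketch of what I mean (the class and numbers are illustrative, not actual GB tooling):

```python
import time

class RateBudget:
    """Pace requests to stay under a cap, e.g. 400 per 15-minute window."""

    def __init__(self, max_requests=400, window_seconds=15 * 60):
        # Spread the budget evenly: 900s / 400 = 2.25s between calls.
        self.min_interval = window_seconds / max_requests
        self.last = 0.0

    def wait(self):
        """Sleep just long enough that calls never exceed the cap."""
        now = time.monotonic()
        delay = self.min_interval - (now - self.last)
        if delay > 0:
            time.sleep(delay)
        self.last = time.monotonic()
```

Calling `budget.wait()` before each API request keeps a long-running nightly refresh under the limit without any bookkeeping of the server's counters.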

I would also pay money for a stable API and better support; I understand you can't use the API for commercial purposes, but I'd be willing to pay a subscription fee if it meant increased caps, usage reports, support, and a stable API. There's no other API as fully featured as GB's, but it has been a real cluster these past few years I've used it. Honestly, I would prefer a raw data dump of your entire database and would pay $30/mo to access it. I'd just use my job to read it, cache it locally in my database, and handle all the other stuff myself.

#4 Edited by subkamran (18 posts) -

Far Cry 4's image(s) are broken right now as well:

http://static.giantbomb.com/uploads/scale_small/0/3699/2633124-far+cry+4.png

It should be .jpg (that works). It's borked on the game page itself too:

http://www.giantbomb.com/far-cry-4/3030-46310/

#5 Posted by subkamran (18 posts) -

I mean the entire site search is also borked, so it's not just an API issue. You'd think you'd roll back to a stable version while this got fixed? It's been 16+ days since this was reported. A lot of people depend on this API (including my site). I'd be willing to pay for access if it meant stability and regular updates :)

#6 Posted by subkamran (18 posts) -

I was just hoping changing the protocol would work.

This essentially precludes my site from working under SSL, since browsers won't display the Giantbomb images I link to. That leaves me two choices: implement a workaround so I don't have to use SSL, or cache all the images locally (ugh).
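The "cache locally" option would mean mirroring each image onto my own domain (which I can serve over SSL) and linking to that copy instead. A rough sketch of the idea, with hypothetical names and a plain filesystem cache:

```python
import hashlib
import os
import urllib.request

CACHE_DIR = "image_cache"  # served from my own SSL domain, not GB's CDN

def local_image_path(image_url):
    """Map a Giant Bomb image URL to a stable local filename."""
    name = hashlib.sha1(image_url.encode("utf-8")).hexdigest()
    ext = os.path.splitext(image_url)[1] or ".jpg"
    return os.path.join(CACHE_DIR, name + ext)

def cached_image(image_url):
    """Download the image once over plain HTTP, then reuse the local copy."""
    path = local_image_path(image_url)
    if not os.path.exists(path):
        os.makedirs(CACHE_DIR, exist_ok=True)
        urllib.request.urlretrieve(image_url, path)
    return path
```

It works, but it turns a hotlinking problem into a storage-and-refresh problem, which is why I'd rather GB just serve images with a valid cert.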

On my end I think I can work around having to use SSL for the game pages, but it would be nice not to have to. It's been 6 months--have you guys re-considered?

#7 Posted by subkamran (18 posts) -

Why do people keep reverting the release to 2013? Clearly the planned release is for May 14, which I submitted but it's been reverted again.

#8 Edited by subkamran (18 posts) -

Hey guys,

Is there a way to craft an HTTPS URL for GB images returned from the API? I noticed if I try this:

https://static.giantbomb.com/uploads/scale_small/8/87790/2293393-box_re6.png

It actually has a bad certificate; i.e. Akamai is serving the image but the cert is for GB, so browsers deny it unless you add an exception.

This is sort of a big deal if you intend to serve your site over SSL.

Any ideas?

#9 Posted by subkamran (18 posts) -

I would love raw dumps... then I could just cache everything locally.

#10 Edited by subkamran (18 posts) -

@frobie, is it by design that platforms are wrapped in a container in search results? Previously it was simply an array of platform objects; now it's an array of containers, each with a "platform" property that points to a platform. This is messing with my deserialization strategy :( I just noticed it recently. Was there a reason to change it? It's inconsistent now: the game detail resource still behaves correctly, where "platforms" is an array of platform objects.
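For now I'm normalizing both shapes on my end. Something like this (a sketch; the field values are illustrative, not real API output):

```python
def extract_platforms(game):
    """Normalize 'platforms' across the two shapes the API returns:

    game detail:  [{"id": 146, "name": "PC"}, ...]
    search (new): [{"platform": {"id": 146, "name": "PC"}}, ...]
    """
    raw = game.get("platforms") or []
    # Unwrap the container when present; pass plain platform objects through.
    return [entry["platform"] if "platform" in entry else entry for entry in raw]
```

That way my deserialization doesn't care which resource the data came from, but I'd still prefer the API to pick one shape.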

Another question: how come all the images are so big? Small is still pretty big (601px wide) and Medium is huge (902px wide). Fun fact: if I change the URL for a Medium image to use "scale_tiny" it's a more palatable size (300x320). However, "tiny_url" returns a square cropped image, not "scale_tiny" as I'd expect. I'm using The Last of Us as a reference.
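The URL trick I'm describing is just swapping the size segment of the path. A sketch of how I do it (the function is mine, and it assumes the size segment always looks like `scale_*` or `square_*`, which is an observation, not documented behavior):

```python
import re

def rescale(image_url, scale="scale_tiny"):
    """Swap the size segment in an image URL, e.g. scale_medium -> scale_tiny."""
    return re.sub(r"/(scale|square)_[a-z]+/", f"/{scale}/", image_url, count=1)
```

It works today, but since it isn't an advertised part of the API, it could break whenever the image paths change.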
