#1 Edited by Jahed (4 posts) -

I realised the API key that's given is pretty loose, as in there's no authentication behind it (domain checks, for example).

If, say, I use it in a web app to make AJAX requests, how would I prevent someone from grabbing the key from the script and (mis)using it themselves? Are there any consequences for my access to the API if such a thing happens?

Edit: Just to explain my current solution to this: I'm proxying requests through my server which adds the API key (and other common params) and does some basic referrer authentication.
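
A minimal sketch of that proxy idea in Python (the domain, key, and function names are placeholders, not Giant Bomb's actual setup; a real deployment would sit inside a web framework or behind nginx):

```python
from urllib.parse import urlencode, urlparse

# Hypothetical values: the real key stays server-side, never shipped to browsers.
API_KEY = "REDACTED"
API_BASE = "https://www.giantbomb.com/api"
ALLOWED_HOSTS = {"myapp.example.com"}

def referrer_allowed(referer_header):
    """Basic referrer check: only accept requests claiming to come from our pages."""
    if not referer_header:
        return False
    return urlparse(referer_header).hostname in ALLOWED_HOSTS

def build_upstream_url(resource, params):
    """Append the secret api_key (and common params) before forwarding upstream."""
    query = dict(params, api_key=API_KEY, format="json")
    return f"{API_BASE}/{resource}/?{urlencode(query)}"
```

The Referer header is trivially spoofable outside a browser, so this only stops casual key scraping, not a determined abuser.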

#2 Posted by LordAndrew (14588 posts) -

Anything sent to the client can't be kept secret, period. Perform the API interaction from your server if you're concerned, and that seems to be what you're doing already.

#3 Posted by Jahed (4 posts) -

Right. I guess my real question was whether there are plans to do any authentication against the API keys. Like simply having a domain+secret combination on the server which is hashed and served as the API key, so requests can be validated against the referrer.

I don't want to keep the API key a secret, but rather, restrict its use to a domain.
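
A domain+secret scheme like the one described could be sketched as follows (the secret is hypothetical, and this is my reading of the proposal, not anything Giant Bomb implements):

```python
import hashlib
import hmac

SERVER_SECRET = b"hypothetical-server-side-secret"   # never leaves the API server

def issue_api_key(domain):
    """Derive a per-domain key: HMAC(secret, domain). Safe to expose in client JS."""
    return hmac.new(SERVER_SECRET, domain.encode(), hashlib.sha256).hexdigest()

def validate_request(api_key, referrer_domain):
    """The server recomputes the key from the Referer's domain and compares."""
    return hmac.compare_digest(api_key, issue_api_key(referrer_domain))
```

The key is public but only useful from pages on the registered domain (as reported by the browser), which is the same trust model Google Maps browser keys use.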

I was planning to make an open source web app which other people could use without having their own server. I wouldn't want their traffic to go through my server and I'd assume they wouldn't want to expose their API keys either.

But yeah, not a big deal for me. I realised I can also fix the lack of CORS headers using the server too so I don't have to rely on JSONP (which is convenient).
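
Adding the CORS header on the proxy's responses is enough to drop JSONP; a sketch (the allowed origin is a placeholder):

```python
def with_cors(headers, origin, allowed=("https://myapp.example.com",)):
    """Attach CORS headers to a proxy response so browsers can use plain
    XMLHttpRequest/fetch instead of JSONP."""
    headers = dict(headers)
    if origin in allowed:
        headers["Access-Control-Allow-Origin"] = origin
        headers["Vary"] = "Origin"   # keep caches from mixing up origins
    return headers
```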

#4 Posted by LordAndrew (14588 posts) -

I'm not aware of any plans for that. The API's been around since I think 2008, and hasn't seen much change. It would be nice though.

#5 Edited by Laika (148 posts) -
@jahed said:

Right. I guess my real question was whether there are plans to do any authentication against the API keys. Like simply having a domain+secret combination on the server which is hashed and served as the API key, so requests can be validated against the referrer.

That's not on the roadmap, to my knowledge. However, @jslack is putting some per-key and per-IP rate limiting in place to prevent API abuse, which probably makes an outright API key ban less likely.

#6 Posted by jSlack (549 posts) -

@jahed @lordandrew We put in an API limit per key. We are currently testing it (we use similar limits to the Twitter/Facebook APIs) to see if it helps prevent abuse.

The API is getting a makeover with V2. This will include new docs, and possibly some new features. More details coming soon. I've posted over on CV about it, and I'll do the same here.

#7 Posted by TheFaxman (63 posts) -

@jslack: I think this is affecting my app's users - I've had multiple reports of failures due to rate limit errors. My app makes requests for about 5-6 resources at startup to populate its UI, and it seems to hit the limit very frequently even though the usage pattern is not what I would call abusive (a dozen requests in the span of 10-15 seconds, then nothing for a few minutes or longer). This can turn into hundreds of requests per hour if people are really browsing around the app.

My app is a client application, so there are potentially many hundreds of clients making requests at any given time. This seems to mean the app is hit by rate limit failures most of the time, and I'm not sure how to address it. Help is appreciated here - I have some upset users who need their Giant Bomb fix.

#8 Edited by ObjectiveCaio (29 posts) -

@jslack: @thefaxman: Same problem here. I've been getting many crash reports and user complaints due to the app reaching the rate limit. My app doesn't even rely on requests that much. During normal use, it only makes requests when searching and browsing games that haven't been opened, and once a day it refreshes each user's list of games (usually not too many), which is when it hits the API a bit harder.

I've reduced API use for the next version as much as possible, adding some logic to searches and limiting users' ability to do manual list refreshes to one per day, but the app is just starting out, and as the userbase grows, I don't think this will make that much of a difference.

If we were able to see how hard we're hitting the API (with a dashboard on our GB profiles maybe), we'd be better equipped to find problems in our use of it.

#9 Posted by jSlack (549 posts) -

@thefaxman @objectivecaio I doubled the limit yesterday while we test this.

I've been posting over in CV a bit, but to recap: we're working on rolling out the new v2 API. In the meantime we added some basic API restrictions, because we were getting slaughtered by some apps scraping us really hard. We are working on some new features, and on decoupling the API from the main site so API requests don't affect site performance.

Part of that is per-key API limits (currently 400 requests every 15 minutes per API key), which I feel is quite generous. It's intended only to stop people who shipped apps with hard-coded API keys, sometimes doing 5,000 requests every 10 minutes.
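
A per-key sliding-window limiter matching those numbers (400 requests per 15 minutes) can be sketched as follows; this is an illustration of the general technique, not Giant Bomb's actual implementation:

```python
import time
from collections import defaultdict, deque

LIMIT = 400        # requests allowed per key...
WINDOW = 15 * 60   # ...per 15-minute window

_hits = defaultdict(deque)   # api_key -> timestamps of recent requests

def allow_request(api_key, now=None):
    """Return True if this key may make another request right now."""
    now = time.time() if now is None else now
    q = _hits[api_key]
    while q and q[0] <= now - WINDOW:   # drop timestamps older than the window
        q.popleft()
    if len(q) >= LIMIT:
        return False
    q.append(now)
    return True
```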

#10 Edited by Chaser324 (7043 posts) -

From what I've seen, that probably applies to a huge portion of the apps people have developed, and I've never read anything in the past advising developers not to hardcode API keys. Very few of them seem to be taking the route of going through http://www.giantbomb.com/boxee/ to get a user's individual key. Putting some sort of user authentication into the API itself (or at the very least making it better known that the current process of obtaining a user's API key through the Boxee page exists) would probably deter developers from hardcoding API keys.

EDIT: Actually, does the Boxee process still work? Should devs direct users to directly retrieve and enter the API key? You should provide some guidance to devs on this subject.

EDIT 2: Also, didn't you have the data to know who would be affected by this? Why not give them some prior notice to allow time to change their apps? I'm not saying you had to e-mail them directly (although you do have their e-mail addresses), but a pinned post on the API developers page would have been nice.

#11 Posted by TheFaxman (63 posts) -

@jslack: Hard-coded API key usage was the only guidance I've seen for client apps. I have no problem switching off of this, but I'd like some guidance as to what I should be moving to. Should I ask users for their API keys directly? Use the /boxee/ code method? Something else?

I need to try to get a fix out as soon as possible since my users are blocked so I need to figure out a method fairly quickly.

#12 Posted by ObjectiveCaio (29 posts) -
#13 Posted by LordAndrew (14588 posts) -

I would be fine using users' API keys if there were a developer-friendly way of obtaining them. As it stands, we have to ask the user for their login credentials, POST to the login form and hope its fields don't change unexpectedly, scrape the link code from a web page and hope its structure doesn't change unexpectedly, and then use that link code in a way that I've forgotten because it's not documented anywhere.

I could skip several steps by insisting the user obtain and provide the link code themselves, but I still don't remember what to do with the link code once I have it. There is a lot of room for improvement here.

#14 Edited by Chaser324 (7043 posts) -

@lordandrew: Totally agree. If they want to stop allowing hardcoded keys, that's totally fine, but they need/needed to give devs a heads up and provide guidance on a recommended alternative. That Boxee authentication stuff isn't documented anywhere, scraping is a bad option, and asking a user to retrieve and enter their API key isn't a great option either.

#15 Posted by KamasamaK (2489 posts) -

@chaser324 said:

asking a user to retrieve and enter their API key isn't a great option either.

I don't think it's unreasonable. It is a one-time setup step, and you can just give them the link and have them copy/paste the key. You may dissuade very lazy users, but it's their loss. Alternatively, asking for their login credentials may also scare people away.

#16 Posted by atholm (38 posts) -

@kamasamak said:

@chaser324 said:

asking a user to retrieve and enter their API key isn't a great option either.

I don't think it's unreasonable. It is a one-time setup step, and you can just give them the link and have them copy/paste the key. You may dissuade very lazy users, but it's their loss. Alternatively, asking for their login credentials may also scare people away.

Agreed. I have a simple signup at the start of my Giant Bomb app and only 10-15% of users complete it, even though they have many options available: Facebook, Twitter, Google, and simple email+password.

Like @objectivecaio, my app is basically an app version of the wiki content. I have also tried to limit my calls, and I cache a good amount of the data, but my users are also experiencing the new API restrictions. Would it be possible to create another way of generating API keys, so you have control over which apps are the heavy hitters? Maybe a form where we can all submit our apps and get a special key back.

#17 Edited by subkamran (26 posts) -

FWIW, my entire app (and the new version coming out shortly) doesn't (and shouldn't) require user authentication with Giant Bomb; that would essentially kill my app. It's based around just being able to search and track games, and I use GB behind the scenes to sync and serve up the games. I need to use the API key on the server to do all that, and I don't want to require people to have a GB account.

I would also appreciate communication from GB when things like rate limiting go into effect. I just came here looking for something else and saw this thread - how would I normally have known about this before it affected my users? 400 requests every 15 minutes might be generous in general, but it's not generous when I need to refresh my game cache each night. I can improve the way I do that (i.e. batch it, don't refresh really old games as often, etc.), but I would have liked a warning that you intended to add limits, so I'd have time to do that.

I would also pay money for a stable API and better support. I understand the API can't be used for commercial purposes, but I'd be willing to pay a subscription fee if it meant increased caps, usage reports, support, and stability. There's no other API as fully featured as GB's, but the GB API has been a real cluster these past few years I've used it. Honestly, I would prefer a raw data dump of your entire database, and I'd pay $30/mo to access it. I'd just use a scheduled job to read it, cache it locally in my database, and handle all the other stuff myself.

#18 Posted by PenguinPowered (40 posts) -

@subkamran Since they don't have a separate developer mailing list or anything, I don't know how you expect them to get in touch with you. I think it behooves the developer, when they're reliant on a particular API in this manner, to keep an eye on the forums for this kind of change.

I do, however, agree about the API. Not particularly that it's unstable, but there are some oddities that could be fixed, and it could be far easier to query. I subscribed to Giant Bomb specifically because I appreciate that they have really the only programmatically accessible game database (that I was able to find), and I would love to see it grow and improve. I expect a lot of developers (myself included) would offer up some of our own time to help improve it, if that's an option.

#19 Posted by jSlack (549 posts) -

@penguinpowered: Definitely. We've been working on rebuilding it for a while, but there are other concurrent priorities which prevent us from working on this full time.

I have the rebuild on my radar, and we're hoping to have the new version (api v2) out "soon".

@subkamran Rate limiting was specifically for defensive purposes, as the load was destroying our production databases. Beyond our normal users, we have some bots and others who attempt to DDoS the API on a regular basis, sending 20,000 requests per minute to a single endpoint.

We want everyone to be able to use the API to their heart's content, and I've been working on offloading the API onto dedicated servers that won't affect the site as much. I hear you guys, and I agree; I want to make the best tools possible so you can use them with no limits. Included in the API rewrite and re-platform is unlimited use for registered users.

I understand the frustration; hit me up directly and I'll see what I can do to help you in the meantime.

#20 Edited by subkamran (26 posts) -

@jslack: Thanks for the support. I recently released the new version of my site (Keep Track of My Games), which greatly reduces the number of requests I make to the GB API daily. Previously I was doing a request per game to refresh, but I've since switched to using the filter mechanism to query in batches of 100, which cut the number of requests down drastically.
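
That batching approach can be sketched like this, assuming the API's pipe-delimited multi-value filter syntax (`filter=id:1|2|3`); the exact syntax is my assumption from how the filter mechanism is described:

```python
def batch_filter_queries(game_ids, batch_size=100):
    """Collapse one-request-per-game refreshes into one filtered list request
    per batch of 100 games."""
    for i in range(0, len(game_ids), batch_size):
        chunk = game_ids[i:i + batch_size]
        yield "games/?filter=id:" + "|".join(map(str, chunk))
```

For a nightly refresh of 250 games, that's 3 requests instead of 250, which fits comfortably under a 400-per-15-minutes cap.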

I've also contemplated just cloning the database so I can handle all the data myself. For me this would really be ideal, because I've had complaints about the search (searching for "Shadow of Mordor" will give you the proper result way down the list, with Call of Duty games showing up before it) that I think I can fix myself using my backing store. This would benefit you guys too, since I still have to do live searches against the API unless the search was cached. The downsides are storage, and refreshing all of that without accidentally DDoSing the API. Since I pay all the costs myself and can't make money using the API, this is a barrier, but at this point I'm willing to take the hit. I think with some clever techniques this could be done fairly efficiently without running into the rate limiting (and to make it easier on you).

If you made the data available as a dump (FTP is fine), I could have a very simple scheduled job that downloads the data and shoves it into my database. That would solve all my needs. Sunlight Labs does this with its API data, and it's super helpful if you don't want to do one-off queries; you can just go to their GitHub account and download the raw dumps.

@penguinpowered: They must have my email associated with the API key, because I have to log in for it to show me the key. I think they could email all of us if they really wanted to; that's typical for a public API with registered users. I don't exactly have time to frequent the forums with a full-time job, and it falls off my mind unless I remember to check or something goes wrong. My point is, it shouldn't be reactive, it should be proactive: if there's a change that would affect people, tell them; don't let them find out when their stuff breaks :) It's just courteous.

#21 Edited by jSlack (549 posts) -

@subkamran: First, yes, we can email all users with an API key. We were hoping to have our new API changes out already, along with a release post including notes, features, and the new documentation I've been working on. Unfortunately we're pushing out other significant tech changes that must go first, so this got held back. Sorry.

Next, I'm happy to help out with your strategy, as a web and app developer myself. A good idea for apps is to take advantage of native local storage for API requests. On iOS, for example, you can use Core Data or SQLite to cache the data returned for a specific API request. On the web, a database or NoSQL solution (Redis, Cassandra, or whatever) works well. What we usually do is save a key based on the request, filled with the data: if the user requests /api?param1&param2&param3, we store the key as myLocalDataStore::param1:param2:param3 = $dataReturned.
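
A minimal sketch of that key scheme in Python (the store name mirrors the example above; in practice the dict would be Redis, Core Data, or whatever local storage the app uses):

```python
_cache = {}  # stand-in for Redis / Core Data / any local store

def cache_key(resource, params):
    """One cache entry per unique request, keyed by its sorted parameters."""
    parts = [f"{k}:{v}" for k, v in sorted(params.items())]
    return "myLocalDataStore::" + resource + "::" + ":".join(parts)

def cached_fetch(resource, params, fetch):
    """Only hit the API on a cache miss; serve repeat requests locally."""
    key = cache_key(resource, params)
    if key not in _cache:
        _cache[key] = fetch(resource, params)
    return _cache[key]
```

Sorting the parameters makes the key stable regardless of the order the app builds them in; a real cache would also want an expiry so stale wiki data eventually refreshes.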

However, that's just a suggestion. The new version won't have limits, and we'll be using queueing to handle heavy loads (which we kind of do now, just not optimally). We can talk about increasing your limit so you can populate your data; I'm doing that for a couple of other app developers as well, and we can set a timeframe for it. We need to get our new tech stack out on live first, but I expect to be free to help out with this next week.

#22 Posted by subkamran (26 posts) -

@jslack: Awesome! I sent you a DM too with the details. I'm using NoSQL storage for my app so it makes it easy to cache results from GiantBomb; that is exactly how I store cached searches (by hashing the request).

Great to hear there'll be enhancements soon. One of the big asks is the ability to filter by ranges or gt/lt (e.g. /games/?filter=date_last_modified:(gt=xxx&lt=xxx) or something).

#23 Posted by jSlack (549 posts) -

@subkamran: Cool, thanks. Sounds good. That filter range is something we want and will have in the new API, for certain requests.

#24 Posted by atholm (38 posts) -

@jslack Thanks for being so transparent about this; can't wait for the API update. I have a few users reporting that they've hit the limit, but I've moved a lot of my communication to a local Core Data model.

#25 Posted by jSlack (549 posts) -

@atholm: Awesome! I'm hoping to add some guides and sample code to our new API page as well, so users can take advantage of the features like you have. Honestly, it's far better for app performance anyway, and we want you guys making kick-ass apps.

I'm trying to get this new API stuff out as quickly as possible. I'm going to be making changes to the limits on Tuesday (I hope), and will post in the forums when I do.

#26 Edited by subkamran (26 posts) -

@atholm Sorry, what is the "Core Data" model? Is it a new feature of the API?

#27 Posted by Avenged (33 posts) -

@subkamran: Core Data is an iOS/Cocoa persistence framework (typically backed by SQLite).