What the fuck? Did these people not go to school?
B- = 82ish, not 69
Where are they getting these conversions from? >.>
1up-- Metacritic conversion
Metacritic has a page that explains their scoring system. Have you read it?
If they cared how their scores were being calculated, they probably wouldn't have switched to that rating system. 1UP doesn't care, and neither should you.
I use MetaCritic as a one-stop shop to find all the most relevant reviews that I might want to look over, but their grading system is completely fucked. Equally fucked is the amount of importance given to it.
" Yea I hate the Metacritic scoring system with a passion, expessialy when it comes to letter grades "
"This is their scale:When you look at it like that, 67 for a B- makes sense."
Letter Grades Their Grade Converts To A or A+ 100 A- 91 B+ 83 B 75 B- 67 C+ 58 C 50 C- 42 D+ 33 D 25 D- 16 F+ 8 F or F- 0
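For what it's worth, that chart is just the 0-100 range split evenly across thirteen grade steps (roughly 100/12 ≈ 8.3 points per step), which is why the numbers don't match the school scale. A quick sketch of the lookup (the values are copied from the chart above; the function name is my own):

```python
# Metacritic's letter-to-number conversion, as listed in the chart above.
GRADE_TO_SCORE = {
    "A+": 100, "A": 100, "A-": 91,
    "B+": 83,  "B": 75,  "B-": 67,
    "C+": 58,  "C": 50,  "C-": 42,
    "D+": 33,  "D": 25,  "D-": 16,
    "F+": 8,   "F": 0,   "F-": 0,
}

def metacritic_score(letter_grade):
    """Look up the Metascore Metacritic assigns to a 1UP letter grade."""
    return GRADE_TO_SCORE[letter_grade.upper()]

print(metacritic_score("B-"))  # 67 -- not the ~82 a US school scale would suggest
```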
Not really. That's redoing the whole grading system. In the US, letter grades are obviously based on the school grading system.
I guess Metacritic isn't American, perhaps?
It's retarded. And hell, yes. F is 60 and below. Because honestly, after that, games just suck so much balls really.
1Up's grade system can be best described as a more specifically defined 5-star system:
A=5 stars, or "Great" games
B=4 Stars, or "Good" games
C=3 stars, or "Average" games
D=2 stars, or "Poor" games
F=1 star, or "TERRIBLE!" games
Within each of those levels there is a +, a -, and a straight grade. So truly exceptional, groundbreaking games get the A+ ranking; games that are barely worth the discs they were pressed on get a D-. Simple enough to understand.
Here is how MetaCritic claims they grade their 5 levels of acclaim for games:
Universal Acclaim=90-100
Generally Favorable=75-89
Mixed or Average=50-74
Generally Unfavorable=20-49
Overwhelming Dislike=0-19
If you look at these two scales, that would put a B- game in the middle of the "Average" reviews. But as 1Up (and Garnett Lee specifically) have made very clear, they do not consider Bs to be "average" scores; they are above average. At best, a B- should be closer to a 75, not a 67.
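To put numbers on that argument: if you instead mapped 1Up's letter grades onto Metacritic's own acclaim bands quoted above (a mapping I'm inventing purely for comparison; neither site actually does this), with the + at the top of a band, the - at the bottom, and the plain grade in the middle, a B- would land at 75 rather than 67:

```python
# Metacritic's stated acclaim bands, as quoted above.
ACCLAIM_BANDS = {
    "A": (90, 100),  # Universal Acclaim
    "B": (75, 89),   # Generally Favorable
    "C": (50, 74),   # Mixed or Average
    "D": (20, 49),   # Generally Unfavorable
    "F": (0, 19),    # Overwhelming Dislike
}

def band_score(letter_grade):
    """Hypothetical conversion: place a 1UP grade inside the matching
    acclaim band ('+' at the top, '-' at the bottom, plain in the middle)."""
    low, high = ACCLAIM_BANDS[letter_grade[0]]
    if letter_grade.endswith("+"):
        return high
    if letter_grade.endswith("-"):
        return low
    return (low + high) // 2

print(band_score("B-"))  # 75 -- versus the 67 Metacritic actually assigns
```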
But as I said earlier, the worst part of this is that none of it should matter AT ALL! And really, it doesn't, until you realize people get bonuses based on this shit.
" What the fuck? DId these people not go to school?B- = 82ish, not 69Where are they getting these conversions from? >.> "
This is why the New York Times has the best reviews around: there's no number to give you the shorthand of what the reviewer thinks, and it forces you to actually read their analysis and come to your own conclusion about what they think.
I've said this before and I'll say it again: Metacritic is killing the industry. Screw the numbers, read the words associated with those numbers.
"Who gives a shit about scores. Only fanboys."
Fanboys and, unfortunately, publishers. You'd be surprised how many developers' bonus payments are dependent on Metacritic scores. It's complete bullshit and it's a practice that needs to stop. You're basing someone's pay packet on the collective opinion of, honestly, a bunch of asshats (because bar literally a handful of websites, the opinions that make up the Metacritic scores are from people who can no more write and critically evaluate a piece of software than tie their shoelaces), and that's wrong. Not to mention the potential problems for legitimate consumers: you've no idea who half these people are, why their opinions matter, or whose pockets they are in.
Metacritic is a good gathering of reviews and general opinions, but the industry depends far too heavily on it, and that's not a good thing.
"
" Yea I hate the Metacritic scoring system with a passion, expessialy when it comes to letter grades "Why? It kind of makes sense, since they cover the whole scale from 1 to 100. I dont use Metacritic, but from the chart that Matty supplied it looks reasonable to me
The main problem I have is that if two sites both use the letter grading system, a 'C' from one site may mean something completely different than a 'C' from another. The same problem exists for numbers as well.
But letters are less commonly used for reviews, so people's perception of a letter grade varies wildly, especially across different countries and cultures. Usually you have the text review to back up the score that's given; take away that text and you're left with just a letter. Take a 'C', for example, and relate it to the school system where I live (in England): for my A-levels, a 'C' is okay. It's not bad; if I get all Cs I will still get into university. I'm not going to Oxford or Cambridge, but I will get into a respectable uni. Now this Metacritic conversion turns that 'C' into a 50/100, basically a 5/10, and unless it was a game I had been looking forward to, or a franchise I was a big fan of, I wouldn't even touch a 5/10 game. In my opinion, 5/10 is a bad score.
So does the reviewer mean "'C': not great, doesn't stand out much, but it's not a bad game," or does he mean "'C': this is a bad game that I would not recommend"? Metacritic would rate both as a 50/100, a game the reviewer would not recommend, but you'd have to read the reviewer's text to see what he really thought of the game.
And I know in their corporate spiel they say something like "scores are weighted differently in the final score depending on what website/magazine they come from," but that still does not cover each individual writer within those websites/magazines.
To sum up, I think this is the main problem with Metacritic, and other sites like it, in general, but it's a bigger problem when talking about letters.
(Sorry for the long post I got a bit carried away)
" I've said this before and I'll say it again: Metacritic is killing the industry. Screw the numbers, read the words associated with those numbers. "
The industry is killing itself by using sites like Metacritic and Game Rankings. They were never intended for use by publishers to determine bonuses and such.
As for the scoring issue:
1UP does have control over that, but they obviously don't care how their scores are being calculated; nor should they.
Q: I read Manohla Dargis' review of [MOVIE NAME] and I swear it sounded like a 9... why did you guys say she gave it an 8? A: Many reviewers include some sort of grade for the movie, album, or game they are reviewing, whether it is on a 5-star scale, a 100-point scale, a letter grade, or other mark. However, plenty of other reviewers choose not to do this. Hey, that's great... they want you to actually read their review rather than just glance at a number. (Personally, we at Metacritic like to read reviews, which is one of the reasons we include a link to every full review on our site.... we want you to read them too!)
However, this does pose a problem for our METASCORE computations, which are based on numbers, not qualitative concepts like art and emotions. (If only all of life were like that!) Thus, our staff must assign a numeric score, from 0-100, to each review that is not already scored by the critic. Naturally, there is some discretion involved here, and there will be times when you disagree with the score we assigned. However, our staffers have read a lot of reviews--and we mean a lot--and thus through experience are able to maintain consistency both from film to film and from reviewer to reviewer. When you read over 200 reviews from Manohla Dargis, you begin to develop a decent idea about when she's indicating a 90 and when she's indicating an 80.
Note, however, that our staff will not attempt to assign super-exact scores like 87 or 43, as doing so would be impossible. Typically, we will work in increments of 10 (so a good review will get a 60, 70, 80, 90, or 100), although in some instances we may also fall halfway in-between (such as a 75).
Q: Hey, I AM Manohla Dargis, and you said I gave the movie an 80, when really I gave it a 90. What gives? A: Now, if you are indeed the critic who wrote the review, and disagree with one of our scores, please let us know and we'll change it.
This does happen from time to time, and many of the critics included on this site (such as Ms. Dargis) do indeed check their reviews (as well as those of their colleagues) on metacritic.com.
" What the fuck? DId these people not go to school?
B- = 82ish, not 69
Where are they getting these conversions from? >.> "
Surprisingly, it is not the same scale you're familiar with from school. In fact, it's very similar to the conversion Metacritic performs.
I guess 1UP isn't American, perhaps?
I think Metacritic is okay for getting a basic idea of the worth of a game, but comparing it to other sites' scores is really stupid.
Anything and everything below a 75 on Metacritic is generally perceived by their audience as a "bad game" (even though it's "Average" on their own scale), which is why their conversion of a B- from 1UP makes no sense.
" Anything and everything below a 75 on metacritic is generally perceived by their audience as a "bad game" (even though it's Average on their own scale) which is why their conversion for a B- from 1UP makes no sense. "
Read this for more details.
Converting the previous numbers to letters was a bad idea in my opinion because it instinctively taught idiots to think of the letter grades in terms of the old number system when it's supposed to be just a straight-up letter system a la academia or Entertainment Weekly. So naturally, idiots decide to post that stupid Joystiq scale any time the topic of 1UP's scale comes up instead of simply saying "it works like it does in school."
In my experience reading reviews, I've found that the scoring system is very rarely intended to be like the system used in schools.
That's the system they used to convert the scores, and I imagine that's more or less how they would have scored the games if they were still using the old scale.
How do I figure? The people responsible for the conversion knew exactly what they were doing. They must have had a reason for choosing that scale over the better-known school conversion.
But really, who gives a fuck?!
I was more looking for an example, but that works, I guess. I just don't see why this doesn't translate perfectly to the school conversion: an A is an excellent score; a B is good, but there's room for improvement; a C is okay, but nowhere near a level of expertise (and if your school is anything like mine, a C in a course toward your major means you don't get the credits); a D barely passes; and an F is an abject failure. But in the end I agree with you: who DOES give a fuck?
Personally, I wish the American public school grading system were never used as a yardstick for game review scores. Why? Because by its very nature it's a system in which over half of the scale means exactly the same thing: failure. That system plants a very disingenuous idea in American gamers' heads about game review scores: that anything below a relatively high score is worth shit. It isn't true, of course, but it's an idea that's hard to dispel. There's nothing wrong with Metacritic's system as far as I can tell. The argument that it doesn't match the American public school grade system is nonsense; if it did, there would be a huge hole in the scale, because a zero and a 50 would mean exactly the same thing, a concept I find rather ridiculous in the realm of scoring. No website has an obligation to adhere to the rules of American public schools (thank God).
I don't even read reviews anymore... I just ask regular people what they think. It's what the internet is for.
In response to this:
This actually illustrates the initial point I was trying to make even better. Notice how the B grades range from 8.5 down to 7 (which strikes me as low for a B grade, but that's neither here nor there; the absence of a straight B also perplexes me). While this doesn't line up perfectly with academic grades, it does fairly accurately depict MetaCritic's range for above-average but not exceptional work: 75-89. A 67 is much more in line with what 1Up's scale, if this were a one-to-one comparison, would consider the C range, or "Average."
" @TheKidNixon: Metacritic is flawed whatever the grade system is. Every sites scale is different. Even a 1-100 scale is different on every site. Some sites treat 50 as average, others 70. Their numberig system for A-F grading makes perfect sense. It might not be able to be applied to 1up flawlessly, but you no sites grading system can be applied to metacritic flawlessly. "
It's interesting how so many of you are adamant that scoring games is such a negative practice, yet even Giant Bomb has a scoring system.
I think they should give up on scales altogether and just settle on an appropriate two-word phrase that best conveys their feelings about the game, like "Unrelentingly Weak" or "Effortlessly Supreme."
This way no computer could parse it reliably, but humans would know what it means. And if you're deciding between two games to buy by looking only at the scores you should probably think about reading the damn review. Someone spent a lot of time writing that.
The "problem" with scoring is that it becomes the short-hand for the review itself, boiling down the paragraphs of thoughtful reflection on the gaming experience into a quick glance. MetaCritic only enhances the problem because you get a single aggregate score, then a list of short blurb's that a second-hand source has deemed the single most relevant bit of information from that review. Criticism, instead of being a long-form reaction to a gaming experience, is reduced to a series of MTV style sound blurbs. It is all connected to the short-attention span that the internet breeds and promotes.
I'm not grabbing a pitchfork and calling MetaCritic the worst thing to ever happen to gaming. (That's Twitter, obviously.) And really, others are right: by my own logic, I shouldn't care what MetaCritic redefines Lee's score as for their own ultimately meaningless listing. Mostly, I get frustrated because their internal logic leaves something to be desired. If they are going to bill themselves as the premier source of aggregated scores for mass media, they need to be more careful than they are with their translation of scores, prestige levels, and overall design.
"I was more looking for an example, but that works, I guess. I just don't see how this isn't perfectly translating to school conversion - an A is an excellent score, a B is good but there's room for improvement, a C is okay but nowhere near a level of expertise (and if you're anything like my school, if you get a C in a course towards your major, you don't get the credits), a D barely passes and an F is an abject failure. But in the end I agree with you: who DOES give a fuck?Also I was just kidding about the whole idiot thing, I hope you know. :)"
The funny thing is, your interpretation is actually very similar to mine, despite what appeared to be a significantly different view earlier. I view 1UP's C grade as average, and D and F as bad. But if I had to assign an actual score to a C grade (something I never actually attempted to do until today; in fact, today was the first time I saw that Joystiq comparison thing), I'd give it a 5 or 6. Which is the very same thing 1UP did when converting their scores to letter grades.
Perhaps the perceived difference in views comes from the connotations those scores were given in school. When 1UP's new rating system was launched, Shoe said:
We're not publicizing the conversion scale because we want our readers to go with our new scoring system and not be constantly translating the new letters back to our old scores. We also don't want our reviewers to be thinking about how they translate. It's just easier for us to have everyone move forward and accept the new ratings. But most people can figure it out. Our old "average" in the 5 range roughly translates to the C letter grades (with plusses and minuses), for example.
No translating. Forget the equivalent number scores, because they don't apply here. Your descriptions of each grade seem to be spot-on, but that conversion simply won't do. In fact, it's best to avoid any conversion and just "move forward and accept the new ratings". I certainly have.
One of the problems with Metacritic's "wonky" scoring is that some publishers have used it to deny developers their royalties. Stephen Totilo wrote about it last year...
One developer, who asked not to be named, told me about an instance in which their company didn’t receive royalties for a game that sold more than a million copies. The reason was because — as had been stipulated in a contract with the publisher — the Metacritic score for the game was too low...
Former GameSpot reviewers Jeff Gerstmann (Giant Bomb) and Alex Navarro said they’ve not only heard of this practice but even know developers who were caught up in it. "I’ve gotten e-mails from developers over the years who have said, ‘I don’t think you realize what you’re doing to me with this review,’ because my review knocked them out of the range of some bonus that they were up for," Gerstmann told me. "That’s something that really troubles me... When I’m sitting down to write a review I’m never setting out to think: ‘I am taking food off this guy’s table.’"
I agree with Garnett Lee (ListenUP podcast). I am glad I wasn't in Metacritic's class when I was in high school.
I don't care what sites or scoring systems are used, publishers should not be using reviews to influence royalties or bonuses. It's a practice that needs to stop.
"@LordAndrew: This I agree with, sales should reflect royalties, not opinionated reviews. Unfortunately sales are often reflected by reviews, and far to often people are using just the number scale to make a final decision. Both of which need to stop."
The thing is, sales don't reflect quality. The logic of this is that reviews will reflect quality, which is what they want to bonus on.
"I agree with Garnett Lee (ListenUP podcast). I am glad I wasn't in Metacritic's class when I was in high school. "
Reviewers should not have to consider the feelings of developers when writing reviews. Do you want people writing positive reviews for games that don't deserve them, just so that the developers don't miss out on royalties?
"@SmugDarkLoser: Reviews reflect a subjective opinion in addition to objective facts. You can't pay people based on that. Reviewers should not have to consider the feelings of developers when writing reviews. Do you want people writing positive reviews for games that don't deserve them, just so that the developers don't miss out on royalties?"
I never said to curve reviews. I said metacritic's conversion of 1up's scores is wrong.
You seemed to be suggesting that developers should be paid royalties based on review scores. That I vehemently disagree with.