Scoring Game Reviews Is Stupid (But We'll Keep Doing It Anyway)
There's always going to be disagreement over how good a creative work is. This is true with every medium - music, film, video games, commemorative rugs, etc. What makes critical disagreement different with video games is that the debate often centers over the exact number attached to a review. While a film can be measured with two upturned/downturned thumbs to indicate whether it's crappy (two thumbs down), sorta good (one thumb up, one thumb down), or good (two thumbs up), if a game reviewer tried that, he'd wake up the next morning to find Yoshi's severed head in his bed.
Gamers want a numeric score, preferably a percentage. At least that seems to be what they prefer, because that's what they readily convert any alternate scoring system (such as Blend Games' five-star rating system) into. Thus, me giving a game 3.5 stars out of 5 is instantly calculated to mean 70%, which in turn is calculated to mean that the game is teetering on the edge of sucking. Video games are, apparently, supposed to be scored like algebra exams, with 65% being the point of failure. Odd, considering that I've been plenty entertained by games that only got 3 stars (60%) or 2.5 stars (50%). Why bother giving such an exact score if nearly two-thirds of the scale (0-64%) means that the game is unplayable drivel? From a reader standpoint, do you really care about the degree to which a game is unplayable drivel? Will you play something that's 50% but not if it's 40%?
By the same token, do you really care about the exact level of awesomeness a good game attains? I scored Dead Space as 4 stars, which I take to be very good and worth buying. "It should be a 4.5 or a 4.75," said a commenter on an aggregator where the review was posted. Well, let's put aside the fact that 4.75 stars would be a tough graphic to actually read. What's the difference between 9 or 9.5 or 10? Or 8 or 8.5 or 9? You'll likely try the game that gets a score anywhere in that range if you're interested in the genre.
There exists this belief that the exact quality of a game can be determined with mathematical precision. It's possible that this is a result of people not understanding how a game is actually reviewed. Let me, then, give you a run-down of what doesn't happen during the review process:
Dr. Drake lifted the Fable 2 game disc from the pool of mercury with a pair of titanium forceps. The disc had been submerged in the pool overnight to increase the yield in the Review Process. Drake carried the disc over to the steel pedestal and gingerly laid it down in the center. He pressed a string of buttons on a nearby console and metal chevrons slid out from the edges of the pedestal to hold the disc in place.
"Started without me?" Dr. Jenkins said, entering the test chamber in his environmental suit. A rivulet of sweat dripped down between Drake's eyes when he looked up. He'd performed this procedure countless times before; why was he so nervous for this one?
"You're just in time. I'm about to fire up the scanning laser," Drake said.
Jenkins placed the manila folder he was carrying onto a table and walked over to an identical console on the other side of the room. After counting down from three, he and Drake began typing in tandem on their respective consoles. The room darkened and the conical laser above the pedestal began to hum. A red beam shot down and struck the disc. As the beam lit up the room, Drake noticed the light reflecting off something in Jenkins' folder on the table.
"What's in that folder?"
Jenkins shrugged. "Documents about Fable 2. Manual, press releases, stuff like that."
"No magazines, right?"
"What difference does that - "
"A gaming magazine?"
The room began to vibrate and the folder fluttered open. Drake could see the cover of the glossy magazine now. "Fable 2," it said, "A New Classic."
"Jenkins, goddamnit, it's going to contaminate th - "
The beam turned blue and split, with one fork arcing toward the magazine. Upon striking the magazine, the beam reacted with the hype molecules inside and set it ablaze. Klaxons blared. The disc began to spark and a sudden wall of smoke enveloped Drake. He stumbled toward the exit, nearly knocking over a tray of test tubes on his way out. The thick door of the chamber slammed closed behind him - a safety feature designed to prevent damage to the rest of the facility.
Drake pressed himself against one of the chamber's thick glass walls and peered inside, trying to see Jenkins through the smoke. His colleague was on his feet and staggering through the smoke but he was moving away from the exit.
Drake pressed the talk button on the intercom outside the test chamber. "Turn around! Other way!" he yelled. Jenkins didn't respond. He continued walking until he reached the back wall of the chamber. He stopped in front of the computer that tabulated the results of the Review Process. My God, Drake thought, he's actually trying to finish the experiment.
The smoke thickened, obscuring Drake's view of the chamber entirely. He was reaching once again for the intercom when suddenly Jenkins slumped against the glass, his environmental suit melting in several places. With great effort, he lifted his arm and slapped a piece of paper against the glass so Drake could see. It was the print-out of the Review Process results. "Fable 2: 90.345% Overall," read the top line of the report.
Jenkins managed a smile despite his obvious pain. His body slackened. He slid off the glass and hit the floor. Dr. Drake stood there for many minutes before he heard the hurried footsteps behind him. It was Dr. Harrison and several other scientists on the team. They hit him with a flurry of questions. He answered the only one he heard clearly: "How did he die?"
"He died a reviewer," Drake said. His sweat mixed with tears. "He died a reviewer."
...So yeah, that's not what happens during reviews. What happens is that someone sits down, plays a video game, and tries to explain what they like or dislike about it. That's it - there are no calculations involved. It's the only way of doing it, and it's an imperfect process. It's not just that it's impossible to pin an exact number or percentile to a game - it's impossible to even pin an exact number to your own writing. I doubt most writers can tell how their 90% reviews are different from their 85% reviews. Even breaking the overall score down into individual elements (graphics, sound, fun factor, etc.) doesn't solve this problem because in the end, it all comes down to an opinion which may or may not be accurate in the first place.
Still, in spite of all this, I can't not put scores on reviews. A lot of readers don't feel like sifting through a multiple-page review just to figure out if a game sucks or not - I'm sure a sizeable segment just glance at the score and stop reading. Not that I'm criticizing; I don't imagine sifting through pages of smart-assed comments about awkward inventory systems and nonsensical plots just to figure out whether a game is worth a rent is a lot of fun. Thus, in the interest of accessibility, we'll keep putting scores on our reviews. That's right, we're doing it for you assholes, so you'd better like it. Just don't expect the score to tell you the whole story.