Sunday, June 20, 2010

E3 ends, and entropy begins... but is there new life?

A quick housekeeping note: yes, I realise that I haven't updated in some time.  I apologise for this, but I've had neither the motivation nor the energy to post anything for the last couple of weeks.  Coupled with things going on in my personal life, including my recent university graduation, it was just a bit tough to get myself into a frame of mind where I was capable of writing something interesting.  Now that much of that has been resolved, I hope to be more productive.

It's that magical time of the year: E3.  Everyone who follows videogames in some form knows about this event, especially now that it's screened on mainstream television channels for the world to see.  To say that there's a lot of hype going into E3 week is an understatement: the entire industry finds itself holding its breath in anticipation, and news practically dries up for weeks in advance as publishers hold their cards close for the trade show - though it's also interesting how most of the big surprises and stories increasingly seem to show up during the lead-up to E3, likely a case of publishers wanting to capitalise on their information-starved fanbases.

This particular E3 was exciting for a number of reasons.  Nintendo in particular managed to steal the show, with a new handheld that, according to reports, really is quite phenomenal, and a huge slew of new games; Kid Icarus alone would have been enough to cause celebration, but to see Kirby, Donkey Kong, GoldenEye, and more all in the same presentation was positively stirring for old fans.  Microsoft and Sony came off as far tamer, with far fewer surprises to their names.  Kinect and Move are both interesting pieces of hardware, to be sure, but even with better technology, it's hard not to look at them and think "aren't they four years late?"

 Hmm, haven't I seen this game before?  Oh, wait!  This one has grass in it!

The thing that really struck me, though, was just how uninspired the entire event felt.  Now, I wasn't there, and I'm sure that it was quite exciting for those who were, but from an observer's perspective, when examining what actually went on display, there's surprisingly little to truly get excited about, beyond the 3DS and the potential that the Microsoft and Sony motion controllers might hold.  By my count, the biggest games of E3 were all follow-ups to established franchises, most of them in the action genre: Gears of War 3, Halo: Reach, Call of Duty: Black Ops, InFamous 2, Gran Turismo 5, and so forth.  For someone who is desperately searching for something new in the games industry, this endless procession of sequels doesn't present too many new ideas.  In fact, some of the potentially most interesting games, including Deus Ex: Human Revolution and Portal 2, were barely even shown.

Let me make it clear that there's nothing wrong with sequels.  I'm actually quite a fan of them, simply because there are so many games I hold close to my heart.  I try to keep a level head, but we all know how it feels when the theme of our favourite game hero begins ringing in our ears; sometimes nostalgia takes over and it's hard not to be swept away in memories of years past.  Yet when I take a step back and realise that nearly every single mainstream title being premiered (even those likely to be original, clever and all sorts of good things) is a follow-up to another mega-hit, well, it's hard for me not to be a bit upset about that.

Of course, we all know how this works.  Gamers develop strong emotional attachments to particular franchises, characters and so forth.  While developers would love to run wild with new ideas, the economic realities of game development mean that, increasingly, the industry needs to rely upon its most popular names in order to remain financially stable.  Most gamers know that this shift to big-budget, universally-appealing games has gone hand-in-hand with the increasing monetisation of games, from microtransactions, to downloadable add-ons, to pay-to-play models.  As healthy as the industry would like to present itself, the reality is that it's gasping for air - and not for lack of ideas, quality or effort.

I don't mean to cast a dark shadow and say that gaming is doomed, that our favourite developers are going to boil in a sea of fire and be swallowed up by a dark abyss.  Games are simply too profitable a sector of the entertainment industry for that.  Like films, they aren't going away; interactive entertainment is here to stay, even if its nature changes with technology or economics.  However, the reality of the games industry right now is that creativity and innovation are dying, not because the people making games aren't capable of them, but because the market demands that efforts be directed towards reliable ideas.

The more observant followers of the industry may have noticed that the most creative titles to appear lately have been small indie games released on Xbox LIVE Arcade, Steam, and other digital distribution services.  The more observant still will have noticed just how much this business model mirrors the early days of PC gaming, where shareware was common, games were smaller and bite-sized (not to mention cheaper), and, more importantly, were able to get away with a whole lot more.  Making a big-budget hardcore role-playing game is a near impossibility given the extreme technical requirements of the genre, but on a platform where we expect smaller, niche games, it's possible to survive and even thrive.  The developers, too, mirror those of the golden era of PC gaming - with small teams of only a dozen people, some of the most popular indie developers are able to be as creative as they want, and their low costs ensure they can get away with it.

This observation isn't new, by any means, but in light of it, we have to ask ourselves a few questions.  For starters, what does our obsession with big-budget games that eschew creativity in favour of predictability say about us as gamers?  Is this obsession a vicious cycle headed towards an implosion much like the crash of the 1980s?  If that's the case, then are these indie developers going to rise from the ashes and revitalise the gaming industry, only to fall into the same problems later down the road?  If we are not at the end of an era, then I think we are very close to it.

Comic Jumper is one example of originality that is all too rare in gaming today. 

I only have so much time and money to put into games, and both are dwindling year by year.  Are gamers as a whole going to feel the same way?  What happens if they just... get tired of it?  I love gaming, but I have a pile of games that I've bought and have yet to start, and I know many people who are the same way.  I clamour for new ideas, yet I also continue to buy sequels to games that I enjoy, even when the only improvements come in the form of shinier visuals and some new variations on the established formulas - and I know plenty more people who are in that same boat as well.

E3 has managed once again to stimulate the industry for another year, but I have to wonder just how long that's going to last.  We need some sort of radical new model for the games industry; I'm relatively sure of that much.  As I see it, there are two ways we can take things: we can go for cheaper, smaller, more niche games, like the ones indie developers are releasing right now, or we can create games that are ever more expensive to produce, significantly more expensive for players to buy, and that offer less value as content takes increasingly more resources to create.  I'm hoping that the games industry collectively moves towards the former model, even if it's not perfect.  But what about everyone else?  Modern Warfare 2 has sold 20 million copies.  It's not looking so great from where I'm standing.


Wednesday, June 2, 2010

The reviews industry and the construction of quality

One can't have followed games closely over the last several years without noticing the reviews industry, which is the umbrella term I like to use for popular games journalism.  While critical reviews play an important part in promoting sales (or failures) in nearly every industry, in games they are seen as perhaps the driving force behind sales figures.  Effectively, reviews are advertising, and the amount of money poured into building hype, generating positive press coverage, and eventually landing a high score goes far beyond simple press relations and informing consumers.

Certain aspects of the reviews industry have already been covered in detail, including, but not limited to, Metacritic (the popular review aggregation site which assigns games a "metascore" based on a weighted average of review scores), and the supposed race to the bottom in critical standards, with average scores creeping ever higher in a way that doesn't accurately represent average quality.  What I'd like to turn my attention to in this article is the disconnect between what gamers and reviewers say they want versus what they actually want, and how reviewers are able to dictate the terms of discourse surrounding the quality of games.
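As an aside, the arithmetic behind a metascore is simple enough to sketch.  Metacritic doesn't publish its outlet weights, so the weights and scores below are purely hypothetical; consider this a minimal illustration of a weighted average, not Metacritic's actual formula.

    # Minimal sketch of a Metacritic-style "metascore": a weighted average
    # of review scores out of 100.  The outlet weights are hypothetical,
    # since Metacritic does not disclose its real ones.
    def metascore(reviews):
        """reviews: a list of (score_out_of_100, outlet_weight) pairs."""
        total_weight = sum(weight for _, weight in reviews)
        weighted_sum = sum(score * weight for score, weight in reviews)
        return round(weighted_sum / total_weight)

    # Three hypothetical outlets reviewing the same game:
    print(metascore([(90, 1.5), (75, 1.0), (60, 0.5)]))  # prints 80

Notice how the single number erases the spread: a game one outlet loved and another panned ends up indistinguishable from one that everybody found merely decent.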

An example I'd like to use here is Alpha Protocol, Obsidian's latest role-playing game, released just this week.  The game had a tumultuous, protracted development cycle marked by multiple delays, and gamers know that a delayed game comes out either better or worse for those delays.  Upon release, the game has proven, more or less, a bit of a critical flop, with generally mediocre review scores and even some extremely low ones.  Most of the complaints centre on dated graphics and its skill-based combat, with most reviewers comparing it unfavourably to BioWare's Mass Effect 2.

Jeez, what an ugly guy!  No wonder this game was rated so low.

Though I'm not here to convince anyone that Alpha Protocol is in fact an excellent game, or that the reviewers got it all wrong, examining the reviews paints a very curious picture.  Gamers and reviewers alike are known for touting gameplay over everything else, with technical issues taking a backseat unless they really get in the way of playing or finishing a game, and graphics often considered a secondary or tertiary factor.  Yet, judging from the negative reviews, it is presentation that most reviewers are actually concerned with - apparently, stiff character animation is enough to damn a game to the void, regardless of its redeeming features.  That those redeeming features - namely story, characters, decisions/consequences, replayability, and deep character building - aren't enough to make up for poor visuals suggests that they hold far less importance to reviewers than their rhetoric implies, and that is troubling.

It can be said that, at root, gamers are most concerned with whether a game is fun.  Fun is a difficult thing to pin down, because it's different for every person.  For some, fun means fast action and tight controls.  For others, it means an intriguing story with lots of twists and turns.  For still more, it's multiplayer.  Yet in general, gamers who are drawn to particular genres expect certain things from those genres: shooter fans expect good shooting and aren't primarily concerned with complex narratives; puzzle fans want addictive gameplay that doesn't get boring quickly; and so forth.

Because games are so varied, and our ideas of "fun" are so varied, reviewers are primarily in the business of dictating what is good and bad.  Through individual reviews and metascores, a hierarchy of games is created, whereby certain ones are valued over others on the basis of a predetermined set of criteria.  These criteria can vary - what may have been valued ten, five, or even two years ago is no longer important in evaluating a game, while other things have become far more important.  Part of this is, of course, due to the general progression of the games industry as a whole: as new games come out, they incorporate successful elements from previous games while adding new ideas.  The result is a "survival of the fittest" system, where things deemed "good" eventually become standards, things deemed "bad" are discarded, and future games are heavily penalised for keeping them.

When gamers read reviews, they aren't reading an isolated opinion of a game, nor an objective statement of quality.  This should be obvious on a superficial level, but what gamers typically don't acknowledge is the greater discourse which leads to the formation of these ideas of good and bad.  Each review is a microcosm of the history of games reviews, containing within it as many unconscious assumptions as there are clear statements of fact.  When a reviewer calls controls "clunky", he or she expects the readership to know what that means; the same applies to nearly every statement and piece of terminology, not just in understanding the language itself, but in understanding the evaluations and histories of quality behind it.

 This game is exactly 66 percent good!

Through this use of language, reviewers are able to have an incredible effect on the games industry.  Reviewers don't just tell us whether games are good or not - they formulate the reasons why we are supposed to think games are good or bad, our expectations about game content, our standards for technology, and so forth.  By quantifying these things with a score, typically out of ten or a hundred, reviewers create an ostensibly objective scale to measure games by, even though no such objectivity exists.  Game designers, fearing the negative repercussions of low scores, pay attention to the opinions created by reviewers; they internalise those metrics of quality and build their games with that yardstick in mind.  We are so accustomed to this process that we rarely think about it - nor do we stop to consider whether our calling a game "bad" is because it genuinely is bad, or because it deviates too far from our expectations.

Of course, there is one more vital party in this process: the player base, the people who actually play and enjoy games.  Reviewers are supposed to represent the wants of gamers and to speak for them, but as I examined above, the wants reviewers claim to have are often far out of line with what they actually want.  Is the average gamer any different?  Do regular players really hold on to their ideals when it comes to games, or is there the same sort of disconnect?  One way to judge this is sales figures - and if we do that, then the answer appears to be a clear yes, with the highest-rated games by and large selling far better than their lower-rated competition.  Another way of gauging this is via case studies of the actual opinions of gamers about particular games, which unfortunately is a bit beyond the scope of this article.  But if we examine how much time players spend with games, and use that as a metric to determine wants, the results seem to mirror those of sales, with top online games like Halo, Grand Theft Auto and Call of Duty dominating both the sales charts and the reviews.
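For what it's worth, the sales-figures test is easy enough to make concrete.  The sketch below computes a Spearman rank correlation between review scores and sales; every number in it is a hypothetical stand-in, since doing this properly would mean pulling real metascores and real sales charts.

    # Sketch of the sales-figures test: do review scores track sales?
    # All data here is hypothetical; a real test would use actual
    # metascores and actual sales figures.
    def rank(values):
        """Assign 1-based ranks to values (no tie handling, for brevity)."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        ranks = [0] * len(values)
        for r, i in enumerate(order):
            ranks[i] = r + 1
        return ranks

    def spearman(xs, ys):
        """Spearman rank correlation: 1.0 means the two lists rise together."""
        n = len(xs)
        d_squared = sum((a - b) ** 2 for a, b in zip(rank(xs), rank(ys)))
        return 1 - 6 * d_squared / (n * (n * n - 1))

    metascores = [94, 96, 88, 72, 83]             # hypothetical review scores
    sales_millions = [20.0, 11.0, 8.5, 1.2, 3.0]  # hypothetical unit sales

    print(spearman(metascores, sales_millions))   # prints 0.9

A result near 1.0 would support the point above: whatever gamers say they value, their purchases follow the scores.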

Although the similarities between the supposed wants of gamers and the wants of reviewers suggest that reviewers do accurately reflect gamers as a whole, I don't think that this is the case.  As I've examined, reviewers have a fundamental hand in deciding exactly what parts of a game are "good" and what parts are "bad".  It stands to reason that if we are able to change the way that games are reviewed, we will also be able to change which games sell, what people value in games, and what we get out of them by playing them.  Hopefully, this means that in the future, games that don't pander to expectations, like Alpha Protocol, will be received in a more positive way, and will be able to redefine what gaming is capable of as an artistic medium.
