Wednesday, October 26, 2011

The Automation of Player Skill

In a recent feature on Gamasutra, "The Abstraction of Skill in Game Design", Josh Bycer created a framework for understanding the degrees to which games abstract or "filter" the abilities of the player through the skills of an in-game avatar or the game mechanics themselves.  Upon reading this piece, however, I found myself wanting to more deeply explore the nuances involved in the player-skill equation.  Consequently, in this article, I'll be discussing some of the more specific ways in which developers automate or even apprehend player skill, much to what I believe is the detriment of the gameplay experience.

Let Me Get That For You

The most obvious way in which games automate player skill is quite literally by, well, automating things for the player.  Whether this manifests as auto-aim in a first-person shooter, your squad automatically taking ideal positions and weapons in a strategy game, or your character automatically gaining skills in preset areas in a role-playing game, the end result is always the same: the player's direct control over the action and the game mechanics is suspended and superseded by the game designer's own will.

Take the most prominent example: auto-aim in first-person shooters, seen most often in console games.  Auto-aim is typically considered a necessity by shooter developers, whether on console or PC, mostly in order to give newer and less experienced players the ability to get through the game without necessarily possessing a high skill level.  As an accessibility tool, this is all well and good - after all, it makes little sense to deny a new player or a disabled gamer the "privilege" of finishing the game.  However, on consoles, auto-aim is more or less a universal constant: finding a game without it is nearly impossible, and finding one which lets you disable the feature is often quite difficult.

Shooters on consoles can still be fun and complex games, but taking out the aiming reduces the skill involved and cheapens the play experience.
Shooters on consoles have rarely worked well for me.  Although I can enjoy the spectacle, the visceral thrill of firing at enemies (or friends in multiplayer mode), as well as the story and characters, rarely do I find myself enjoying the gameplay itself - the fundamental "bread and butter" mechanics which all shooters revolve around.  I think, at least in part, a lot of this comes down to the fact that so much of my play experience feels automated as a result of auto-aim, and, in more recent games, the snap-to feature when aiming down iron sights.  Sitting down to play Call of Duty: Modern Warfare 2 with a friend in split-screen, it often feels as if there's a third presence in the room - an invisible game designer sitting between the controller and the television, who anticipates my own movements, says "oh, you want to shoot the guy on the left?  I see, let me get that for you!" and pulls my crosshair in the right direction.  Actually winning in such a scenario has less to do with my skill in the fundamental act of shooting, and a lot more to do with my ability to simply select the right targets at the right time.
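That invisible tug is easy enough to sketch in code.  The following is purely a hypothetical illustration of how a basic aim-assist might work, not how any particular game implements it; the strength and cone-angle values are invented for the example:

```python
import math

def apply_aim_assist(aim, targets, strength=0.3, cone_deg=10.0):
    """Pull a 2D aim direction toward the nearest target inside an
    assist cone.  'aim' and each target are unit direction vectors.
    The strength and cone values are invented, not taken from any game."""
    best, best_angle = None, math.radians(cone_deg)
    for t in targets:
        # Angle between the current aim and the direction to this target.
        dot = max(-1.0, min(1.0, aim[0] * t[0] + aim[1] * t[1]))
        angle = math.acos(dot)
        if angle < best_angle:
            best, best_angle = t, angle
    if best is None:
        return aim  # no target inside the cone: aim is left untouched
    # Interpolate toward the target direction, then renormalize.
    x = aim[0] + (best[0] - aim[0]) * strength
    y = aim[1] + (best[1] - aim[1]) * strength
    n = math.hypot(x, y)
    return (x / n, y / n)
```

The key point is that the correction happens every frame, silently: the player provides a roughly correct input, and the system closes the remaining gap on their behalf.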

Now, don't get me wrong, shooter developers already know this.  Bungie, pioneers of the modern console shooter themselves, were smart enough to anticipate this problem in the original Halo, and designed everything from the game environments to the enemy AI around auto-aim, the player's more limited degree of movement, and slower aiming speed.  Furthermore, in downplaying the importance of skill in shooting, they were required to compensate by emphasizing other areas of the game, such as more distinctive weaponry with strong balance, the addition of vehicle sections, AI comrades who fight capably alongside the player, etc.  Nor do I mean to say that it's impossible to make a fun game with auto-aim - clearly I'd be in the wrong, as I've enjoyed many shooters with degrees of auto-aim built into them over the years, both on consoles and PC.  Despite all this, though, the movement of auto-aim from an accessibility and difficulty feature into a standard feature which games are designed around feels, to me, like the epitome of dumbing down the game experience - rather than having to master the game mechanics in order to win, I am now made master by default, with a silent, invisible hand guiding my way all along.

The Reduction of Player Judgement

Another recent example of this I ran into, in a completely separate genre, was in Neverwinter Nights.  I've always had a strained relationship with the game, having played it on and off since its release but never really getting far into it or particularly enjoying it; I've always felt like I've been missing something, as if there's something about the game that everyone loves that has flown right over my head.  Now, I believe I've cracked the code, and it lies in the removal of the player's judgement in utilizing the skills he or she might have.

Despite being a solid implementation of the Dungeons & Dragons 3rd Edition rules (a considerable step up from the "Advanced" D&D rules of earlier CRPGs), Neverwinter Nights is also extremely automated.  Rather than the player building his or her own party and directly controlling each member in order to overcome tactical combat encounters, the AI takes over everything from inventory management and equipment to leveling up and fighting for everyone but the "main" character.  Supposedly intended to speed up the game's pace and give the companions more personality, this automation instead utterly saps the life out of the game by turning combat into a boring, repetitive slog where you spend 95% of your time watching your party auto-attack their way to victory.  Even as a spell-caster with lots of active abilities, gameplay ultimately comes down to simply hitting a few hotkeys when necessary, and looting the occasional chest.

Neverwinter Nights loses much of the inherent depth in its ruleset by automating many aspects of play, from combat to feat and skill use.
Furthermore, when the AI breaks down, it often takes the player with it.  I recall a recent attempt at a battle against a high-level spider enemy, wherein my AI companion, rather than engaging the spider and absorbing some of the hits for me, decided to run off to a nearby trap and attempt to disarm it, while my player character was forced to take on the spider alone.  As if that wasn't annoying enough, my companion apparently failed to disarm the trap, and the resulting fireball blew both himself and my player character to kingdom come.  I was not amused.  At least I had a quicksave on hand.

Then there are the more subtle ways in which the game automates player actions.  Going back to traps: rather than having to search the environment manually, whether by using visual cues, logic and reasoning, or even just entering a "search mode", the search takes place automatically (at least when one has an AI companion capable of the feat), and almost never fails.  This might seem like a small thing, but it makes all the difference.  In a CRPG, the player's skill comes into play not in terms of twitch reflexes, but in terms of making the right decisions in succession, based on strong judgement and knowledge of the game mechanics.  The Infinity Engine games such as Icewind Dale turned traps into a sort of resource management of their own.  Usually it was good practice to scout an area with a stealthy rogue or a character made invisible by way of a spell, both for enemies and for traps.  However, traps would often be placed near enemy encampments, and disarming them would break stealth and expose your squishy mage or rogue.

Hence, trap disarming was an interesting risk-and-reward decision tree - one that a player could ignore entirely, putting up with the occasional inconvenience traps cause, or could handle intelligently.  Indeed, much of the fun in playing a rogue or magic-user is in using those unique skills to one's advantage to tackle the game's obstacles in interesting and creative ways.  In Neverwinter Nights, every last bit of nuance in detecting and disarming traps has been removed, replaced with a skill-less automation which not only reduces the depth of the game considerably, but also makes all those complex feats and talents a player can invest in a lot less interesting.  Rather than taking advantage of the inherent depth in the D&D rules to provide the player with interesting skill-based gameplay (reasoning, logic and risk-management), Neverwinter Nights decides for the player how to play the game, and ends up being a less faithful implementation of the tabletop game - even without changing the rules themselves.
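To make the contrast concrete, here's a rough sketch of the two approaches.  The d20-versus-Difficulty-Class shape follows the spirit of the 3rd Edition Search check, but the function names and numbers are my own invention, not Neverwinter Nights' actual code:

```python
import random

def detect_trap(search_modifier, dc, rng=random):
    """A d20 Search check in the spirit of the 3rd Edition rules:
    roll a d20, add the character's skill modifier, and compare the
    total against the trap's Difficulty Class.  Because success is
    not guaranteed, the player must decide when (and with whom) it
    is worth searching at all."""
    return rng.randint(1, 20) + search_modifier >= dc

def detect_trap_automated(search_modifier, dc):
    """The automated shortcut criticized above: detection simply
    happens, so no judgement is ever exercised."""
    return True
```

The point isn't the dice themselves - it's that a check which can fail forces the player to weigh when, where, and with which character to search, while the automated version makes that judgement call on the player's behalf, every single time.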

Cinematic Experiences

One of the most common points of praise for modern videogames is when they are compared favourably to movies.  Many game franchises have made providing the player with a "cinematic experience" their key selling point, placing the explosions, characters and dramatics all around the player as the universe's centre.  I certainly won't deny that a cinematic game can be enjoyable - as I said, I love spectacle, and despite it not being much more than an on-rails shooter, I can't help but be visually stunned by a game like Call of Duty, whose sole mandate appears to be to provide the player with a roller coaster ride of action and excitement.

However, rarely do we really see how this focus on cinematics can hamper the game experience itself - in fact, most players and developers alike tend to assume that those cinematic elements are good in and of themselves.  Eurogamer's Simon Parkin recently posted a review of Uncharted 3: Drake's Deception in which he gave the game a solid score, but provided a strong critique of some of the game's limitations - most notably, as stemming from developer Naughty Dog's attempts to provide the player with an action movie's worth of excitement.
"But it also reveals another truth. Uncharted 3 is the most exciting game in the world, but only until you deviate from the script. Even in this chase the conflict between the developer's theatrical choreography and player-controlled interactions is clear. In order to ensure each set-piece is set off correctly, the game commits the cardinal sin of insinuating you have full control of your character, but in fact tugging you towards trigger points - making sure you're in the right spot to tumble over the bonnet of that braking car, for example."
The point of criticism is clear: while Uncharted 3 might be an incredible thrill-ride, it's also, in some senses, the worst game in the world: control is apprehended so that the choreography of the piece is maintained, and the game's pacing moves not out of the player's own actions, but out of the developer's desire to adhere to an exceedingly specific flow of events.

Uncharted 3 provides enthralling visuals, but in automating much of the gameplay, it can feel as if the player is just getting in the way of the action.
There are two especially notable problems with Uncharted 3, as far as I can gather (and apologies if I'm wrong, as I haven't played it myself).  The first is that oftentimes the player doesn't feel in direct control of the experience.  As Parkin states, he often found protagonist Drake running and jumping around in ways which clearly defied the (in-game) laws of physics, and his own expectations for the character, in order to make a cinematic leap forward or avoid the game over screen.  Not only does this break immersion and the player's understanding of the game world's rules, it also takes control away and subtly tells the player "hey, you're doing it wrong, move over and let me handle it."

Second, oftentimes the game simply refuses to work in a way the player might anticipate or expect, especially when the player defies the wishes of the designers.  I've seen this plenty of times in the past, with absurd instant-death scenarios for stepping outside the dotted line; it's much less common in modern games, as most developers have gotten pretty smart about providing realistic, natural, and coherent explanations for why the player can't do X or Y.  Yet Parkin writes,

"Your freedom of choice risks ruining the shot. Indeed, throughout the game, if you jump into an area you are not supposed to visit, Drake will crumple on the floor dead, Naughty Dog switching role from movie director to vindictive god. That is not your predestined path: Game Over."
Uncharted 3, put simply, doesn't allow the player to experiment, to explore, or to solve problems in creative ways.  Instead, everything must adhere to the tightly controlled script that Naughty Dog has laid down, and deviating from that path is met with an instant failure and reload.  The most absurd thing about such a limitation is that Uncharted has always, to a degree, presented itself as a game about adventure and exploration - the protagonist, Nathan Drake, is a modern-day Indiana Jones, who gets ahead by the skin of his teeth, jury-rigging one solution after another to just barely beat out his enemies.  Yet when the player tries to behave as Drake, in a way not anticipated by the developer, the result is not a reward, but the most absolute of punishments.  It's an inherent contradiction, and one that Naughty Dog seems content to leave be, even if it ultimately harms their game in the long run.


There are, of course, plenty more ways one could categorize how player skill is mediated by game mechanics, and plenty more examples out there to put under the microscope.  I hope that in this article I've been able to both shed light on some interesting examples of what I think is a disturbing inclination, and also make the case as to why they ultimately hinder the play experience rather than help it.  Videogames should be all about providing the player with interesting mechanics, tools and scenarios, and, in my opinion, cutting into that via automation is contrary to the very foundations of the medium, and is something that deserves more consideration and scrutiny than it is given presently, by both developers and gamers.

Thanks for reading, and please share any thoughts in the comments!

Friday, October 21, 2011

Building Better Game Development Tools

It's pretty incredible how far game development tools have come over the last generation of games.  Even just 5 years ago, getting one's hands on game development tools was a difficult process in itself.  Documentation was extremely scarce, and unless you were lucky enough to be tutored by an experienced team, or ended up in a game development job yourself, learning to use those tools effectively was even more difficult.

Today, in 2011, getting one's hands on game development tools and actually coming to grips with them is a much more friendly process.  For one, there are dozens of freely-available commercial-quality game engines available, from CryEngine 3 to Unreal Development Kit, and even those few games that ship with official software development kits are extremely well-supported by comparison.  What's more, the rise of online video now means that getting concise, visual demonstrations on how to use those tools is easier than ever, and with a human touch behind it, it's very easy to learn tools quickly, even if you're a complete novice.

In spite of this progress, however, I've found that modern software development kits still have a number of stumbling blocks to get over, not only to make building games more intuitive, but also to make the process faster and reduce the fine-tuning, bug-fixing and other assorted busywork it requires.  In this article, I'll be taking a look at a few SDKs I have experience with, examine where the problem areas lie, and provide comprehensive suggestions on how they can be improved.

A quick disclaimer: I'm a design mind, not a programming one, and it probably shows in this piece.  I'm sure others will have all sorts of issues they can bring up, so bear with me here, as I'm talking about what I'm familiar with above all else.

Designer/Programmer Dichotomy

This is the all-time biggest stumbling block in my opinion.  People who are trained in scripting, programming and other more "technical" tasks tend to not only have certain skill sets and degrees of familiarity, they also tend to process information and think in different ways from designers.  Speaking broadly (and I do want to stress "broadly"), designers, artists and so on tend to understand the world in more visual terms - for them, what they can visualize is reality, and creating something that is pliable, "physical" and visible is far more intuitive.  Programmers, on the other hand, tend to think in terms of data - visual organization is very important, but for them, speaking in very broad terms, it's possible to understand how something works and will impact a final product without necessarily seeing it first-hand.

Consequently, the organization of data tends to differ between these two general archetypes - artists like to keep things simple, uncluttered, and easy to understand, while data-driven minds tend to be more focused on smaller details, on having information freely accessible at all times, and in general want to be able to consult any bit of information at any time regardless of what it might be relevant to.  Again, I want to stress I'm speaking in very broad terms, and using stereotypes to do so - I don't mean to suggest that all people (or even most) think this way, but only to conceptualize the problem.

That conflict of interests, of course, arises when you start to build a user interface that must accommodate both sensibilities.  Designers tend to think in terms of visuals, so, for instance, level-building tools tend to present themselves in terms of end results - i.e. if I want to make a mountain, I use the terrain tool and sculpt one.  However, programming, scripting, and even animation tend to exist more in terms of logical building blocks, which inevitably require more exposed functions rather than hiding things behind the scenes and letting the visuals speak for themselves - and this may not be immediately intuitive for the visually-minded.  When you try to combine both sides of the coin into the same toolset, you inevitably end up with something that, to an outsider or an inexperienced developer, can look positively intimidating, as the sheer number of functions required ends up going over the tipping point.

Far Cry 2 provides an excellent and intuitive level editor... but it's limited by nature of its reduced feature set, making for a friendly but restricted SDK.
One of the clearest examples of this I've seen is Far Cry 2's map editor.  Built on the Dunia engine, the home version is stripped down to the essentials for those who want to build multiplayer maps.  The basics of terrain manipulation and object placement are here, as well as texturing and vegetation, and the interface is about as easy to figure out as any I've seen.  For those who want to just get in and start building levels, frankly, it's great, and for rookies especially it's excellent in helping to teach the basics of level design.

However, those who want more advanced functions - basic scripting, AI characters, triggers and mission objectives - are out of luck, and anything more advanced still, like manipulation of local and world variables, cutscenes, and animation, is also missing.  Including all those features in this toolset would make it far more intimidating and harder to learn (more interface elements means more to sort through, more room for error, etc.), which would be inappropriate for a level editor aimed at general audiences.

More generally, however, these cluttered interfaces which arise from the dichotomy tend to cause greater problems - namely, they make the tools hard to learn and even harder to master, as the number of functions they must cater to ends up outstripping the talents of any single team member.  What this ultimately means is that tools look cluttered and offer up, from an individual perspective, too much information to easily process and digest, and even for those familiar with the tools, can lead to a breakdown in workflow as more mental breathers are required to take everything in.

The Dragon Age Toolset offers up huge amounts of functionality, but keeps them quarantined in relevant areas to help keep content creation more intuitive.
One of the best solutions I've seen to this problem so far lies in the Dragon Age Toolset, BioWare's internal game editor for the Dragon Age series.  While the updated version compatible with Dragon Age II is a no-show for now, the original version still has a logical way of organizing information for the end user, regardless of experience level.  Effectively, game files, rather than being stored within large level archives, are each manipulated by an individual "mode" within the editor, and saved as individual files, i.e. *.dlg for dialogue.  This means that, while level designers can easily create maps in the level editor, game designers can then hop in and create the actual gameplay via the area editor, by placing NPCs, triggers and so on.  This extends to cutscenes, plots and quests, scripting, creature and NPC creation, and so on.  The compartmentalized nature of the Toolset makes it easy to learn and easy to break up into small, intuitive pieces while still appealing to the more advanced sides of each discipline.  I think more game SDKs should take on this approach - if the tools need to be unified, their individual pieces should still be small enough to learn, rather than monolithic and intimidating.
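The idea of one file type per editor mode can be illustrated with a trivial routing table.  This is a hypothetical sketch of the compartmentalization concept only; aside from the *.dlg dialogue files mentioned above, the extensions and mode names here are invented for the example:

```python
# Hypothetical illustration of one-file-type-per-mode routing.
# Only ".dlg" reflects the Toolset's real dialogue files; the
# other extensions and all mode names are invented placeholders.
EDITOR_MODES = {
    ".dlg": "dialogue editor",
    ".lvl": "level editor",     # invented extension
    ".crt": "creature editor",  # invented extension
    ".scr": "script editor",    # invented extension
}

def mode_for(filename):
    """Route a game file to the single editor mode that owns it."""
    for ext, mode in EDITOR_MODES.items():
        if filename.endswith(ext):
            return mode
    raise ValueError(f"no editor mode registered for {filename!r}")
```

The design benefit is exactly the one described above: each discipline learns only the one mode that owns its file type, instead of confronting a monolithic interface that tries to cover everything at once.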

Don't Waste Time

This might be a bit of a vague topic, but I think it's an extremely important one nonetheless.  It's quite common in game editors for certain functions to take an extremely long time to perform, even though what they provide is utterly essential.

The most common one I run into personally, and the most aggravating as a level designer, is that generating lightmaps, more often than not, takes just about forever.  I realize that, at a basic level, this is hard to control - lightmap generation is essential to modern 3D games, there are certain underlying factors that can't be hurried or sped up, and the size, complexity and number of lights in a level very often contribute to rendering time.  Even so, more often than not I find that, in many game editors, half my time is spent waiting for lightmaps to render rather than actually building more content.

UDK provides extremely high-quality in-editor rendering, which saves a huge amount of time in building levels.
There is an "easy" way around this lightmapping problem, and that is to have sophisticated and high-quality real-time previews of what in-game lighting will look like.  Things in the editor are rarely identical to the game owing to optimization, in-game options, video drivers, and any number of things, but the faster I can conceptualize how a level is going to look in the game, the faster I can build it, and getting a sense for lighting and atmosphere is absolutely key - in fact, I'd say there is nothing more important in creating a level than knowing what the lighting is going to look like by the end of it.  On top of that, lighting helps designers understand spatial relations; without real depth perception, lighting and shadows give a sense of depth that is otherwise lacking when flat, static lighting is all you have to go on.  If real-time previews are impossible, then at least provide me with fast, quick-and-dirty preview renders that take as little time as possible but still give a general idea of what the game is going to look like.

Another thing that tends to take a lot of time is testing out changes in-game.  It's easy to build content, but most time is generally going to be spent tweaking it, and fixing all the little issues and bugs that crop up.  In my experience, the more often you can test something in-game, the better.  Unfortunately, some game SDKs still do not support this adequately.  The Dragon Age Toolset mentioned above is just one such example, as it requires a full export of project files and a full game restart in order to see results - and that's not even counting how much time you might spend actually getting to the piece of content you want to test.

CryEngine 3, through some arcane magicks, is able to allow you to drop into a game level at any time and place, and test exactly as it appears in the game itself.  This.  Is.  Good.
Although I'm not certain whose editor was the first to do it, Crytek's CryEngine and CryEngine 2 really popularized the idea of being able to jump into the game world and test things out exactly as they would occur in the game itself.  I'm not privy to the technical wizardry behind this, and don't presume to know what kind of a feat it is, but suffice it to say that it makes building content much, much quicker on the design end, and means that content creators get real, accurate feedback on what they've done moments after it's been put into the game.  From a level creation point of view, this is especially important, as most know that placing 3D objects accurately without clipping errors, floating bits etc. can be a real pain sometimes, as can ascertaining a sense of scale relative to the player.  Moreover, being able to test out scripting in-game is extremely convenient, and allows bugs to be fixed in seconds instead of hours.

I'm sure there are dozens of other solutions that can save developers' time on the content creation front, everything from better standards between programs, to consistency in file formats, to removing or automating redundant and repetitive operations.  These two examples are the ones that stand out most for me, but depending on the field, I'm sure there must be dozens of other issues - for example, I know audio designers would probably be happy to never again have to bother converting their *.wav files into some obscure and disused file format using a stand-alone compression tool that only works properly on Windows 2000, and batch actions when importing scores of diffuse, normal, specular, etc. maps would be a godsend for artists.  Suffice it to say, any time-consuming tasks that could be made faster, should be made faster, end of story.

Standards, Standards, Standards

However, the fact is that oftentimes it's impossible to realize the demands for better tools, especially for developers who rely on their own technology.  Whether it's because someone on a development team has a certain obsession with a proprietary format, because game technology is purpose-built for a specific product, or simply due to a lack of resources, a lot of the ideas I've presented get left on the cutting room floor for one reason or another.

The solution to this, I think, might be controversial, but in the end I feel would be best for the whole games industry and the development process.  Standardization of tools has already begun, to a degree, with just about every developer out there needing experience in Photoshop, Pro Tools, Maya, etc. depending on the particular field, but even beyond asset creation, game tools have also begun to homogenize.  Ten years ago, accomplishing anything more than what a game editor was explicitly designed to do was a complex task involving lots of jury-rigging and a good deal of luck, and transitioning from one SDK to another was a long, arduous process that required a lot of re-training.  These days, the interfaces between SDKs are so similar, at least on a basic level, that most people should be able to jump in and create something useful within a week or two, and that is most certainly an improvement.

At the same time, much as we have those standards for asset creation, we don't really have standards for development tools themselves, despite the similarities.  I think, more than anything, what is necessary to cut down on development time and expenses is the creation of a unified SDK front-end common to every game engine.  The best analogue I can provide for this is Photoshop - while it can export a variety of image formats, compatible with just about any other program or device, the actual creation process is extremely standardized.  Photoshop's interface has come to define modern image editing, and to a degree, even 3D modeling and texturing tools like Mudbox, to the point where any artist, photographer, etc. can use Photoshop and create something suitable for any job.  The dream, for me, is to have a Photoshop for level editing, another for scripting, another for animation, another for audio work, and so on, so that skills would remain consistent and practical across just about any project, while the code underneath, the game engine itself, remains in the hands of the game developers.

Front-end SDKs similar to Photoshop could revolutionize and standardize game development... but potentially hurt it as well.
Adobe Photoshop is also a great example in itself because in many ways it has been razor-refined for its intended purpose.  It is well-organized and concise in presenting information and functionality, the tools are intuitive and self-explanatory, the software itself is supported by a huge range of hardware and runs extremely smoothly provided you have a system supporting GPU acceleration, it's highly customizable in terms of layout and keyboard shortcuts (something that can't be said for all SDKs), and it has defined the workflow for image editing across virtually all relevant fields.  Tools like this, from a content creation perspective, would be incredible for game developers - ones which cater to the specific needs of different individuals, with sets of logical and intuitive functions which speed the creation process.  Being able to envision my game levels layer-by-layer, painting level details with brushes and swatches, or adding fog and post-processing using simple brightness, level and hue controls would be excellent, and I'm sure those in other fields would find similar benefits in taking the best features from other professional tools.

There are a lot of pitfalls and problems to anticipate with such a solution.  One is that of the monopoly - namely, whose standards are selected for the industry as a whole?  While the colossal growth of Unreal Engine over the last ten years has come to define even the aesthetic of modern games and the workflow of game development, there are plenty of other equally capable alternatives.  Two, the technology that powers one game might not be suitable for another, so any tools would have to be fully inclusive of just about any game design, would need to be cross-compatible with other tools, and would need to be routinely updated in a timely manner so that developers could take advantage of them.  Three, videogames are a technology-driven medium, and innovation exists as much in tech as it does in design, unlike the film industry, where the filming and editing processes are by and large already cemented and it's the creative side that reigns.  Putting the development of some of that technology in the hands of a single company could stifle progress across the entire industry, even if it does lead to faster and easier development.  Four, whoever actually ends up creating the tools would most likely have to be an uninvested third party rather than an actual game developer - we already know that developers, and publishers especially, do not have much interest in using the competition's tools.


I have some confidence that, in the future, game development software will be able to answer these questions naturally in its own course, but I can also foresee the transition being bloody and fraught with problems and casualties along the way.  Most important, however, is to recognize that all this is easier said than done - the real-world limitations of development, economics, copyrights, and everything in between represent a huge barrier, and whereas other fields like film have had the better part of a century to standardize, the games industry has only really been operating in such an organized and standardized fashion for the last decade or so.  What I've covered here is only a fraction of potential interface and workflow improvements for game development tools, and I'd be intrigued and delighted to hear any other suggestions that might come up.

Monday, October 17, 2011

Ding! The Devaluing of the Level-Up

Whether it's the swell of an orchestra, the thunder of a distant war drum, a chorus of angels, a guitar riff, or a simple, distinct "ding" sound, everyone loves to hear it: the chime that signifies a level-up.  Although it has its roots in role-playing games, the notion of leveling up abilities, characters, items, and more has crept its way into just about every facet of gaming - so much so that it's nearly impossible to find a game which doesn't have some sort of leveling, experience points, or an equivalent progression system realized and presented in a numeric fashion.

You'd think that, as a fan of role-playing games, I'd love all this talk of levels, stats and attributes, and relish the chance to up those numbers no matter what the context - after all, there's nothing I enjoy more than a good RPG.  On the contrary, however - the more games I play which involve leveling and progression in such strict, metered and discrete ways, the more I tire of leveling.  In this article, I'll first outline my own general conceptualization of what levels represent, and then I'll get into exactly why I feel the move towards leveling up in just about every game genre out there has contributed to the devaluing of the concept.

Conceptualizing Leveling

Role-playing games, having their roots firmly in the tabletop space, borrow many of their conventions from the mechanics necessitated by the limitations of the tabletop itself.  To put it plainly, there is no "hood" to look under in the tabletop realm - all of the gears, the machinery, the underlying operations are exposed for all to see, and any pretense of fantasy, fiction, and aesthetic come solely out of the imaginings of the players involved in the game.  Effectively, a role-playing game is a set of raw mechanics which interact with one another, in order to facilitate cooperation, interaction and competition within a fictionalized world outlined with a set of natural and physical rules, i.e. the ruleset.  All the elves, dwarves, and magic that so many players love are ultimately secondary to the mechanics of the game itself, despite the fact that it's those aesthetic elements which are so iconic of Dungeons & Dragons.

Traditionally in RPGs, leveling has been about expressing a set of rules about the world - not about the Drow Ranger in the character portrait.
Within this framework, with the numbers exposed, a concept like "leveling up" makes a lot more sense.  Everything in a tabletop game is expressed in a numeric fashion, governed in a consistent and mostly predictable way, and leveling up is just one way of understanding the progression of characters and abilities.  Furthermore, although we tend to think of leveling up in terms of character level, most pen and paper games have plenty of other ways to level up - feats that need to be purchased, attributes that need to be raised, etc.  Many systems will allow players to freely level up in different classes as well, allowing for a significant amount of variety and control over progression.  These tie in with the aesthetic and our understanding of the game - the "I'm a half-elf sorcerer!" fantasy - but at their lowest level, these are merely mechanics.

What this all means is that leveling up is not an end in itself, but rather a way of understanding the progression of a character's ability and proficiency - not just in the vague and general way it's realized in most videogames, but in very particular, controlled and specified ways, often along multiple paths simultaneously.  Leveling up is just one aspect of a much larger system, and while perhaps one of the most rewarding of those aspects, it is still ultimately only a very small part of what makes up that complex set of interactions and rules.  Leveling up may tie into that fantasy, our sense of empowerment and progress, but like my half-elf sorcerer, that's all something built on top to provide meaning to the experience, rather than something integral to those operations.

Defining Progress

Usually in videogames, progress can be expressed in some pretty intuitive and self-explanatory ways - completing a mission, gaining a new weapon, item or ability, killing a powerful boss, beginning another chapter in the story, moving from one environment to another, and so on.  All of these contribute to a feeling of movement through the game, an expansion of gameplay mechanics, and the overall sense of pacing that keeps the game interesting throughout its running time.

In The Legend of Zelda, we might emotionally respond to the acquisition of a new item, but what really matters is that we're able to move forward in the game - the Blue Candle, while snazzy and fun to think about, ultimately just grants us a new set of abilities that we can capitalize on.  The same is largely true of getting a new weapon in Half-Life 2 - the SMG allows me to take on far more enemies than before, and the grenade launcher attached to the barrel is useful for clearing out rooms in one fell swoop, but those are functional things.  The sound, the feel, the look of the weapon are all important, certainly, but again, it's what this new weapon allows us to do, what hole it fills in our repertoire, that really gives it its staying power and its gameplay function.

In Psychonauts, progression comes through acquisition of points and manifests as different abilities - is this all that different from the XP and levels that govern other games?
 Of course, leveling up is just as much a form of measuring progress as acquiring a new weapon might be, so long as leveling up actually contributes to an increase in the player's abilities or provides new options in solving gameplay challenges.  In fact, the difference between framing progress as "a new gun" or "boots that let me jump higher" and as a level up on a character sheet is actually much smaller than what might initially be apparent.  In Psychonauts, for instance, I get new abilities by earning new merit badges via collecting arrowheads, imaginary figments and mental cobwebs, either via my own exploration or through story progress - if I called these "levels" and "experience points" instead, would I suddenly have a more compelling progression mechanic?  I doubt it.

Some might argue that systems revolving around experience points and levels are inherently more open-ended, flexible and so on.  It's true, certainly, that leveling up generally tends to be a bit more freeform than more traditional conceptualizations of progression - however, this is another case of aesthetics deceiving people.  Whether a game has a linear progression or an open-ended one is a structural concern, not something that comes hand in hand with the leveling system itself, even if we do tend to think about the two in slightly different ways and conceptualize them accordingly.  After all, there are plenty of linear RPGs with leveling mechanics built into them, just as there are plenty of non-linear action games without any real leveling to speak of - the difference is superficial whether you're shooting gangsters or slaying bugbears, earning gold or bullets.

The Degradation of Context

So, if leveling up is an expression of progress within a strict system of rules and mechanics, and if progress can be expressed in myriad ways without fundamentally changing the gameplay itself, exactly what's wrong with leveling?  It's a bit of a complex answer, but generally it concerns the degradation of the context in which leveling up has traditionally taken place - rather than existing within the bounds of a ruleset, leveling has by and large been transformed into the sole measurement of progress within all games, regardless of genre, and the result is that leveling no longer feels significant to me in the way it used to.

Largely as a result of games like Call of Duty 4: Modern Warfare and Gears of War, but also partially due to overarching online networks such as Xbox LIVE, the idea of leveling up and metagame progress has, over the last several years, begun to supersede the focus on progress within the game itself.  Now, just beating the game and getting a good record in the multiplayer component isn't enough - you've got to level up to 60, max out those weapon challenges, and have the biggest, baddest profile around.  Is there a particular reason why leveling is in the game?  I really don't think so - sure, it keeps players playing because the numbers keep going up, but the act of leveling feels completely divorced from and even contrary to the game itself.

Call of Duty has more ways to grind XP than Final Fantasy, but what relation does this progression system have to the gameplay itself?
Put simply, the leveling which has ended up in shooters, beat-em-ups, action games, and so on almost never has any direct relationship with the game mechanics.  In RPGs, leveling up is just one way in which the rules of the world are expressed - when I've become sufficiently experienced with something, I am able to gain new talents and improve my attributes, simulating the natural increase in ability over time that anyone in a given profession can attest to.  In leveling up, I improve my character, and I do so in a way that makes sense within the game world - and in a way that the game world is able to summarily respond to.  This is even more evident in more open RPGs, like Fallout, where improving, say, the stealth skill can open up an entirely new path to complete a task, which the game then acknowledges.

But what about a shooter - exactly how does getting 1,000 kills have anything to do with unlocking a new scope for a particular gun?  Sure, you can try to justify this in some way that the player has now "earned the right" to use better equipment, but this is rarely if ever formalized in the game, except perhaps in some vague suggestions of rank along with those character levels.  The fact is that these unlocks, these "levels" gained by the player, do not really have any direct correlation to any consistent simulation of the world - they are arbitrary in the extreme, existing only as a carrot to keep the player moving forward on the treadmill, with the only end either boredom, or the inevitable sequel.

What's more, these arbitrary and contextless mechanics tend to tie into the achievement and trophy systems found on persistent online networks like Xbox LIVE, PlayStation Network, and Steam.  It's gotten to the point where challenges in games exist for their own sake, not because they're really fun for the player, because they add anything to the game mechanics, etc.  Do the collection quests in Gears of War tie into the storyline much, if at all, or do they give the player new abilities or bonuses?  No, not really, but... well, here's a little badge to tell the world you scrounged in the dirt like a moron for an extra ten hours!  While achievements can sometimes be inspired, by slapping an experience mechanic onto a shooter, you also give yourself a lazy excuse to turn those achievements into mere milestones - so now not only is experience divorced from the context of the game, it also exists to satisfy a system entirely outside the game itself.

Leveling as an End in Itself

Worst of all, however, is that this new understanding of leveling up as a treadmill, rather than as a logical outgrowth of the rules of the game world, has also begun to define modern RPGs, not just the shooters and action titles which hold up leveling as a pretense of depth.  One of the most telling quotes I've heard about RPGs in recent years, regardless of the original context, comes from a Torchlight developer: "RPGs are always best when the numbers are going up."  I think, in a certain sense, I can agree with this - it's always good for the player to be making progress in the game and moving forward, and that's true in pretty much any game genre whatsoever.  Giving strong feedback on that progress is also one of the bigger parts of the art - after all, the derogatory term "corridor crawler", if nothing else, implies a static experience.

Torchlight may have a million numbers, but that's all they are - there's no real justification or meaning behind them.
Where this mentality breaks down is that it begins to forget exactly what purpose leveling up serves in the first place.  I've already touched on the appropriate context of leveling mechanics, so I won't go into that again, but suffice it to say that there needs to be a consistent and strong basis for including such a mechanic in your game.  As a designer, one shouldn't be content to say things like "well, it's an RPG, therefore we've got to have leveling up."  It's both practical and good design sense to look at those mechanics and question exactly what role they serve within the game, and adjust them accordingly.  Sure, it's good to make progress in a game, but is doing so through discrete XP gain, leveling up, and new skill points always a good thing?  I can't answer that question definitively, because it's inherently subjective, but the important thing is to ask in the first place, and genuinely try to provide an answer - otherwise, creatively, you are running on the very same treadmill you've given your player.

The sad truth of the matter is that, at this point, I don't think leveling up in games, and RPGs especially, has much at all to do with leveling in the more traditional sense I discussed above, where the goal is to understand and articulate character progression within a strict, organized framework.  Rather, it's based on one thing: the desire to see those numbers keep going up, and up, and up... to where?  Considering the constant demands by players to see level caps raised in expansions, patches and DLC (to the point where that is now a selling point in and of itself, as in World of Warcraft), the desire for more perks, skills and weapons which break the game balance even further beyond what a maxed-out player could previously pull off (another DLC fodder item), and the tie-in to persistent online personas, achievements, and multiplayer profiles, the only true answer I can give is "to nowhere... at least, until the sequel comes out."


Again, I like leveling up as much as the next person.  The prospect of gaining a new ability to play with, of being able to wield a new weapon, or just knowing my character is even better in a fight - all of that appeals to me.  At the same time, it feels to me as if leveling up has lost its meaning, becoming an abstracted, context-less ideal of progression that is no longer situated logically within a game world or ruleset.  Ultimately, the choirs of angels, war drums, and guitar leads are, in today's games, functionally equivalent to the "ding" of a Pavlovian dinner bell... and in reducing such a mechanic to a carrot on a stick, a stimulus-response algorithm, we degrade not only the fabric of role-playing games, but the depth and breadth with which we understand progression in all games as well.

Tuesday, October 11, 2011

Understanding Difficulty

Although we frequently have discussions about difficulty in games - is it too hard?  which parts did you have trouble with?  was it too easy and therefore boring? - we rarely direct our attention to the fundamental types of difficulty which make up our experiences and colour our perceptions of the challenge a game provides.  In this article, I'd like to go over a few of the most basic types of difficulty and the problems associated with implementing them, and to bring out that it's often not just the sheer challenge of a game that matters, but the nature of those challenges.

Trial and Error

The first, and most obviously identifiable type of difficulty that we find in games, and by far the most common, is trial and error.  Put simply, trial and error revolves around getting the player to perform a task, either through experimentation (i.e. "I don't have anywhere to go, maybe I'll try this") or outward suggestion (i.e. "these are your orders, soldier, now move out!").  At least theoretically, the main difficulty this presents to the player is that the degree of challenge (types and numbers of enemies, for instance) will always be slightly higher than what the player is comfortable with, meaning that he or she will have to rise to the occasion in order to come out on top, either by trying out new tactics, by taking greater risks, or through sheer force of will and dumb luck.

As many of us can attest, trial and error difficulty treads a very fine line.  Typically, too many failures, and players will become frustrated, while too many successes and players will feel as if the game isn't going hard enough on them.  The main issue with this, aside from basic balancing, is that different players have different thresholds for difficulty.  Whereas a more casual player who's just enjoying a game for its story will find that more than the occasional death is a turn-off, the hardcore player who plays on the "insane" setting will want to be challenged at every turn and made to work for every single victory.  Ultimately a developer might run into a situation where they're balancing not just one, but three or four versions of the same game, due to the different needs of different players.

Of course, pacing is also a chief concern, by and large governed by the ebb and flow of difficulty, usually of the trial and error nature.  The player needs portions of the game which fly by quickly and without too much issue, breaks in combat to absorb the world and feel unchallenged, and nail-biting experiences that are tense and carry a feeling of urgency.  Building these into a game while taking different gameplay preferences into consideration is a difficult process; after all, while it can be easy to balance a single encounter to give the player the desired experience, doing so within the context of a full game is another thing entirely.

Adaptive difficulty settings are one way to get around this problem.  On the most basic level, this will typically change the amount of resources (health, ammo, etc.) provided to the player, as well as the proportion of powerful versus weak items based on the player's performance (i.e. more "full heal" pickups if the player is struggling).  This feature is actually extremely common in games, either because developers want to avoid providing separate difficulty levels (a poor decision in my mind), or because players have a curious habit of selecting difficulty levels that aren't appropriate for them (everyone has a different understanding of what "normal" should be).

Screw Alyx, these crates were my best friends in Half-Life 2.  Turns out the reason was a bit more calculated than my platonic love of all things boxy.
 Adaptive difficulty can be both explicit and hidden from plain sight.  Prey, for example, has adaptive difficulty as a toggle option in the game's options screen, and so it can be disabled based on the player's preferences.  Half-Life 2, on the other hand, while providing three difficulty settings (easy, normal and hard) also has a layer of code dedicated to analyzing the player's progress in the game, level of resources, the ease at which certain encounters are completed, and so on; the game will then adjust the items enemies drop, the amount of resources available in breakable crates, and so on in order to make sure the player is always kept on edge by having "just enough" health and ammo to get through an encounter, but never quite enough to feel completely safe or fully-loaded.  Other games will implement it in still subtler ways, like allowing the player to finish off a tough boss monster just a little bit more quickly than normal if the player's death is imminent, creating a dynamic feeling of getting through by the skin of his or her teeth.
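To make the idea concrete, here is a minimal sketch of the kind of supply-scaling rule described above - not Valve's actual code, just an illustration in the same spirit, with all function names and numbers being my own hypothetical choices:

```python
# Illustrative sketch of adaptive supply drops: the scarcer a resource,
# the likelier (and larger) the refill, so the player stays "on edge".

def crate_drop(player_health, player_ammo, max_health=100, max_ammo=300):
    """Decide a breakable crate's contents from the player's current state."""
    health_need = 1.0 - player_health / max_health   # 0.0 = full, 1.0 = empty
    ammo_need = 1.0 - player_ammo / max_ammo

    contents = []
    # Drop health only when the player actually needs it, scaled so a
    # near-death player gets a meaningful (but never full) refill.
    if health_need > 0.25:
        contents.append(("health", int(25 * health_need) + 5))
    if ammo_need > 0.25:
        contents.append(("ammo", int(60 * ammo_need) + 10))
    # A comfortable player gets junk, so supplies never stockpile.
    if not contents:
        contents.append(("nothing", 0))
    return contents
```

The thresholds and amounts here are the designer's levers: raising the 0.25 cutoff makes the game stingier, while the constant offsets guarantee a struggling player always gets something usable.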

The biggest issue for me with adaptive difficulty is that, when left as a built-in feature that can't be disabled, it removes control from the player's hands.  Although I'll usually take an entertaining and engaging experience over one that's simply difficult for the sake of being difficult, I also fully understand that some players don't want hand-holding unless they explicitly ask for it.  Furthermore, adaptive difficulty can also lead to a feeling of predictability and sterility, without a hand-made feel to encounters (a major source of criticism for The Elder Scrolls IV: Oblivion).  To this end, I feel that adaptive difficulty is best left as it is in Prey - a toggle switch in the options menu - or made specific to a difficulty level, with the hardest mode taking off all assists; this mitigates the problem of too much challenge by allowing the player to rationalize it as his or her own choice (i.e. "well, I picked hardest, I should have known it would be too much for me").

Endurance & Attrition

Another way to test the player focuses on the long term rather than the short term.  All forms of endurance, at their most base level, revolve around resource management, with the player given a limited quantity of a valuable or vital item, its distribution carefully controlled.  Resources are controlled in three main ways in just about every game:

  1. "Random" drops.  It's quite common for enemies to part with valuables when defeated, or for the player to uncover supplies in crates, chests and so on.  By tinkering with the tables that control those supplies, based on difficulty, the player's progress, the amount of resources the player already has, and the player's level of ability, character level, number of party members/companions, and so on, difficulty can be precisely controlled and monitored in order to provide a degree of challenge.
  2. Attrition rate.  Depending on the game, the rate at which a player burns through supplies can be highly variable.  For instance, in a shooter, going up against a tough boss monster might not consume too much ammunition, but may consume a huge amount of health.  Conversely, going up against many smaller hordes of enemies will leave a player ill-equipped to proceed, but, chances are, a healthy one.  Learning to anticipate what the player needs in order to continue in the game is important.  If a game uses an adaptive difficulty system, this might already be handled, but even so, careful consideration of how quickly the player goes through certain resources will lead to better encounter design and a game that feels more alive and responsive to the player's needs.  Strategically denying certain resources can be just as important as strategically providing them, too, in building tension and pacing the player's progress.
  3. Player ingenuity.  Most common to role-playing games, smart players will often stock up on useful items like potions and ammunition before heading out into a difficult encounter; the duration the player can stay out in the wild before returning to stock up on supplies again is by and large controlled by the player's prior action, as well as whatever the player might uncover during his or her outing.  This is one thing that is hard to control in a game, and frankly, shouldn't be.  Keeping aware of what players can and can't do, and building challenges around that is a good thing, as are systems, such as encumbrance and fatigue, which can provide a soft limit on how much the player can carry.  However, imposing unreasonable hard limits (i.e. "you can only hold three health potions at once") rarely feels like a fair way of managing this.
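The drop-table tinkering from the first point above can be sketched as a weighted table whose odds shift with the player's state.  All item names and weights here are hypothetical, purely for illustration:

```python
# A difficulty-aware drop table: the table itself is fixed, but the weights
# are rescaled by the player's current health, so a scarce resource becomes
# a likelier drop the more the player needs it.
import random

BASE_TABLE = {"health kit": 1.0, "ammo box": 2.0, "gold": 3.0}

def roll_drop(player_health, max_health=100, rng=random):
    need = 1.0 - player_health / max_health          # 0.0 = full, 1.0 = empty
    weights = dict(BASE_TABLE)
    weights["health kit"] *= 1.0 + 3.0 * need        # up to 4x likelier when hurt
    items = list(weights)
    return rng.choices(items, weights=[weights[i] for i in items], k=1)[0]
```

The same rescaling trick extends to any of the factors mentioned above - difficulty setting, party size, character level - each becoming one more multiplier on the base weights.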
You don't need to be Arcanum to have compelling attrition and resource management (but it helps).
Long-term attrition may not be suitable for many games, but looking at attrition in different ways can reveal interesting opportunities for mechanics that might go unnoticed at a casual glance.  For example, a puzzle game like Tetris has a strong element of attrition in the sense that the available space on the game board is continually shrinking based on the player's performance, the difficulty level, and which puzzle pieces the player is provided with.  On top of that, reaction time is another gradually-depleting resource the player must carefully manage, as things move quicker and quicker over the course of the game.  There is a veritable economy of space and time in Tetris, even though there is no health bar, ammunition counter, etc. to speak of.  Recognizing that attrition and endurance can exist as more than just basic physical resources will help flesh out and provide depth to existing mechanics.

"Fake" Difficulty

A subset of trial and error difficulty, what I'll term "fake difficulty" here is actually quite common in the games industry, though how it manifests depends a good deal on the genre in question.  Fake difficulty covers a fairly broad spectrum, but common to all of its permutations is that they typically revolve around tricking the player or bending the rules of the game in order to provide their challenge - often causing significant frustration and annoyance for players, whether they're wise to those tricks or not.

One of the most common forms of fake difficulty actually fits within the category of adaptive difficulty - namely, it revolves around manipulating the rules of a situation in order to provide the player with increased challenge, usually referred to as "rubber-banding".  The key difference is that while adaptive difficulty works in favour of the player (for example, you'll find 50% more health kits if you're low on health), fake difficulty tends to work in favour of the enemies or opponents.  However, since enemies rarely compete on fair terms with the player, and in fact tend to use an entirely different set of rules, this usually means that the bonuses given to the player's opposition fall into the realm of super-human - increased speed beyond normal limits, temporary damage boosts, the ability to negate the player's own abilities when normally they can't, and so on.

A great (and persistent) example of this type of difficulty can be found in Mario Kart - in fact, the series is somewhat infamous for it.  The goal of the game's rubber-banding is to provide a tense and exciting experience for the player, making sure that each race is as close a finish as possible and that opponents are always able to keep players on their toes; in the long run, though, or for more experienced players, this form of difficulty tends only to breed contempt.  The illusion is often enough to fool players of a lower skill level, as the effects are much more subtle and can often work in the player's favour.  But when that same system is put up against players who can make a mockery of even the highest difficulty levels, the computer is forced to go to incredible lengths to keep up, to the point of blatant cheating - gaining items and abilities far in excess of the player's, and even defying the laws of physics (or whatever analogue exists in the Mushroom Kingdom).
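A toy version of such a rubber-banding rule makes the mechanism (and its failure mode) obvious - these numbers are hypothetical, not Mario Kart's actual logic:

```python
# Rubber-banding sketch: an AI racer's top speed scales with how far behind
# the player it is, clamped so the boost (or throttle) has a hard ceiling.

def ai_top_speed(base_speed, ai_position, player_position,
                 strength=0.02, cap=1.3):
    """Return the AI's allowed top speed given the gap to the player.

    A positive gap (AI behind) boosts speed; a negative gap (AI ahead)
    throttles it.  The multiplier is clamped to [1/cap, cap]."""
    gap = player_position - ai_position          # in track-distance units
    multiplier = 1.0 + strength * gap
    multiplier = max(1.0 / cap, min(cap, multiplier))
    return base_speed * multiplier
```

The contempt the article describes comes from that `cap`: against a strong enough player, the AI is pinned at its ceiling the whole race, and the only way for the designer to keep races close is to raise the cap into openly super-human territory.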

Another form of fake difficulty that rears its head is that of the false challenge (which, I admit, sounds a little redundant).  In the false challenge, the player is typically asked to perform a standard feat - defeat some enemies, race to the finish in the allotted time, etc.  However, what starts out as a relatively routine task quickly turns out to be an extreme test of reflexes and ability, as the player is beset with all manner of unpredictable obstacles, traps and powerful enemies.  The key thing is that in all of these situations, the player is caught off guard and unable to sufficiently prepare.  Usually, this results in a quick and frustrating death, as the player likely felt he or she was successful up until that point.  Worse still, usually the only way to surmount this type of challenge is to try it again, often from the very beginning of the sequence, armed with the foreknowledge of the hidden challenge ahead.  When these are compounded one after the other, it can lead to rage-inducing moments for the player.

You can't see it here, but in 0.25 seconds, the driver in the blue car is going to develop a sudden case of sociopathy and swerve straight into the player's bike.
One game series which is notorious for this is Grand Theft Auto.  While the games' mission-based structure suggests that the challenges faced are relatively self-contained and straightforward, it's very common for them to prey on the player's expectations in the worst way possible.  One example from Grand Theft Auto: Vice City I frequently cite is a race sequence where the player has to reach a number of checkpoints in a set time.  No big deal, right?  That would be the case, if it weren't for the fact that other cars, trucks etc. are scripted to pull out around difficult corners just as the player passes by at full speed - the player is almost guaranteed to hit these cars and ruin his or her attempt outright, unless he or she is able to slow down and let them pass instead.  This doesn't just happen once, but close to five or six times throughout the race, meaning that even if the player does everything right, there's still a huge statistical probability that he or she will fail anyway, solely due to the designers pulling a fast one.  A similar occurrence can be found in Max Payne, where enemies are scripted to throw grenades at the player at certain triggers, and these are literally impossible to avoid without prior knowledge.

Suffice it to say, fake difficulty, no matter the variety, isn't fun for players, even if it's built into the game with the best of intentions.  Although often the goal is to provide an unpredictable or challenging experience regardless of the player's skill level, more often than not it just comes across as mean-spirited, and at worst, can completely turn a player away from the game by rendering attempts at competition null and void.  Unlike most forms of difficulty, this type is actually best avoided altogether, unless your goal is to make players hate your guts.

Random Number Gods

Although this is typically a type of difficulty reserved for strategy and role-playing games, random mechanics do exist in a wide variety of genres, whether they manifest in terms of how enemies behave in combat, the spread and accuracy of weapons, or whether or not the player is able to sneak by a foe successfully.

I've occasionally seen mechanics based on random elements derided by people, claiming that it takes away from the skill of the player to hinge success upon unpredictable odds.  The key thing to understand about building difficulty out of a random number generator is that challenge is not substituted for "luck", as some might claim.  Rather, difficulty arises as the player is forced to respond intelligently to new developments that aren't entirely predictable - it is the culmination of actions over a period of time that are important, not the individual actions themselves.  Unlike trial and error, which typically tests reflexes and coordination, systems built on random elements test the player's ability to respond to change and to cope with new situations.

It's also important to note that random elements are a staple of all types of games, regardless of whether difficulty is provided by trial and error, by manipulation of odds, or, ahem, by fake difficulty.  When driving a car in a racing simulation, for instance, there's bound to be some random effect on the vehicle's handling, or on varying types of terrain, even if it's only a small piece of the overall picture.  There is absolutely nothing wrong with this, because player skill can usually account for random elements anyway.  More to the point, random doesn't necessarily mean unpredictable - it just means that there's a certain degree of noise or interference in playing the game, to prevent things from playing out exactly the same way every single time.  Without it, every game of Tetris would serve up the same blocks in the same order, and that wouldn't be nearly as fun to play, as the game is built wholly around bringing a degree of order to that randomness.
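Controlled randomness of this kind is often implemented as a "bag" shuffle - modern Tetris games famously deal pieces this way.  A minimal sketch (the function name and piece set here are illustrative, not from any particular implementation):

```python
import random

def bag_randomizer(pieces):
    """Yield pieces endlessly: shuffle one full 'bag' containing every
    piece once, deal it out, then refill.  The order still feels random,
    but no piece can go missing for long and none can flood the queue."""
    while True:
        bag = list(pieces)
        random.shuffle(bag)
        yield from bag

gen = bag_randomizer("IJLOSTZ")
first_bag = [next(gen) for _ in range(7)]
# Every bag contains each piece exactly once, in a shuffled order.
assert sorted(first_bag) == sorted("IJLOSTZ")
```

The result is noise without chaos: the player can't predict the next piece, but can rely on the long-run distribution being fair.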

Unfortunately, when building systems out of random number generators, particularly in role-playing and strategy games, it's easy to fall prey to a problem - not in the mechanics themselves, mind, but in the player's perception and understanding of them.  This usually manifests as what's commonly called the gambler's fallacy.  The simplest example is a coin flip.  Even though a coin only ever has a 50/50 chance of landing heads or tails (assuming a fair toss), we tend to apply that 50/50 probability to the whole sequence of events rather than to each isolated event.  In other words, we form a narrative as we flip that coin over and over again, perceiving each toss not as a single incident but as part of a larger whole - and so we also tend to assume that prior events influence future ones, or, put simply, that the more often the coin lands heads, the better we think its chances are of landing tails.
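The fallacy is easy to demonstrate by simulation.  The quick sketch below checks whether, after three heads in a row, the next flip favours tails - if past flips mattered, the measured rate would drift above 50%:

```python
import random

# Simulate a fair coin and look at the flip that follows every
# three-heads streak.  True = heads, False = tails.
rng = random.Random(1)  # seeded so the experiment is repeatable
flips = [rng.random() < 0.5 for _ in range(200_000)]
after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                if flips[i] and flips[i + 1] and flips[i + 2]]
tails_rate = after_streak.count(False) / len(after_streak)
print(round(tails_rate, 2))  # hovers around 0.5: the coin has no memory
```

However long the streak, each toss remains an independent 50/50 event.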

Frayed Knights is about as fun as a game with spells like "Exploding Kneecaps" can be, but I often got the sense that the Random Number God was out for my blood.
In gaming terms, this can be described in the context of a turn-based role-playing game.  A skill might have a 70% chance of success when used, yet we become frustrated when, turn after turn, the skill misses and we waste both time and resources trying to rectify the problem.  What just happened?  Surely the game is fudging the numbers!  Well, no, not really.  We assume that, because the skill has a 70% chance of working, it should (or will) succeed seven times out of ten, like clockwork.  This is, of course, not at all the case: each individual attempt has the same odds as the last, and therefore it's possible to chalk up a huge string of losses despite what should be good odds.
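Putting numbers on that intuition makes the gap obvious.  A quick back-of-the-envelope calculation for the 70% skill above:

```python
from math import comb

p = 0.7  # advertised success chance

# "Seven out of ten, like clockwork"?  Exactly 7 successes in 10
# independent attempts is the single most likely outcome, yet it
# happens barely one run in four.
exactly_7 = comb(10, 7) * p**7 * (1 - p)**3
print(round(exactly_7, 3))   # 0.267

# And the odds of a 3-miss streak on any given three attempts:
print(round((1 - p)**3, 3))  # 0.027 - rare per sequence, but near
                             # inevitable over hundreds of rolls
```

A player who uses that skill a few hundred times over a campaign is almost certain to see an "impossible" run of misses at some point, with the math working exactly as advertised.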

There's no easy solution to this problem, because you aren't battling the numbers, you're battling player expectations.  Many developers actually get around it by instituting measures to make random odds more predictable in practice.  For instance, given that 70% chance of success, I might program a clause into the game making it impossible to miss more than once in a row - even though the math is then completely off.  That's right: often, the random odds most players feel they rely on aren't actually random at all, but are manipulated to fulfill players' expectations.  The irony is that players usually only ever notice a "problem" when the math is correct in the first place.  This is obviously a controversial decision, and not everyone will agree with it, but in the end it's probably better to fulfill player expectations than to leave those same players frustrated over what they feel are unfair and incorrect odds.
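A minimal sketch of the kind of "fudged" roll described above - the class name and pity rule here are hypothetical, not taken from any particular engine:

```python
import random

class ForgivingRoll:
    """A 70%-style check that can never fail twice in a row.
    The advertised odds stop being mathematically true (the real
    success rate climbs to roughly 77%), but the results line up
    far better with what players expect to see."""

    def __init__(self, chance, rng=None):
        self.chance = chance
        self.rng = rng or random.Random()
        self.missed_last = False

    def roll(self):
        if self.missed_last:        # pity rule: guaranteed hit after a miss
            self.missed_last = False
            return True
        success = self.rng.random() < self.chance
        self.missed_last = not success
        return success

check = ForgivingRoll(0.70)
results = [check.roll() for _ in range(1000)]
# The pity rule makes back-to-back failures impossible.
assert not any(not a and not b for a, b in zip(results, results[1:]))
```

More elaborate versions of the same idea nudge the probability up gradually after each miss instead of forcing a hit outright, which hides the manipulation even better.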

Presentation is Everything

The header here might draw some flak, but I think this is a lesson that very often goes unsaid when designing games, and yet is one of the most important to learn.  Difficulty, as I've outlined, comes in many flavours and is highly subjective - but it's also important to recognize that the way difficulty is presented to the player is just as important, if not more so.  Similar to the gambler's fallacy, sometimes it's not a particular mechanic that's the problem, it's the way players perceive it that's at fault.

Let's take a recent example: Dead Money, the Fallout: New Vegas DLC add-on.  It came under attack from players and press alike for what they perceived as a steep difficulty curve.  In Dead Money, the usual boundless freedom of Fallout gives way to slavery, as the player is thrust down a very specific and mostly linear path through the game by way of a bomb collar, which instantly kills the player if he or she strays too far from the beaten path.  Many of the challenges rely on destroying the radio transmitters that broadcast the detonation frequency, which are often hidden underneath tables, inside closets, or are otherwise difficult to reach.  The goal is to create tension as the player desperately rushes to find the transmitter before his or her head is explosively removed.

It's easy to see, even from an outsider's perspective, why this mechanic would frustrate players.  For one, the bomb collar produces a high-pitched, persistent beeping when under threat of detonation, which players quickly learn to avoid like the plague.  There's also something particularly demeaning about being enslaved in such a way by the antagonist.  Other games that do this typically ensure the player regains his or her freedom quickly - while it's a good way to breed contempt for the villain, draw it out too long and that contempt falls onto the game developer instead.  Last, this kind of enforced limitation goes against what most players take the newer Fallout games to be, namely open-ended role-playing games with a variety of solutions for every situation; in Dead Money, there is frequently only one solution, and it's often the one players aren't happy with.

Bomb collars got you down?  Don't worry, we've got a special offer on brain surgery to keep your spirits up!
However, the problem with Dead Money isn't the mechanic itself.  Analyzed at a basic level, all it is is a race against time to remove an environmental threat - turn off the switch before you die.  The bomb collar, while effective in terms of the storyline, could have been replaced with any number of similar mechanics and still have worked just as well.  More importantly, those alternatives wouldn't have been nearly as frustrating to players.  For example, radiation and toxic hazards are extremely common in the Fallout world - why, then, didn't Obsidian implement the same threat as radiation with, say, vents to clear it up?  Interestingly, this variation actually exists in Dead Money, but is used to a much lesser degree.  Had the bomb collar been replaced with a functionally identical mechanic less at odds with Fallout's design tenets, I think there would have been far fewer complaints about the game's difficulty.  The challenge would have been perceived as fitting far better into Fallout's world, and as less limiting overall - after all, radiation or acid blocking your way is a far more incidental threat than the villain's scheming, which if anything comes across as deliberate griefing.

Looking around, you'll find more and more examples of the perception of difficulty being a bigger problem than the difficulty itself.  I can think of a few off the top of my head - the jarring and repetitive taunts made by the bosses in Deus Ex: Human Revolution, for instance, grate on the nerves even though the boss fights themselves aren't overly challenging with a little preparation.  In fact, associating a character with a given type of difficulty (say, Bowser and his castles in Super Mario Bros.) can quickly frustrate and annoy players when that character is already rather grating, or when the game mechanics themselves aren't enjoyable - it gives people a face to yell at.


This analysis, while far from complete, should have given a good overview of a few different types of difficulty, and should also have made it a bit clearer why people get upset at different games, different scenarios, and different sorts of challenge.  Creating and fine-tuning difficulty is an ongoing process, and it's extremely difficult to get right for all players.  Even so, hopefully this piece has shed some light on exactly why that is, and on what steps can be taken at a more fundamental design level to ensure that your game is fun and challenging without also being frustrating.