On a recent episode of the gaming podcast Rebel FM, IGN editor Arthur Gies said, “I think most gamers wouldn’t know a good story if it bit them in the ass.” He expressed his concerns cautiously at first, aware of the serious troll-baiting he was about to do. Instead of ushering in a new Age of Enlightenment, the Internet’s given us an Age of Entitlement. And few niche cultures wear their entitlement so proudly as gamers do.
Gies’s comments spawned from a discussion of Crysis 2’s story, which on the surface seems no more unique than that of any other shooter. Aliens are attacking. You better stop them. Despite the presence of these familiar tropes, Gies argued, the tertiary material (environmental items such as emails) fills in a picture well worth looking at.
There’s one problem: the material’s tertiary.
Emergence, that great boon to and bane of developers everywhere, is gaming’s most unique trait. It’s also the biggest storytelling liability.
When I read a book, I ingest the words in the order the author intended. When I watch a movie, the frames flip by in the exact manner agreed upon by the director and the editor. If I don’t, I’m not reading the author’s book or watching the director’s movie.
Game developers, however, create an interactive space that lends agency to the player. No matter how tight the corridor (ahem, Call of Duty), we can always take a step to the left instead of the right. So when they want us to encounter a chunk of narrative, developers have two options: cleverly steer us to it or wrest control from us and ladle the story down our throat.
The latter is certainly the easier of the two. Developers use cutscenes – narrative bursts that range in length from a few seconds (Donkey Kong) to a few lifetimes (Metal Gear Solid 4) – to dole out story like dog treats. “Oh, you beat the boss? Good boy! Here’s a brief scene explaining why that was a good idea.” The advent of full motion video in the PlayStation era so whetted Square Enix’s (then Squaresoft) appetite for filmic cutscenes that they just up and made a movie.
Cutscenes are the developer's way of saying, “Stop whatever you’re doing and pay attention.” They’re rarely elegant, and so few of them deliver on the rich fiction the developers dreamed up (exceptions include the stuff coming out of Blizzard and BioWare, who regularly nail it).
But if emergence – the potential for a player to create his own experience within an interactive environment – is gaming’s chief asset, why the cutscenes? Why do we as gamers surrender control to the game? (BioShock addresses this head on.) Why are developers so afraid of surrendering control to players?
Achievement Unlocked: Follow the Breadcrumbs
Your ability to move left when the story is off to the right is a problem for developers. When cutscenes feel too intrusive or otherwise aren’t an option, developers must guide you to the story. Enter what I’ll call breadcrumb narratives.
Breadcrumb narratives are the pieces of lore that developers sprinkle throughout their game worlds to enrich the fiction. Examples include the audio logs in BioShock and Halo 3: ODST, the wall-scribbles in the Half-Life and Left 4 Dead series, and the various collectibles in the Assassin’s Creed games. These are what Gies is talking about with Crysis 2. It is possible to progress in these games while skipping this content entirely.
Developers don’t want that. Most gamers don’t want that either (I know I want to get the most out of any game I spend $60 on), but it’s easy to get caught up in all the shooting and adventuring and forget to stop and smell the story.
So many of these story tidbits become collectibles. These collectibles then get tied to Trophies and Achievements. On Rebel FM, Gies followed this process to its logical end: now that Achievements are involved, these collectibles can’t be easy to find. All of that story then gets hidden away, and finding it becomes a chore.
The Achievement problem ties into the larger attention deficit concerns that plague the Internet generation. Immersing the player is hard when his console’s constantly telling him what his friends are doing, how far he’s progressed in the game, and how long it will be until his Live Arcade download is finished. These notifications can be turned off, I realize – and it’s no different from constantly responding to text messages or Twitter while reading a novel – but it’s a problem developers have to account for.
Obstacles as Obstacles
We, the players, aren’t a developer’s only enemy when it comes to telling a good story. They get in their own way when trying to design challenging obstacles for the player to overcome. Oh wait, we are still the problem.
Videogame failure rarely aligns with interesting narrative failure. The “You died, now try that again” convention doesn’t gel with a story that progresses forward. Simon Jones over at Potential Gamer wrote about this just over a week ago, arguing that repeatable character death is often the most incongruous thing about games:
“This is made all the worse when you’re supposedly playing a hero character, a conceit which rapidly falls to pieces if he keeps getting killed by the enemy, or keeps missing that last jump. The Dark Knight would have been rubbish if, in the Hong Kong sequence, Batman had misjudged his night-time cape flying and landed awkwardly on the street, having to take the lift back up to the top of the skyscraper to try again.”
Adventure games have largely sidestepped the issue, says Jones, by eliminating most forms of character death. Last year, I cited Western RPGs (Fallout, portions of Mass Effect) as games that incorporate some elements of failure into the forward progression of the story. These are exceptions to the rule, of course, but hopefully ones that developers eager to spin a good yarn will look to.
If developers can find better ways to tell stories through gameplay as opposed to in-between or around it, maybe gamers will be better able to tell when a good one’s biting them in the ass.