Thursday, May 20, 2010

"Clatter, mill, clatter on and on, you clatter just for me."

In Franz Kafka's last novel, The Castle, a land-surveyor named K. finds himself in a small village in the middle of nowhere, desperately trying to begin the project for which he was conscripted. He had been called upon to travel here by the authorities in the castle that sits atop the town, who govern with incredible and self-defeating efficiency. The bureaucracy the authorities have established is so minutely managed that it has paralyzed the lives of its citizens and hermetically sealed away its own branches, such that regulation is about the only dynamic force in the entire organism. The villagers are anxious around K. because he is not initiated into the ways of the castle secretaries, a condition that either heroizes or vilifies him to his acquaintances, often in equal measure. Compelled by a sense of self-preservation, K. exhausts the power of every contact he makes and acts on every imaginable impulse, gaining and losing influence in the village at random, all in a futile effort to gain access to the castle and begin his engagement as land-surveyor.
Reading Kafka's unfinished story has made me think a bit about how I engage with the moral codes of current-generation games.
This generation has had its fair share of games that incorporate systems of player-directed character development. The idea that a game's protagonists become more relatable and entertaining if their relationship to the game world can be altered, however slightly, through choices that involve elements of human behavior, made in specific and largely arbitrary scenarios, has taken hold of several genres: RPGs, first-person shooters, adventure games and their many variations. Instilling videogames with the actuality of moral choice is a design concept that carries with it the promise of elevating the platform to new plateaus of artistic legitimacy. Far and wide, though, player choice of this kind is at best a method of propping up the player's attachment to the protagonist through the otherwise linear progress of the game's story, and at worst an utterly transparent shell game.
Take the obvious example, Bioshock. As glorious as its story was in exploring the relationship between the player and the parameters of the game's objectives, its moral code was absurdly reductive. Being good or evil in the game was effectively a toggle, flipped on or off at conspicuously underlined moments. Games like Fable II and Fallout 3 put a more expansive feature set at the player's disposal, but even these allow an easy reversal from either absolute evil or sainthood with a modicum of dedication. Essentially, these games are marketed as games with moral choices, but they do without the aspect of real consequence. If and when the effect of your moral choices precludes or enables your character to do something unique to your array of choices, you can backtrack and, through trial and error, explore the other choices that effect a different outcome. In essence, these games endorse flailing in all directions. If you're immersed in the game's world, the flimsiness of the system doesn't matter as much, because you can transcribe your own thoughts and feelings onto that world in a way that supplants the pretense of the game's moral choices.
In The Castle, the villagers exist within the cushion of the bureaucracy that surrounds them. They think nothing of waiting years for an audience with a secretary or for a message to be delivered. When K. enters their lives, they see it as an opportunity to gain better favor, or to lose whatever status they have been able to construct. They act on these expectations only, complicating K.'s singular desire to gain access to the castle. And it is only in relation to K. that any of the villagers are compelled to act.
My sense is that the reason games like Bioshock involve systems of choice that seem so transparent is that those systems are enacted only by the player. No other presence in the game appears to make choices, save in response to the player. Is it possible to give artificial intelligence the appearance of believable sentience? Perhaps not, and it's arguable that this would simply re-package the same contrivances of morality games. For the same reason, the answer doesn't seem to lie in making the choices more numerous or complex either. Ultimately, perhaps it's worth exploring why we should need to be convinced we are making any moral choices in games at all. If NPCs were given scripts that did not depend on the input of the player, and if they could fulfill objectives that the player has not, would it modify player behavior in a more fulfilling way?

If, in fact, Suda 51 still wants to develop a videogame based on the book, maybe we'll just do away with objectives altogether.