(Microsoft Game Studios)
US: 26 Oct 2010
I take back what I said last week about Fable III. It is indeed entirely possible to achieve the best ending with no sacrifice to one’s morals, but it came at the expense of something even more valuable: my belief in the system.
Unless you’re a computer programmer, the algorithms of modern video games are far more complex than most players can understand on short exposure. For that reason, mastering a game is only partly a matter of being trained by the system; the other part is the search for the programming’s underlying logic. In fact, in many cases, to remain immersed and fooled by the game’s hidden mechanics, the player has to willfully ignore the very tools needed to become fully proficient with it.
To start with something fundamental, consider enemy respawn. An experienced gamer knows that if he returns to an area after a given amount of time, enemies will reappear. He’ll also observe where the hotspots are, what the enemy party will consist of, and which weapon works best against which class of enemy. But a player who disregards the finer details might be surprised every time, or will keep using the same weapon. It might take until he hears the same lines being repeated before he gets a sense of just how pre-scripted and limited the entire encounter is. Once he does, the mystique is gone, and the latter player starts to resemble the former.
Nothing could be more tired in games discourse than a Matrix analogy, but bear with me, as I’m going to use one anyway. It’s a little like becoming Neo. Once the player starts that transformation—in fact, as soon as the player even becomes aware of the possibility—his relationship to the game irrevocably changes. The mere thought that the player is inside a system that can be bent to accommodate less than obvious solutions begins to lend power to the act of play.
Recognizing the patterns in simplistic AI programming is one example of how a changeover can occur, but there are many that we could list. Reductive binaries, false choices which don’t really bisect the plot, gaps in logic, and so on. There can at times be only a fine line between elegant simplicity and an exploitable, dumbed down system, and until a game is put through its paces, it’s hard to tell on what side it might come down.
Like the Fable III morality scale.
Though it’s far from the game’s only flaw, there is something fundamentally broken about a game in which one can joyously lead a life of crime and exploitation, up to and including the murder of 68 royal guards, only to be immediately absolved the moment 12.2 million in ill-gotten gold is deposited into the nation’s treasury. Did no one notice that I’d swept in, bought up every residential and commercial property in the country, and jacked up the rent? Did no one care about the dead civilians? For god’s sake, what about the orphanage that I turned into a brothel?
The game’s ending confirmed how little my misdeeds weighed upon the result. Because I acted like a good king (generally; the brothel excepted), maintained alliances, and protected lakes, the NPCs praised me for “doing the impossible.” That is, I came out of the final fight without a single civilian casualty, with high approval ratings, and with a budget surplus. And the game was right: it should have been impossible. It was possible only thanks to the limitations of the system, which, having revealed themselves to me, broke all semblance of realism, or even an acceptable line of causality.
And I couldn’t have been more pleased with myself. Building on the knowledge of my first campaign and adapting quickly once I saw the holes in the morality scale, I bent the thing to my will and mastered it in short order. Unlike my first game, in which I was legitimately swept up in the emergent story, this time it was not a story at all. It was simply a process of exploiting gaps by whatever means necessary, including the time I worked off the bounty for those 68 murdered guards by making meat pies for about 45 seconds.
At that point, the game had ceased to be anything except the gleeful abuse of a system that was clearly unprepared for aggressive extremes. The game was no longer a fight for Albion, or an effort to differentiate myself from my sibling, but a battle against what I saw to be an unfair binary, in which I could be a savior or a humanitarian, but not both. So with endless enthusiasm I turned the game in on itself, flaws and all, and beat it. Utterly. But was that worth completely objectifying its components, shattering the illusion of a living world?
And yet, games are frequently about those impossible accomplishments, which by their nature break with the same logic that gives rise to them as problems. Thus, the dichotomy: why design games to be fluid, seamless, and immersive if breaking that immersion is what it takes to be good at them? It isn’t enough to simply fight my way through an onslaught of bodies in God of War; I need to adapt to the delay between command and action, memorize combos and the rock-paper-scissors rules of what works best against what, and look for any give or slipperiness the mechanics will allow. In Assassin’s Creed, I must rely on the terrible attention span of the city’s populace to practice and perfect my skills in the first place. Realism takes flight the first chance it gets.
This is what Espen Aarseth talks about when he calls games anti-narratives. In their fundamental form, the way in which mastery of play supersedes mastery of story means that we, ourselves, are the only real story happening in a game. All else is artifice disguised as narrative, responding to action, but not genuinely reacting to it. It does not have to be this way, but until game stories can be designed to unfold as “things which are,” not a series of direct consequences of limited iterations affixed with value judgments, being Neo is the only way to game. It’s just a matter of when we stop being Mr. Anderson.