The challenge of the virtual value code
I often think about the challenge of transmitting values through games. How do you make a player voluntarily act according to the moral code held by the protagonist? Or perhaps the other way around: how can I, the player, make my moral code enforceable through the mechanics available to the protagonist? In other words: if I believe killing is wrong, how would I transmit that belief to the avatar I’m controlling in a first-person game? Should I even try?
Since we are talking about moral codes, creeds and malleable ethics, let’s start with the one at the center of the Assassin’s Creed series: “Nothing is true, everything is permitted”. Taken from the novel Alamut, this creed is useful for the Assassins: it is vague enough to be interpreted in a way that justifies whatever morally ambiguous – and, sometimes, downright hypocritical – action the Assassins may need.
So here is a more useful, less cynical creed: “Everything is permissible for me – but not everything is beneficial.”
This one is from the Bible, 1 Corinthians 6:12. It’s one of those phrases that distinguish civilization from anarchy. It is also (and Objectivists, please don’t abandon this article just yet) purely Kantian.
Essentially, it says that just because we can do something doesn’t mean we should. It’s what sets apart the man who doesn’t kill because he believes it’s wrong from the man who doesn’t kill for fear of being arrested. Though neither man actually kills, the first is inherently more valuable than the second, who needs a system of laws to prevent him from constantly unsettling society.
In other words, there are two major forces that shape our decisions: the values we hold and the law enforced upon us.
In video games, however, mostly the second force applies: the enforced law, a.k.a. the limits of the game. Since, in the game, nothing is true, everything is indeed permitted. If a game allows us to act like assholes, we shall act like assholes – which is great fun, by the way. It does, however, often clash with the avatar’s characterization. That’s when player agency becomes a problem. Rare and praiseworthy are the games that achieve a level of characterization deep enough to influence that agency. In an extreme case, the player actually ceases to act a certain way because the game has truly convinced them that the act is “wrong”.
Let’s be realistic though: expecting that level of “conversion” in every game is just as hopeful as waiting for Godot.
The remaining option is to design games for the second kind of man, the one who requires rules. Not an elegant solution, I know, but we take what we can get.
But it could be more than that. In the future, we could ditch the current, innocuous customization options – determining how high the avatar’s cheekbones should be – in favor of customizing the avatar’s values as well.
Not unlike the role of Dr. Kaufmann in Silent Hill: Shattered Memories, instead of a face-generator tool, games could present us with a psychoanalyst asking probing questions that would shape the avatar’s moral code. If the result is that “the protagonist is strongly against killing of any kind”, the game should then prevent killing from happening: if that character ever comes to hold a gun, pressing the ‘fire’ button would be fruitless, as the character would either flatly refuse to shoot (à la Solid Snake when Gray Fox asks him to fire a missile in Metal Gear Solid) or misfire. If the protagonist is sexually immature, their interactions with the opposite gender would take a turn towards the awkward and brash – and no other dialog options would be shown.
Naturally, the game cannot directly communicate to the player what set of values the character believes in. That must be inferred only from the actions available to the avatar. Otherwise it becomes like an Elder Scrolls game, where the illusion of making a decision is always trumped by the over-exposition of the game’s structure.
It would be interesting to see what gamers would find out about themselves under these restrictions. The implication is that developers would also be forced to assess the values their characters are transmitting. Odd situations such as the one lived by Connor in Assassin’s Creed III would become rarer. In it, we see Connor’s ideals slowly crumble… and he isn’t able to find a suitable replacement in time. There was a plot twist, the situation changed, and yet he carries on the same, only slightly aware. He just carries on with his butchering, willingly ignoring whether or not his actions remain ethical in this new scenario. Perhaps this was one of the reasons he was so reviled by gamers: he had become a monster, but the game never acknowledged or assessed that.
In the Dr. Kaufmann example, the decisions players make at the beginning shape the protagonist’s values and may even come back to bite them in the ass as the game reflects those values by limiting the mechanics available. Maybe then the realization of the monster the avatar ends up becoming would make gamers question themselves.
Or maybe they would reevaluate what a “monster” is.
There are plenty of actions society deems “monstrous” that are filtered out of the gamer’s gaze. More and more we talk about them. Yet they linger. And that’s despite the fact that we already have the worst kind of monster there could be in our games. We have the killers.