Chasing Rainbows

The Ethics of Lying to the Player

by Jonathan Ahnert

I try not to lie, ever. I even have a counter on my phone that tells me how long it’s been since I last succumbed to the temptation (5 months, 6 days, 12 hours at the time of writing). When I do lie, it’s typically to improve the quality of someone else’s experience. The lies slip out faster than I can catch them and momentarily summon a mirage in the mind of another. In horror, I watch as they rely on that mirage, make it load-bearing in some way. Ideally I intervene before it causes more than a few bruises. “The truth would hurt them unnecessarily” was a common excuse of mine, but so much of that damage comes from people being sent chasing rainbows and using them as bridges.


Game design is rife with sleight of hand, smoke and mirrors, and lies both overt and subtle. Most multiplayer games on mobile have simple computers control your opponents rather than tackle the complexity of real-time multiplayer. Odds are fudged or misrepresented in strategy games. False choices are strung together in narrative interactions. Not to mention the friction inserted behind the scenes during calculated moments optimized for getting players to purchase the next microtransaction.


Lying to players is sexy. The idea of getting a moral blank check to manipulate others as a “service” to them is the sales pitch. It relies on making the immediate experience of the player paramount over everything, including their future experience (as all lies have an expiration date). The blank check of deception is also found, safely contained, in the play of social deception games. The player discussion around what sorts of lies are acceptable in such games gives us an underlying sense of their ethics: what lies have we ever taken off the table?


Maybe I’ll make a board game with loaded dice, one that has players lying to one another about the nature of the game, as a parody to illustrate these points.


Truth keeping as gatekeeping. There are a number of what I’ll call “summer camp social games” in which the rules are intentionally obscure and counterintuitive. The game is to try and figure out what the rules are. Or, if you already know them, to enjoy the lamentations of those failing to navigate the game successfully. Once everyone knows the rules, the game is over, until next year's summer camp when newcomers are subjected to the same initiation ritual. This may help explain the lack of large-scale player outrage in response to the revelation of the lies of a game like XCOM. Once you know, you’re in the in-group, you’ve made it! Cast your gaze out on the fools still staring at shadows on the cave wall. Enlighten them with your newfound wisdom, whispering “did you know…” to those worthy of joining the inner circle.


The cost of lying is an iceberg. I do not know how to calculate the cost, and I’m likely not the right person to do it. But I feel it, deeply. It is a lack of player trust in the integrity of what games are and a lack of designer trust in what games could be. We allow ourselves to lie both overtly and subversively, then we point to the absurdity of trying to tell the truth when the player is not looking for it, asking for it, or interested in paying for it. “The truth would hurt them unnecessarily,” we might say in that classic justification.


Computers are predisposed to lies of omission. The fact that, by definition, technology outpaces our common understanding of it means that each new experience begins as a black box. Black boxes are not infinitely vague; they become understood over time, and it is precisely this uncertainty that requires trust to navigate. Because that trust is not there, players simply take whatever pops out of the box, knowing that if they looked too hard into it their enjoyment would evaporate. Some players do look, wanting to play the game of figuring out how the magic tricks are being done, and their reward is a sense of superiority and experiential safety built not on trust but on self-taught knowledge (I’ll refrain from pulling in the larger societal death of facts and the “do your own research” mentality for now).


What do you get in this, admittedly idealized, world of bedrock trust between players and designers? You get players looking into the black box of your game. This time not out of suspicion of your trickery, but out of a desire to truly understand its message. They trust that the harder they look, the richer their experience will become.


Deep critique of other media relies on trust as well. Is this food full of cultural history or simply a bunch of things that make my brain happy? Is this writer trying to advance understanding or using bad-faith arguments and fallacies to manipulate what I think of the world? This is the mire we find ourselves in.


Additionally, games contend with the common presumption that they are un-serious. So the ethical cost of their errors is also taken un-seriously. We applaud a new trick without noticing the Machiavellian political schemers and dictatorial propaganda machines echoing those same celebrations one room over.


There are 350 billion dollars of distractions. We love to talk about how big the game industry has gotten. Smuggled into that is the capitalist moral structure in which something's financial power is taken as an acknowledgement of its social support and therefore its ethical purity. As an industry not taken seriously in so many ways, getting to swing our rise to prominence around is satisfying. This colossus must be scaled and slain if we are to pursue truth. We need to carve out a space free of the industry’s influence if we are to see what else is possible. The shadow of this colossus is long, and within it our darker tendencies are protected. It financially rewards each newly discovered manipulation, co-opting the ethics of staying alive. Advertising surfaces games that have mastered these manipulations, allowing them to define the industry (Fortnite’s near-monopoly on the 2023 Game Awards’ interstitials is a small example of this), while drowning so many of those unwilling or unable to perform the necessary illusions.


We will always be able to trick them. We will also always be able to point to the outcomes of said trickery as justification for its supposedly benign benefit. Its costs are secret, a black box. And that is the box we have the responsibility to look into. Look hard.