A semiotic analysis of SOMA

SOMA’s main theme draws on Derek Parfit’s argument on personal identity (Parfit, 1987), specifically ‘Relation R’, and explores his idea that it is necessary for society to protect an individual, and that individual’s future self, from harm, including self-harm. While Parfit does not explicitly endorse invasive control, the game takes that scenario to an extreme.
It also questions the morality of euthanasia, specifically the weighing of quantity of life against quality of life.

This is a problem you are presented with throughout the game, which escalates the difficulty of the question from “someone is blocking your path and you don’t know they are human” to “would you kill your previous self?”

This first example is set up to make the player question whether or not they have just killed a human. The player enters a room and understands that, to move forward, they must unplug a robot. The robot itself does not respond to the player speaking to it, but instead looks at its own hand, as if it is self-aware. Two plugs power the robot; when the player pulls the first, it responds, with a female voice, “no, don’t…”. The player still cannot progress, leaving the only option to pull the final plug. The robot asks “Why? I was okay. I was happy.” and its lights fade out.

This is a heavy moment for the player, as they are confronted with the consequence of their actions: they have taken an innocent life.

The physical representation of this image is a damaged bipedal robot, with only one leg and one arm, and tubes running throughout its structure. The text of this image is meant to be interpreted as (and represent) a damaged woman, kept alive through invasive life support. The player has essentially stumbled into a room and, without understanding what they were doing, ‘pulled the plug’. As the robot woman does not speak to you before you harm her, she reflects a coma patient, in that there is no way to know for sure how she feels.

In this moment, the player has not euthanised this woman. The player’s choice was not made to end suffering, as there is no indication that she was in pain. They blundered in and committed murder to progress towards their own goals.

The next two encounters no longer require you to kill a being to progress, but instead offer a safer future for doing so; there is no reward for taking the difficult path. These encounters also allow the player to find some information about their choice before they make it. These people are not in a comatose state and respond to the player, with their own desires to live.

The first is the top half of a bipedal robot with a man inside it, who is unaware that he is in a robot body and unable to realise it. You find his human body near his location but are unable to tell him about it. He only asks you to find him help, as he is aware that he is hurt. You cannot kill him; to progress you can either turn off his power, causing him immense pain, or release a dangerous monster into your current location, which you must then sneak around.

This is the second robot person the player speaks directly to, but the first that has any wits about them. He expresses a desire to be saved, asking you to find others, and tells you his name: Carl.

The most apparent option available is to turn off the power in Carl’s room, which causes him immense pain. This is displayed through his screams, as he begs the player to stop, says he can’t take it any more, and cries. The room goes dark and electricity arcs across his body, a red light flashing above him. The player assumes that he will eventually die, and leaves to complete the process of opening the path ahead. This causes the player to pass Carl several times; he is alive and screaming each time, as he does not die in this state. The player realises that the suffering they have put Carl in is unending.

While a player may not understand it at this point, the human personalities inside the robots are ‘brain scans’ of the original humans, and there is a question over whether such a scan could be considered a soul. If so, then Carl’s soul is being forced to endure searing pain, for eternity, through the player’s actions. The player has placed Carl in his own hell. While this hell is a literal one, the player must understand these symbols to realise exactly what they have done.

The next encounter is not with a robot but with a human, named Amy. You walk into a room to find a woman lying against the wall, tubes throughout her body and plugged into a panel to her left, just like the very first robot you encounter. There is also a set of grey, mechanical lungs perched just behind her, breathing by themselves. They can be identified as lungs because they have the same general shape, inflate and deflate, and make a mechanical sound of gas being pumped (much like a bike pump, slightly plasticky, along with breathing noises, like a lighter Darth Vader). Tubes run directly from the artificial lungs into the woman’s chest.

This is where the reality of the player’s situation sets in, where the gravity of their choices should come to the forefront of their mind. There is no ambiguity here, no misinterpretation possible: the choices you make are affecting humans. This is a woman, asking you not to hurt her, kept alive by machines. The player must remove one or both of the plugs keeping her alive; removing the first shows the player that she will die if both are removed, but the power is required for the ‘safety systems’ on the path ahead.

These three encounters ask the player to question their ideology of euthanasia and their idea of quality of life. Clinton R. Sanders covers the cultural construction of euthanasia, stating that in human medical settings the debate over euthanasia generally focuses on whether the sanctity of life should or should not be valued over the quality of life. Giving primacy to the former value leads to a rejection of “mercy killing”, while the quality-of-life position acts as a foundation for medical personnel allowing or actively assisting death in certain circumstances (Sanders, 1995).

This argument over quality of life is set up with the option of leaving Carl in a state of living hell. When the player is presented with the third situation, they must choose between the sanctity of life and the quality of life. Near the end of the game, the player is asked again, except this time the person asks you to end them.

The first thing Amy says to you is “don’t hurt me”, then “it won’t let me die, nothing is allowed to die”.

She is referring to the WAU, an AI system created to operate and maintain the air and life-support systems on the deep-sea research facility you are on. It does not think, and its main directive is to preserve human life. After a cataclysmic event wiped out all life on the surface of the planet, the WAU began to overcompensate by not allowing any of the humans on the station to die: forcing life back into those who had lost it, leaving them as corrupted monstrosities (usually) fused to the floors and walls, or taking brain scans of people and placing them into machines.

These actions are an extreme example of quantity of life over quality of life, and they expose a problem in extending Derek Parfit’s reasoning: that it is morally wrong for one person to harm or interfere with another, and it is incumbent on society to protect individuals from such transgressions; that accepted, it is a short extrapolation to conclude that it is also incumbent on society to protect an individual’s “Future Self” from such transgressions (Parfit, 1987). Instead of society protecting you from choosing to smoke and thereby harming your future self, the WAU protects you from death by not allowing you to die.

It also has no way to comprehend quality of life. Most of the monsters you encounter throughout the game are simply earlier attempts at placing a human consciousness inside a robot or a dead human body. Most of these attempts left the consciousness in an insane or damaged mental state, including dementia (extreme fits of anger, with intense, desperate screams), addiction (desperate begging for your structure gel, which quickly escalates into violence, and desperate searching if the player is not noticed) and Alzheimer’s (talking to people who are no longer there, seemingly stuck in past memories). These conditions are signified, as a text, through linguistic, behavioural and aural methods.

These examples are built to set the player’s frame of mind for two of the hardest decisions in the game: euthanising an active person who wishes for death rather than succumbing to the WAU, and deciding whether or not you should kill your original self. The latter option has you copy your consciousness into another body to progress; when you wake up in it, you are confronted with your previous body. It is unconscious, but will wake up. There is no threat from it, nor is it blocking your path, but it will be stuck in that room forever, with no company. There is no risk or reward for either choice.

Parfit explains that ‘Relation R’, a psychological connectedness including memory, personality, and so on (Parfit, 1987), means a future self connected to you in this way should be treated as if it were yourself; and so the other you that you stand in front of should be treated as yourself as well. Therefore, the choice you are given is to kill yourself, to avoid your future suffering, or to allow yourself to live alone forever.

This game shows the worst-case scenario for quantity versus quality of life, showing us the problem with each extreme: that in some cases we cannot know the needs or desires of a person, or the horrors that forcing someone to stay alive can produce. The main problem with such difficult issues is ignorance. Unless you have experienced this exact situation yourself, you should be unable to make a decision, or even to form an opinion on the subject, yet people do. This game gives the player the opportunity to gain some insight, enough to understand its message: that “it is complicated”. No encounter offers a morally righteous path. For example, if you do not leave Carl in his own hell, he asks you to find him help, to seek out Amy, who is fused to the wall and those lungs. If you leave her alive, she asks you to find others. You cannot save these people, no matter what you do; they have a terrible quality of life and will be stuck there forever. Your only other option is to kill them, but some of them do not want to die.
That is SOMA’s message.


Parfit, D. (1987). Reasons and persons. Oxford: Clarendon Press.

Sanders, C. R. (1995). Killing with kindness: Veterinary euthanasia and the social construction of personhood. Sociological Forum, 10(2), 195–214. http://dx.doi.org/10.1007/bf02095958
