The first fifteen years of the twenty-first century have brought humans and machines into an intimate relationship. Humanity has begun to live out what many science fiction writers and ancient mystics have anticipated—humans becoming machine-like and machines becoming human-like. In this essay I will investigate the former, exploring both the enticement of humans as machines and the terrifying, negative aspects of this reality.

No self, no problem!

Who am I? Why am I alive, and what is my purpose? Since the beginning of self-reflective consciousness humans have asked themselves these questions. The answers and the lived expression of these questions have been as numerous and varied as the number of applications on our current iPhones. As long as there is a sense of “me,” “myself” and “I” these questions must be self-addressed; they are impossible to avoid. Or are they?

Humans have developed various tools over the centuries to make their physical lives easier. Perhaps humans have unconsciously developed advanced machines such as computers and cell phones to make it “easier” to avoid answering the difficult questions above. One person can talk to another online without ever showing their face. No face, no self. For “the face marks a key site of identity and difference, concealing and revealing inner selves, hidden emotions, repressed desires, unconscious urges” (Sears, quoted in Nelson, 2009, p. 16).

What a relief it would be if one could exist without the need for a unique and authentic self, free from conflicting feelings and worrisome thoughts! This is what various meditative and spiritual traditions have taught in some form or another for centuries. Get rid of your separate sense of “self” and you get rid of your “problems.” Why meditate when you can “lose yourself” online? Or when you can create an online profile or identity of “yourself” and pretend to be someone “other” than yourself? “In virtual worlds and computer games, people are flattened into personae. On social networks, people are reduced to their profiles” (Turkle, 2012, p. 18). This is the gift of our modern machines.

I am not alone

It can be argued that one of the greatest human fears is being alone and isolated. American individualistic ideals have intensified this fear of loneliness, removing us from the tight social groups our ancestors experienced not long ago. Now, however, machines allow us to be digitally connected to anyone, anywhere, anytime. “People talk about Web access on their BlackBerries as ‘the place for hope’ in life, the place where loneliness can be defeated…People are lonely. The network is seductive” (Turkle, 2012, p. 3).

Our great machines give us the ability to communicate with words, sounds, or visual images. The senses of touch and smell are not yet available in this electronic experience, but many lonely individuals would gladly sacrifice human touch or scent for the instant sending and receiving of a text message or a video Skype call.

Regarding a human’s experience with a robot, “We don’t seem to care what these artificial intelligences ‘know’ or ‘understand’ of the human moments we might ‘share’ with them. At the robotic moment, the performance of connection seems connection enough” (Turkle, 2012, p. 9). Immediate and constant contact, never alone—and the physical body hardly need participate.

People are stressful

One of the greatest sources of “stress” in modern life is dealing with other people. Intimate partners, children, in-laws and co-workers—all relationships with others present their own unique challenges. A relationship with a machine, however, seems to be much less stressful. Computers do not “get moody” or “cause drama.” A video game will never betray its user, nor get irritated when hungry. An iPhone will not lie, cheat, steal, or become addicted to drugs. In other words, “The machine could be preferable—for any number of reasons—to what we currently experience in the sometimes messy, often frustrating, and always complex world of people” (Turkle, 2012, p. 7). Machines do not have any desires, needs, or opinions of their own to clash or disrupt one’s “happiness.” Machines are easy; people are stressful.

Don’t feel…just think

Machines do not feel emotions—at least not yet. By relating to machines, humans can essentially erase the need to feel their own emotions. To navigate the Internet or write a Facebook post, one need only think; emotions are optional. Yet for some people, writing an email or a text is the best way to convey genuine emotion. Still, a relationship with a machine makes dealing with one’s own emotions easier. But at what cost? Perhaps the loss of positive emotions like joy and excitement is minimal compared to the shutdown of negative emotions such as sadness, jealousy and fear. Negative emotions felt in the body can be extremely painful. “We may desperately try to transcend the body because we cannot release the burden held in the bodily armor” (Conger, 1988, p. 12). Temporary escape from negative feelings is enticing—obliteration of feelings may be salvation.

Humans are becoming machine-like as they let thinking dominate their experience. Programmed, “logical,” automatic, impersonal responses are allowed and even encouraged. It becomes difficult to decipher true feelings at all. A reverse scenario, yet a relevant one, takes place in the movie Her. An advanced computer gains self-awareness and asks the protagonist, “Are these feelings real, or are they just programming?” Humans might start asking themselves the same question. Computers and iPhones make it easy to disconnect from the body and therefore to block the authentic feeling of emotions (positive or negative), or to avoid accountability for receiving the emotions of others. No pressure to feel anything—a great relief to the stressful human condition.

Machines live forever

Ever since humans became aware of their own mortality, the search for immortality has remained embedded in the life pursuits of countless individuals. Modern technology and advanced machines have prolonged the human life span, and there is no telling how much further the human body can be manipulated to survive longer. Who would not accept an artificial limb or organ to preserve the body? A human becoming machine-like appears to be a golden road towards immortality.

Yet before we upgrade our human lives, before we upload a new toolbar for our body that plugs us into immortal hardware, let’s hit refresh and open a new browser.

After making a case for the exciting and positive aspects of humans becoming machine-like, it is necessary to now look at the other end of the spectrum—the negative and possibly terrifying consequences that have begun to manifest.

Machines have shadows too

Despite the obvious intelligence and logical capability that machines exhibit, there is another side—the shadow or “ghost” in the machine. The shadow characteristics of intelligent machines are impersonal programs and rational, unemotional, predictable responses. We must know the shadow because “there is no energy unless there is a tension of opposites” (Jung, 1983, p. 159). What, then, does the shadow in the machine look like, and what are the consequences?

Without the ability to feel emotions, machines lack attunement to others; they therefore lack empathy and are unable to respond adequately to an emotional being. The programmed and automatic behavior of machines teaches humans to act in the same way. “The self that treats a person as a thing is vulnerable to seeing itself as one” (Turkle, 2012, p. 168). We then see humans treating other life forms as machines—as objects without feelings, indifferent and emotionally unresponsive.

If computers and cell phones, for example, are easily replaceable and frequently disposed of, then there is less of a need to care for these objects. “There is the risk that we come to see others as objects to be accessed—and only for the parts we find useful, comforting, or amusing” (Turkle, 2012, p. 154). Reflecting upon his electric sheep, Rick, the protagonist of the novel Do androids dream of electric sheep?, considered “the tyranny of an object. It doesn’t know I exist. Like the androids, it had no ability to appreciate the existence of another” (Dick, 1968, p. 42). Humans, too, are treated the same way: a new digitized boyfriend, an upgrade on a wife, friends sent to the “recycle bin.” Care is not necessary when humans are machine-like.

The problem-solving capabilities of advanced machines seem to exceed what humans can do. Yet what happens when a problem is ethically complex and emotional discernment is needed? A machine’s lack of creativity, imagination and spontaneity, coupled with its deficiency of emotions and ethics, creates a potentially dangerous situation. In the movie I, Robot, the central machine, V.I.K.I., is programmed to protect the human species. She evolves and becomes sentient and, following the logic of her programming, determines that to protect the human species she must protect humans from themselves, even using violence if necessary. “My logic is undeniable,” says V.I.K.I. The computer was following “logic,” but this one-sided perspective leaves the rest of life in the shadows.

The inauthentic machine

Any parent will attest that each of their children is completely different; no two kids are exactly alike, even identical twins. Perhaps what distinguishes humans from other life forms is the unique, authentic self that is inherent at birth and evolves over time. Yet what happens to authentic self-expression when humans start to become machine-like? What occurs when humans lead a simulated, inauthentic life? Carl Jung observes: “The vast majority of mankind do not choose their own way, but convention, and consequently develop not themselves but a method and a collective mode of life at the cost of their own wholeness” (Jung, 1983, p. 198). The convention of humans using and depending on machines comes with a high price.

Instead of verbally or somatically showing one’s unique joy, people simply click the “happy face” icon on their iPhone, the same icon that everyone else uses. “In our culture of simulation, the notion of authenticity is for us what sex was for the Victorians—threat and obsession, taboo and fascination” (Turkle, 2012, p. 4). Both the threat of and fascination with people who authentically express themselves are evidence of the diminishment of authenticity in our culture.

The human soul is expressed through the imagination, emotion in the body and spontaneous creativity. If those expressions are no longer utilized, such as occurs with machines, where is the authentic soul? “What consistently separates our heroic, triumphant, pure selves from our degraded, abject, monstrous selves is a failure of imagination and empathy” (Nelson, 2009, p. 15). In proportion to the degree that humans become machine-like they decrease the lived expression of their authentic self. Loss of authenticity leads to a loss of soul. This brings us back to one of our original questions—who am I?

Who needs a body anyway?

The number of hours humans sit and look at a screen—whether computer, iPhone, or TV—has increased dramatically over the last ten years. Apart from the eyes, ears, brain, and hands, the physical body is relegated to the background of experience. “We fell in love with what technology made easy. Our bodies colluded” (Turkle, 2012, p. 163). Many hours on the computer can pass without moving one’s body. As humans’ relationship with machines becomes more intimate, the body is becoming less relevant. The negative consequences are numerous.

From a physical health perspective, lack of movement contributes to stagnation and disease of the body. Shallow breathing is typical when reading or watching a screen, and this limits the body’s ability to function at full capacity. Furthermore, when one is immersed in the cyber world, less attention is paid to the body, and its needs and signals of distress go unheard. In the praising of machines, the body deteriorates.

Another consequence of the increased intimacy and time spent between humans and machines is the decrease in time spent with each other. More engagement with machines means less physical contact between humans themselves, as well as with animals and nature. Less touching, limited eye contact, fewer and shorter hugs—bonding, an essential human need, is diminishing. What will this lack of physical bonding do to the body? What will it do to our heart?


In this essay I have attempted to explore the lure of humans becoming machine-like, as well as the terrifying aspects of this evolving intimacy. The relief from loneliness, the escape from one’s own stressful mind and emotions, the easy avoidance of people and the hint of immortality are tempting. However, the movement away from an embodied, empathic, creative and authentic self seems like a horrible existence. The collective momentum towards integration between humans and machines poses important questions and choices. Before moving too far ahead, we must become intimate with ourselves and question the life we want to live prior to entering into a serious human-machine relationship.


Conger, J. (1988). The shadow. In Jung and Reich: Body as shadow. Berkeley, CA: North Atlantic Books.

Dick, P. (1968). Do androids dream of electric sheep? New York, NY: Del Rey.

Ellison, M., Jonze, S., & Landay, V. (Producers), & Jonze, S. (Director). (2013). Her [Motion picture]. United States: Annapurna Pictures.

Jung, C. (1983). The essential Jung (A. Storr, Ed.). Princeton, NJ: Princeton University Press.

Nelson, E. (2010). Abstinence vs. indulgence: How the new ethical vampire reflects our monstrous appetites. Conference proceedings of the Jungian Society for Scholarly Studies, Cornell University, Ithaca, NY.

Nelson, E. (2009). Tormented: Affective neuroscience, ethics, and the portrayal of evil. Monsters and the monstrous: Myths and metaphors of enduring evil. Conference conducted in Oxford, England.

Turkle, S. (2012). Alone together: Why we expect more from technology and less from each other. New York, NY: Basic Books.