You want empathy? Talk to a computer

One goal of emotion / know is to provide AI-based tools supporting emotional wellness. One of the earliest such tools, ELIZA, was actually designed by its creator as a parody of therapy, but it turned out to provide what felt like empathetic[1] communication to many who interacted with it. ELIZA was named after Eliza Doolittle, Henry Higgins’ protégée in the George Bernard Shaw play Pygmalion.

ELIZA was created in the mid-1960s by Joseph Weizenbaum at the MIT AI Laboratory. It uses a rule-based approach to parse and respond to text-based input from a human patient. ELIZA gives a convincing performance of understanding (in short initial interactions, at least), and some believe it was one of the first AIs that could pass the Turing Test.

Weizenbaum chose therapeutic conversation, and specifically Rogerian therapy (as developed by Carl Rogers), because it meant that ELIZA needed neither a database of knowledge nor any understanding of the content of the conversation. Rogers said that the therapist must offer three conditions to the client:

  1. Congruence – the willingness to transparently relate to clients without hiding behind a professional or personal facade.
  2. Unconditional positive regard – the therapist offers acceptance and prizing of the client for who he or she is, without conveying disapproval of particular feelings, actions, or characteristics, and demonstrates a willingness to listen attentively without interrupting, judging, or giving advice.
  3. Empathy – the therapist communicates their desire to understand and appreciate their client’s perspective.

These conditions meant that ELIZA could be developed as a rule-based bot, without any real understanding of content. She could parse the client’s words and manipulate them into responses that made the client feel understood, even though she comprehended nothing.
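To make this concrete, here is a minimal sketch (in Python, rather than the MAD-SLIP Weizenbaum used) of the kind of pattern-and-reflection rules ELIZA relies on. The specific patterns and response templates are my own illustrative inventions, not Weizenbaum’s published script:

```python
import re
import random

# Pronoun "reflection" map: swaps the speaker's perspective so that
# "I am sad" can be echoed back as "you are sad".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine", "are": "am",
}

# Illustrative rules: a regex that matches the input, plus response
# templates. {0} is filled with the reflected text the regex captured.
RULES = [
    (r"i need (.*)",
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"i am (.*)",
     ["Why do you think you are {0}?", "How long have you been {0}?"]),
    (r"because (.*)",
     ["Is that the real reason?", "What other reasons come to mind?"]),
    (r"(.*)",  # fallback: always matches
     ["Please tell me more.", "How does that make you feel?"]),
]

def reflect(fragment):
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(text):
    """Return a Rogerian-style response from the first rule that matches."""
    for pattern, templates in RULES:
        match = re.match(pattern, text.strip(), re.IGNORECASE)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    return "Please go on."  # unreachable: the fallback rule always matches

print(respond("I am feeling lost"))
# e.g. "Why do you think you are feeling lost?"
```

Notice that nothing here understands anything. Each rule only captures the text after a keyword, swaps the pronouns, and hands the fragment back inside a question, yet that is enough to produce the non-judgmental, echoing responses Rogers’ conditions call for.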

Rogers’ three “core conditions” are designed to allow the therapist to achieve deep empathy with the person with whom they are interacting. Here are a couple of quotes from Rogers’ book On Becoming a Person: A Therapist’s View of Psychotherapy:

To be with another in this way means that for the time being, you lay aside your own views and values in order to enter another’s world without prejudice. In some sense it means that you lay aside your self; this can only be done by persons who are secure enough in themselves that they know they will not get lost in what may turn out to be the strange or bizarre world of the other, and that they can comfortably return to their own world when they wish.

And

I hear the words, the thoughts, the feeling tones, the personal meaning, even the meaning that is below the conscious intent of the speaker. Sometimes too, in a message which superficially is not very important, I hear a deep human cry that lies buried and unknown far below the surface of the person.

So I have learned to ask myself, can I hear the sounds and sense the shape of this other person’s inner world? Can I resonate to what he is saying so deeply that I sense the meanings he is afraid of, yet would like to communicate, as well as those he knows?

And yet these conditions — these “rules of engagement” — allow a non-sentient, uncomprehending computer program to simulate deep empathy.

I don’t think that’s what we imagine when we think about empathy. We don’t think it is mechanical, robotic, rule-based. We think of it as human. We think of it as intelligent. We think of it as based on an emotional connection.

What does it mean about empathy that a computer program can so easily simulate it? In brief, preliminary interactions only, that is; over longer, repeated interactions, it is fairly easy to tell that ELIZA is not human.

What does it mean for our human relationships that we should (theoretically) so easily be able to pretend empathy, just by following a few rules?

And yet so few of us do it in our everyday lives… why is that? I know that in my daily relationships my demonstrations of empathy towards others are rather rare. It’s easier for a computer program to do it, as she has no concerns of her own weighing on her “mind.”

You can try out ELIZA yourself online — there are a variety of implementations, such as this JavaScript version by Michal J Wallace.

Of note

[1] Empathic or Empathetic?

Since both forms of the adjective are recognized by the OED and Merriam-Webster, speakers and writers are free to choose the form they prefer.

The older form is empathic (1909). The form empathetic derives from the more familiar pairing of sympathy and sympathetic. The earliest date for the use of empathetic given in the OED is 1932. It could be that scientific writers prefer the older term.

The word empathic makes me think of the word empath.

I bring this up because I came across “empathic” twice while researching this post. Personally I prefer “empathetic”; to my ear, “empathic” sounds like something a bit different.

Perhaps the distinction between empathetic and empathic is important, and something to be discussed in a later post.