App review: Wysa, an “AI life coach”

Wysa is a free therapybot that uses a variety of techniques such as cognitive-behavioral therapy, dialectical behavior therapy, meditation, and motivational interviewing. Wysa also includes personalized coaching with a human, for a subscription fee.

The good

  • Wysa’s bot kept track of how I was feeling and the things I reported happening, and brought them up again later
  • Wysa’s mistakes were occasionally charming rather than off-putting
  • Wysa doesn’t make you create an account when you start using it, which is nice. Wysa’s developer says “Wysa is and will always stay anonymous.”

The not-so-good

  • Wysa seems too intent on using CBT techniques for addressing negative thoughts, even when they’re not appropriate
  • Guided visualizations and meditations seemed like more of a distraction than a help
  • Wysa sometimes got my emotional state wrong and didn’t check with me to ensure it understood

Wysa’s approach is muddled: Rogerian or CBT?

Wysa’s developers want to build an algorithmic coach that is emotionally intelligent. They say they learned from their experiences working with users that “people don’t want their problems ‘fixed’. Mostly, they just want to talk through them, with someone who doesn’t judge.”

This sounds very much like the Rogerian person-centered psychotherapy upon which the original therapybot, Eliza, was based. In this approach, the therapist uses an empathetic and non-directive style to encourage the client to solve their own problems and find internal motivation, thereby actualizing themselves. 

Wysa did make an attempt to echo back what I told it, as though it were an empathetic listener. When I said that a particular situation made me “feel like a loser” and that I wasn’t being as productive as I’d like, Wysa echoed this back to me in a charmingly inept way: “You mentioned that you were feeling loser and ‘pretty well, not as productive as I’d like’.” Then it (she? he? they?) said it had some tools for me based on this.

Yes, I was feeling loser! That sort of mistake seems fine. No “uncanny valley” here.
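For the curious, here’s a minimal sketch of how a naive, Eliza-style slot-filling reflection could produce exactly this kind of output. This is purely my speculation about the failure mode; the template, regex, and function names are hypothetical, not anything from Wysa:

```python
import re

# Toy sketch of template-based reflection -- my guess at the kind of
# slot-filling that could produce "feeling loser", not Wysa's actual code.
# The bot lifts a word out of "feel like a ___" and drops it straight into
# a "feeling ___" template: fine for adjectives ("sad"), comical for nouns.

TEMPLATE = "You mentioned that you were feeling {feeling} and '{detail}'."

def reflect(message: str) -> str:
    # Capture the word following "feel"/"feeling", skipping an optional "like a".
    match = re.search(r"feel(?:ing)?(?: like a)? (\w+)", message, re.IGNORECASE)
    feeling = match.group(1) if match else "something"
    # Treat everything after the first comma as the supporting detail, verbatim.
    detail = message.split(",", 1)[1].strip() if "," in message else ""
    return TEMPLATE.format(feeling=feeling, detail=detail)

print(reflect("This makes me feel like a loser, pretty well, not as productive as I'd like"))
# -> You mentioned that you were feeling loser and 'pretty well, not as productive as I'd like'.
```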

However, Wysa seemed too focused on directing me into CBT-esque thought disputation techniques. One time, I chatted with Wysa about an interpersonal problem that was weighing on my mind. The issue wasn’t any dysfunctional thoughts; rather, I needed to resolve the situation in a way that honored my wishes and didn’t harm a friendship. Dialectical Behavior Therapy includes interpersonal effectiveness skills, so this would have been an opportune time to introduce them. But Wysa didn’t detect that I was dealing with an interpersonal problem and didn’t offer any help. Instead, it started walking me through identifying problems with my thoughts.

Sometimes Wysa doesn’t even get basic reflective listening right, as happened this morning. It asked me how I was feeling (using a pretty cute emoji-style interface), and I indicated I was feeling pretty happy.

Sidenote: it asked me why I was feeling happy, and I said I had coffee. Then it advised me to control my caffeine intake. A good Rogerian therapist wouldn’t toss off such a recommendation so blithely.

Wysa asked what I was doing, and I said I was writing a review of this app. For some reason, it then concluded I was in a bad mood, and offered me tools to deal with this.

In this interaction, Wysa failed to display emotional intelligence. First, it misunderstood me and didn’t check that its evaluation of my mood was correct. Second, it did exactly what its developers said it wouldn’t: went immediately to trying to “fix” my bad mood (which didn’t even exist).

It seems that Wysa’s purpose is a little muddled. Do its developers want it to take a person-centered, empathetic Rogerian approach? Or are they building a CBT bot? These two foundations for a therapybot may be opposed.

Does Wysa really have techniques from DBT and motivational interviewing built in, as its developers claim? I didn’t see any evidence of either.

Lack of focus on the therapybot

I’m not a big fan of the recorded visualizations and meditations Wysa offered me, though I’m not opposed to this sort of thing in general. I meditate regularly, and I even use EFT (aka “tapping”) on occasion when I’m feeling bad and want to do some DIY exposure therapy.

The confidence visualization that Wysa suggested was apparently based on neurolinguistic programming, with which I am not very familiar. It guided me to think of a time I felt confident, then imagine how my confidence looked and felt, for example like a “warm golden circle in your chest.” The visualization included imagining that I could change its shape, color, temperature, and movement. The technique may be useful, but I wasn’t in any state of mind to benefit from it without some background on the approach. The meditation Wysa offered me on another occasion likewise didn’t align with my preferences for such things. A sometime Buddhist, I like Tara Brach’s guided meditations.

Perhaps Wysa could focus on its therapybot rather than directing users to recorded visualizations and meditations. There are lots of good resources for those sorts of things and other apps like Calm and Headspace probably do it better. Wysa has a good start on an empathetic bot, and this is not a space that is well-addressed yet.

Coaching with a human

Wysa offers coaching with human coaches through a premium service, which I didn’t evaluate. It looks like it costs $14.99 for the first month and $29.99 a month after that.

Conclusion

Wysa’s always-anonymous promise and its focus on developing a system that actually has emotional intelligence make me excited for what its developers will accomplish. The Wysa bot does show the beginnings of empathy. At this point, however, the recorded meditations and visualizations seem like a distraction, and the bot’s enthusiasm for CBT-style cognitive distortion disputation techniques can be off-putting in situations where they’re not appropriate.