emotion / know / analysis: Walmart’s customer dissatisfaction detection patent

As one of the main goals of emotion / know is to develop AI tools supporting emotional intelligence, I am keeping watch over current efforts in the field, such as Walmart’s recently surfaced U.S. patent on video surveillance of customer dissatisfaction. The patent, granted in March of 2016, makes for interesting reading.

No need to be alarmed

Just as they do when we shop with them online, retailers want to identify us and use analytics to optimize customer service when we're shopping with them in person. This is not the apocalypse, though I agree that, if deployed, this customer dissatisfaction detection system could be “invasive, annoying, and prone to errors.” It's based on a flawed understanding of how emotions work: the assumption that we (and AIs) can detect basic emotions from biomarkers alone. Detecting customer dissatisfaction can be done, but you need to teach the AI far more emotional granularity than the patent filing describes.

What Walmart proposes doing

From the patent abstract:

A video feed of a camera viewing a POS queue is analyzed to identify customers and measure customer biometric data. The biometric data is analyzed and used to generate customer service actions such as alerting a representative or calling in additional staff if the data indicates customer dissatisfaction. The biometric data of a customer may be correlated to transaction data of the customer in order to detect changes of the purchase habits of the customer due to dissatisfaction. Changes in purchase habits, such as loss of a customer, may be used in combination with the biometric data to establish thresholds of biometric data used to generate customer service actions.
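Reduced to a rough sketch, here is the control flow the abstract describes. Every name below is my own, not the patent's, and I've flagged the step the patent never explains:

```python
# A rough sketch of the flow described in the patent abstract.
# All names here are hypothetical; the patent does not specify how any step works.

from dataclasses import dataclass
from typing import Dict, Iterable, Iterator


@dataclass
class BiometricReading:
    customer_id: str
    features: Dict[str, float]  # whatever the camera actually measures -- unspecified


def dissatisfaction_score(reading: BiometricReading) -> float:
    """The crucial, unexplained step: turn biometric data into a dissatisfaction measure."""
    raise NotImplementedError("The patent never says how to do this.")


def update_threshold(threshold: float, score: float, customer_was_lost: bool) -> float:
    """Use changes in purchase habits (e.g., losing a customer) to recalibrate the threshold."""
    return min(threshold, score) if customer_was_lost else threshold


def monitor_queue(readings: Iterable[BiometricReading], threshold: float) -> Iterator[str]:
    """Generate customer service actions when the score crosses the threshold."""
    for reading in readings:
        if dissatisfaction_score(reading) > threshold:
            yield f"Alert a representative or call in staff for customer {reading.customer_id}"
```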

One of the requirements for patenting an invention is that it “must be disclosed in an application in a manner sufficiently clear and complete to enable it to be replicated by a person with an ordinary level of skill in the relevant technical field.” Has this been achieved?

No, because the patent treats the main problem itself as already solved: translating biometric data captured by a video feed into a measure of customer dissatisfaction. That is the AI task you actually have to solve. How do you do it, Walmart?

Here’s Andrew Ng’s rule of thumb for what AI can and can’t do either now or in the near future:

If a typical person can do a mental task with less than one second of thought, we can probably automate it using AI either now or in the near future.

Ask yourself this: if you saw someone standing in a checkout line with biomarkers such as a dejected look on their face and slumped shoulders (neither of which is mentioned in the patent), could you tell which of the following they were feeling:

  • Romantic heartbreak over a recent breakup with his long-term boyfriend?
  • Utter discouragement over a particularly difficult AI project assigned by her manager who didn’t understand what AI was and was not capable of?
  • Customer dissatisfaction because the flavor of toothpaste he wanted wasn’t available?
  • Kill-ALL-the-things fatigue due to being on day four of a Whole30 detox?

No, you could not. This is an emotional intelligence task a human cannot do, no matter how much time they have, and it is not a task an artificial intelligence can be taught to do either.

But this patent applies only to improving customer service, and one of the keys to doing good artificial intelligence is doing it in a constrained domain. Here we're building it only in a retail space. The AI can assume something about the human's state of mind: he is in line at a store, and he has the intent to check out.

So this is NOT an impossible emotional intelligence task. It is doable. It’s just not something this patent filing has made any realistic attempt at addressing!

A human could detect customer dissatisfaction during checkout in the following ways:

  • Ask the customer, “Did you find everything you needed?” and receive the negative answer, “No, my favorite toothpaste was out! I wanted Be Inspired Vanilla Mint Spark!”
  • Note increasing fidgeting among waiting customers if one transaction took a particularly long time (for example if a manager had to be called to void a mistake)
  • Hear a note of frustration in a customer's voice when she tells the bagger, “Please don't put my raw chicken and vegetables in the same bag!” and starts bagging her own groceries to get them packed the way she wants.

An AI could be taught to do these things too, given enough context. I'm just not convinced, given the level of detail in the patent application, that this particular AI can do it. What biomarkers is it detecting, and in what order? What is it actually capturing about what's happening in line? Is it noting how long the person has been standing there? Whether she was initially standing still and then started tapping her toes? Whether her expression started out neutral and then turned sour? Does it check for sighing?

These are the things that need to be described for someone to build a POS queue customer dissatisfaction detection AI. The patent repeatedly mentions capturing biometric data but never once says which biometric data it uses or how that data is translated into a measurement of customer dissatisfaction. It does not get into the nitty-gritty detail and granularity of the emotion of customer-dissatisfaction-in-a-Walmart-POS-queue. That is the granular emotion an AI could plausibly be taught to detect.
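To make that concrete, here is the kind of time-ordered, granular observation record those questions are pointing at. None of these fields appear in the patent; they are exactly the sort of detail a filing would need to spell out:

```python
# A hypothetical observation record built from the questions above.
# These fields do NOT come from the patent -- they illustrate the missing granularity.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class QueueObservation:
    seconds_in_line: float                           # how long has she been standing there?
    toe_tapping_started_at: Optional[float] = None   # still at first, then starts tapping her toes?
    expression_trajectory: List[str] = field(default_factory=list)  # e.g. ["neutral", "neutral", "sour"]
    sigh_count: int = 0                              # does it check for sighing?


def looks_dissatisfied(obs: QueueObservation, max_wait_seconds: float = 300.0) -> bool:
    """A toy heuristic only: a long wait plus visible restlessness or a souring expression."""
    restless = obs.toe_tapping_started_at is not None or obs.sigh_count > 0
    souring = bool(obs.expression_trajectory) and obs.expression_trajectory[-1] == "sour"
    return obs.seconds_in_line > max_wait_seconds and (restless or souring)
```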

The science of emotions

The root problem is that Walmart's patent is built upon a flawed notion of emotion detection. Like almost all work in the emotional AI (EAI) field today, it assumes that you can generically detect “customer dissatisfaction” (i.e., anger) instead of the more granular “customer dissatisfaction in a Walmart/discount-retailer POS queue because you've been standing in line too long.”

The classic theory of emotion assumes that we have “dedicated circuits” in the brain for a few basic emotions. In the 20th century, researcher Paul Ekman identified the six basic emotions to be anger, disgust, fear, happiness, sadness, and surprise. Customer dissatisfaction would presumably be an instance of “anger.” I’m thinking that Walmart’s patent is based on the notion that detecting customer dissatisfaction is merely a job of detecting an instance of anger. So, take a picture of the customer’s face, do emotion detection on it using Microsoft Azure’s emotion detection API, and there you have it. Voila! Customer dissatisfaction detection! Any junior data scientist or even data scientist intern could do that. No need to describe it in the patent filing.
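For what it's worth, here is roughly what that naive approach looks like in code. The endpoint URL and response shape are my assumptions based on how the Emotion API was documented around that time, and the subscription key and image path are placeholders:

```python
# The naive approach: one face image in, one generic "anger" score out.
# Endpoint URL and response shape are assumptions based on the Emotion API circa 2016;
# the subscription key and image path are placeholders.

import requests

EMOTION_ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"  # assumed
SUBSCRIPTION_KEY = "<your-key-here>"  # placeholder


def naive_dissatisfaction_score(image_path: str) -> float:
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = requests.post(
        EMOTION_ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()
    faces = response.json()  # assumed shape: [{"faceRectangle": ..., "scores": {"anger": ..., ...}}]
    if not faces:
        return 0.0
    # The leap this post is questioning: equating "customer dissatisfaction" with the anger score.
    return faces[0]["scores"]["anger"]
```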

However, there’s ample empirical evidence that this theory of emotions is wrong. Instead, consider the theory of constructed emotion:

In every waking moment, your brain uses past experience, organized as concepts, to guide your actions and give your sensations meaning. When the concepts involved are emotion concepts, your brain constructs instances of emotion.

This theory is justified and detailed in Lisa Feldman Barrett's book How Emotions Are Made: The Secret Life of the Brain. And here's an introduction to the theory on the NPR podcast Invisibilia.

Most EAI work today is based upon the old theory of emotions and doesn't take a granular enough view of them. Microsoft Azure's Emotion API claims to detect “anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise” and says these “emotions are understood to be cross-culturally and universally communicated with particular facial expressions.” That's the old (debunked) theory of emotions.

emotion / know takeaways

Two things to take away from this analysis of Walmart’s patent filing:

First, it isn’t crazy to try to detect customer dissatisfaction in a retail POS queue with artificial intelligence and thereby start building emotional intelligence into our checkout lines. Walmart, I salute you for your efforts in this direction! I don’t find it apocalyptically wrong! It seems totally doable and worthwhile to me. Let’s make our checkout lines more emotionally intelligent, with artificial intelligence tools.

My 17-year-old daughter worked as a bagger (a.k.a. “courtesy clerk”) at King Soopers over the summer. She could definitely detect which customers were happy or unhappy with what was going on in the checkout line. But she did it with very granular knowledge of what was happening: she knew whether they were happy or unhappy with the bagging situation, the length of the transaction, or how the checker was treating them. She didn’t do it simply based on how their faces looked! She didn’t simply look for an instance of “anger.” She looked for very granular instances of emotion, like “bagging dissatisfaction related to overly high expectations for how bagging should be accomplished,” or “unhappiness with the checker’s attitude,” or “frustration because the person ahead of them didn’t get their coupons out before the transaction was complete.” We can teach AIs to do this too, but they need the full context of what’s happening in order to detect the granular emotion the customer is feeling. Simply seeing what looks like an angry face is going to lead to errors and invasiveness, not the experience of an emotionally intelligent AI at checkout time.
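If you wanted to train an AI to do what she did, the label set and context would have to look something more like this sketch. The labels below are just the examples from the paragraph above, not a real taxonomy or dataset:

```python
# Hypothetical granular labels and context, drawn from the examples above -- not a real dataset.

from dataclasses import dataclass
from enum import Enum, auto


class CheckoutEmotion(Enum):
    BAGGING_DISSATISFACTION = auto()      # high expectations for how bagging should be accomplished
    CHECKER_ATTITUDE_UNHAPPINESS = auto()
    COUPON_DELAY_FRUSTRATION = auto()     # person ahead didn't get coupons out in time
    CONTENT = auto()


@dataclass
class CheckoutContext:
    facial_expression: str    # all a face-only model sees, e.g. "angry-looking"
    minutes_in_line: float
    bagging_quality: str      # e.g. "chicken bagged with vegetables"
    checker_demeanor: str     # e.g. "friendly", "curt"
    holdup_cause: str         # e.g. "coupon fumbling ahead", "none"

# A face-only classifier gets only facial_expression; the granular labels above are
# only learnable from the full CheckoutContext.
```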

I think of my favorite checkers at King Soopers — Don who addresses me as “young lady” without making it seem smarmy (it makes me happy), or Doris who greets me with graceful good will every time I go to the store. I try to get in their lines every time I can even if it means waiting an extra few minutes to get out of the store. If we can equip checkout lines with some of this charm and attentiveness, let’s do it. But it’s not going to be done with simple face recognition of Ekman’s six basic emotions.

Second, this is a great demonstration of how we must develop artificial intelligence with domain specificity. Horizontal artificial intelligence like Microsoft’s Emotion API is not going to get us very far if what we want is to create AIs that help us do our work more effectively. Simply noting that someone looks angry is not enough.

References

U.S. Patent No. 9,299,084, “Detecting Customer Dissatisfaction Using Biometric Data,” March 29, 2016.