Big corporations and tech companies are betting on emotion recognition, and so is the Chinese state. A lot of it is mumbo jumbo. But it has its uses.
Mention the name Adam Smith, and people immediately think of the "invisible hand" that supposedly allows a great, good whole to emerge from the accumulation of individual egoisms. This idea has shaped our economic order for more than 250 years. Smith uses the metaphor most prominently in "The Wealth of Nations" (1776). Less well known is that the Scottish political economist also thought a lot about emotions.
His "Theory of Ethical Emotions" (1759) postulates that human use moral rules as a means for locating themselves in reality. Humans are constantly observing themselves as they interact and communicate with each other. It’s how we discern when we ourselves and others are suffering or experiencing pleasure, and make the perceived similarities and differences the moral standard for interacting with one another.
Adam Smith would have had no use for "emotion recognition." It is the "next big thing" for all those prediction fanatics who believe nothing in the world should be spared from data-driven forecasting.
And so schoolchildren in Hong Kong sit in front of a screen, learning remotely, while an artificial intelligence constantly analyzes their facial features. The AI aggregates and evaluates micro-movements of their facial muscles, sometimes even minimal changes in skin pores, to determine how happy, motivated and attentive the learners are.
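What such a system does under the hood is, at its core, surprisingly crude. The following sketch is purely illustrative, assuming hypothetical facial action-unit intensities and invented weights; it reproduces no real vendor's model:

```python
# Purely illustrative: a toy "engagement score" computed from facial
# action-unit (AU) intensities, the kind of features such systems extract
# from micro-movements of facial muscles. The AU names loosely follow the
# Facial Action Coding System; the values and weights are invented.

# Hypothetical per-frame AU intensities in [0, 1], e.g. from a webcam feed.
frame = {
    "AU06_cheek_raiser": 0.7,       # commonly read as part of a smile
    "AU12_lip_corner_puller": 0.8,  # likewise
    "AU04_brow_lowerer": 0.1,       # commonly read as frowning
    "AU45_blink": 0.2,              # frequent blinking, read as fatigue
}

# Invented linear weights: "positive" AUs raise the score, others lower it.
WEIGHTS = {
    "AU06_cheek_raiser": 0.4,
    "AU12_lip_corner_puller": 0.4,
    "AU04_brow_lowerer": -0.5,
    "AU45_blink": -0.1,
}

def engagement_score(frame):
    """Weighted sum of AU intensities, clamped to the range [0, 1]."""
    raw = sum(WEIGHTS[au] * intensity for au, intensity in frame.items())
    return max(0.0, min(1.0, raw))

print(f"engagement: {engagement_score(frame):.2f}")  # -> engagement: 0.53
```

Whatever the student actually feels, the output is a single number, and that number is what gets reported.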
Coca-Cola and Intel use marketing service providers specializing in facial recognition technologies to test the emotional response to their advertising. And tech companies like HireVue and Human use facial expressions to analyze, as early as the initial application stage, whether job candidates are conscientious and reliable. Those who think this is mumbo jumbo are right.
Emotion recognition endangers the human core of humanism
Every individual can be identified by his or her facial features. It is also possible to read a lot from them, but unfortunately not necessarily the emotional state a person is in. AI systems are trained on data, and how correct their results are depends on that data.
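A toy example of that dependence, with entirely invented data: if one way of expressing an emotion is barely represented in the training set, the system will be confidently wrong about exactly those faces. The sketch below stands in for a real model with a simple nearest-neighbour lookup:

```python
# Invented data: many examples of one style of expressing joy (a broad
# smile), almost none of another (a faint smile with raised brows).
TRAINING_DATA = [
    # (smile_intensity, brow_raise) -> label
    ((0.90, 0.10), "joy"),
    ((0.80, 0.20), "joy"),
    ((0.85, 0.15), "joy"),
    ((0.10, 0.80), "surprise"),
    ((0.20, 0.90), "surprise"),
    # Reserved, low-intensity joy is simply missing from the data.
]

def nearest_label(sample):
    """1-nearest-neighbour: return the label of the closest training point."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TRAINING_DATA, key=lambda item: dist(item[0], sample))[1]

# A person who expresses joy with a faint smile and raised brows:
print(nearest_label((0.3, 0.7)))  # -> "surprise": confidently wrong
```

The model is not lying; it is faithfully reproducing the gaps in what it was shown.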
How often these technologies confuse people with one another or, for example, fail to detect the faces of Black people in the first place is now abundantly documented. What the systems do recognize, on the other hand, is good looks. When I recently tried out a newer emotion recognition system, it reported "poker face" several times. Over the course of a lifetime, people learn not only to express emotions but also to withhold or disguise them.
There is no single standard for how joy or fear is expressed. It varies from individual to individual, but also across cultures. The theory of six universal emotions (fear, anger, joy, sadness, disgust and surprise) developed by the U.S. psychologist Paul Ekman has long been widely challenged.
Much of emotion recognition, then, is indeed mumbo jumbo. But it's profitable. The market for AI-powered emotion recognition is expected to grow from just under $20 billion to $37 billion over the next five years.
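Taken at face value, that forecast implies annual growth of roughly 13 percent, rounding "just under $20 billion" up to $20 billion for the arithmetic:

```python
# Implied compound annual growth rate from the cited market forecast.
start, end, years = 20.0, 37.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # -> 13.1%
```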
So there's a lot of money at stake. And extensive control. The BBC recently published research showing that China is using algorithmic analysis to monitor the emotions of the Uighur minority in Xinjiang and to further restrict their rights. Suspects are strapped into chairs in police stations, questioned, and filmed to determine whether they are in a "negative" or "fearful" emotional state that supposedly indicates guilt.
This "emotion detection" threatens the human core of humanism. Adam Smith saw it as a given that man takes an interest “in the fortune of others, and renders their happiness necessary to him, though he derives nothing from it except the pleasure of seeing it."
If that is part of our nature, caution is advised in the field of emotion recognition. Emotions allow us to balance in-betweens, explore trade-offs, and endure ambiguity. AI-based emotion recognition turns emotions into analytics and moral understanding into computational principles. It is an example of uncreative destruction.