AI systems that measure joy and stress are 'dubious', according to the regulator


Various AI programs claim to recognize emotions such as joy, anger, or stress in people. However, after conducting research, the Dutch Data Protection Authority (Autoriteit Persoonsgegevens) doubts whether they can actually do so. In practice, these systems carry risks and raise ethical issues.

Cameras in public spaces can read a person's posture or facial expression. Other systems use heart rate or voice to measure emotions.

According to the Dutch Data Protection Authority (AP), these techniques are used by various organizations. “For example, for marketing, customer service, job applications, education, public safety, and healthcare.”

In addition, the market for monitoring stress and preventing burnout is growing, says the regulator. "Emotion recognition can also be used without you realizing it," the report states. For example, emotion recognition in advertisements can be used to estimate and influence someone's buying behavior.

Emotion recognition is not always reliable

The authority investigated, among other things, how customer services, wearables (such as smartwatches), and language models handle emotion recognition. It turns out that it is often unclear how AI systems recognize emotions and whether the outcomes are reliable.

Ruth Ruskamp, AI advisor at the AP, points out several drawbacks to emotion recognition. "Emotions are not universal," she says. "They can differ per culture. And even per person, emotions can be expressed and felt differently."

Measurements based on heart rate are also not always clear-cut. A high heart rate can be an indication of stress, but it can also simply mean that you are exerting yourself, for example because you are moving a lot.

Is this desirable?

The Dutch Data Protection Authority concludes that caution should be exercised with this type of technology. “Emotions are closely related to your human autonomy,” says Aleid Wolfsen, chairman of the AP. “If you want to recognize emotions, it must be done very carefully and based on reliable technology. That is often not the case now.”

Incidentally, the use of AI systems that recognize emotions is already prohibited in some contexts under European law. For example, they may not be used in the workplace or in education.

The AP says that organizations and politicians should discuss further whether the technology is desirable at all. "It is an ethical question whether you as a society find the recognition of emotions with AI acceptable," says Wolfsen. "And if so, in what form and for what purposes."
