The Electronic Privacy Information Center (EPIC) has urged the Dutch Data Protection Authority to protect students and employees from the harms of emotion recognition.
The EU AI Act prohibits the placing on the market, putting into service, and use of emotion recognition systems in workplaces and educational institutions, with limited exceptions for certain medical and safety reasons. The Dutch data protection authority, the Autoriteit Persoonsgegevens (AP), has now opened a consultation requesting feedback on how this prohibition should be implemented.
The Washington, DC-based EPIC has urged the AP to define emotion recognition systems broadly and either to allow no exemptions for their use or to construe the medical and safety exemption narrowly. EPIC's recommendation rests on what it describes as the "complete lack of scientific evidence that these systems work" and on its position that they "violate" protections enshrined in the EU Charter of Fundamental Rights and other EU regulations.
EPIC regularly advocates for the protection of civil liberties and privacy rights, with a focus on biometric surveillance. It has previously filed a complaint with the FTC over a job application screening tool that used emotion recognition, advised the United States Department of Education on the harms of emotion recognition, and warned the United States Department of Justice about the invasive nature of emotion recognition technologies.
AI-based emotion recognition systems make predictions about an individual's emotional state from biometric data such as heart rate, skin moisture, voice tone, gestures, or facial expressions. However, the science behind "emotion recognition" can barely be construed as science, for the simple reason that inner emotional states are very hard to measure objectively from a person's external features.
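To make the category of system concrete, the sketch below shows, in heavily simplified form, how such a pipeline typically works: proxy signals are reduced to a feature vector and matched against fixed emotion templates. Every feature name, threshold, and label here is invented for illustration and is not drawn from any real product.

```python
# Hypothetical sketch of an emotion recognition pipeline.
# All features, reference values, and labels are invented for illustration;
# no real product, dataset, or vendor API is referenced.

from dataclasses import dataclass


@dataclass
class BiometricSample:
    heart_rate_bpm: float      # e.g. from a wearable sensor
    skin_conductance: float    # "skin moisture" proxy, arbitrary units
    voice_pitch_hz: float      # fundamental frequency of speech
    smile_intensity: float     # 0.0-1.0 score from a face model


# Invented reference profiles: one fixed feature template per label.
PROFILES = {
    "calm":     (65.0, 2.0, 120.0, 0.2),
    "happy":    (75.0, 3.5, 180.0, 0.8),
    "stressed": (95.0, 8.0, 210.0, 0.1),
}


def classify(sample: BiometricSample) -> str:
    """Nearest-template rule over the proxy features.

    This is the core (and the core weakness) of such systems: the label
    reflects distance to a fixed template, not the person's inner state.
    """
    vec = (sample.heart_rate_bpm, sample.skin_conductance,
           sample.voice_pitch_hz, sample.smile_intensity)
    return min(PROFILES, key=lambda label: sum(
        (a - b) ** 2 for a, b in zip(vec, PROFILES[label])))


# A raised heart rate after climbing a flight of stairs is labelled
# "stressed": context-free proxies conflate exertion with emotion.
print(classify(BiometricSample(96.0, 7.5, 200.0, 0.15)))  # -> "stressed"
```

The sketch illustrates the critique in miniature: the output is determined entirely by how close the measured proxies sit to someone else's templates, so any cause of an elevated signal, whether exercise, illness, or culture-specific expressiveness, is collapsed into an "emotion" label.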
A skilled movie actor, for example, can read as sad, anguished, or extremely happy without genuinely experiencing any of those emotions. Researchers have found that a given facial expression can convey a range of emotional states, and that the way expressions map to emotions varies substantially across cultures, across situations, and even across people within a single situation.
Claims of "objectively" assessing emotions are therefore misleading. Such technologies can also discriminate on the basis of race, gender, and disability. Australian researcher and lawyer Natalie Shard recently explained in a piece for The Conversation why she believes the Australian government should adopt specific regulations covering the use of such technologies.