Emotion analysis technology could lead to discrimination, watchdog warns

“Immature” biometric technology that claims to offer emotional analysis of staff should not be relied upon, the Information Commissioner’s Office (ICO) has said, warning that such technology could discriminate against some people.

The data protection authority’s intervention concerns artificial intelligence-based technology that claims to analyze facial movements and expressions, gait and even gaze as a way of monitoring workers’ health and well-being.

The ICO said that collecting personal data in this way, focusing on subconscious behavioral or emotional responses to try to infer emotions, is far riskier than the more traditional biometric technologies used to verify a person’s identity.

It says the algorithms used in these systems have not been developed sufficiently to detect emotional cues reliably, and may exhibit bias or even discriminate against certain people.

The only sustainable biometric deployments will be fully functional, accountable and backed by science

The regulator urged organizations to assess public risk before using such technology and warned that any firms that do not act responsibly, put vulnerable people at risk or do not meet the ICO’s expectations will be investigated.

“Developments in the market of artificial intelligence biometrics and emotions are immature. They may or may not work yet,” said ICO Deputy Commissioner Stephen Bonner.

“While there are opportunities, the risks are currently greater.

“At the ICO, we are concerned that incorrect data analysis can lead to assumptions and judgments about a person that are inaccurate and lead to discrimination.

“The only sustainable biometric deployments will be fully functional, accountable and backed by science.

“As it stands, we are yet to see emotional AI technology develop in a way that meets data protection requirements, and we have more general questions about proportionality, fairness and transparency in this area.

“The ICO will continue to scrutinize the market, identify stakeholders willing to build or deploy these technologies, and explain the importance of improving data privacy and compliance, while encouraging trust and confidence in how these systems work.”

The ICO has also confirmed that it will publish new guidance on biometric technology next spring to help businesses better understand how and when to use the technology.

https://www.standard.co.uk/business/business-news/emotion-analysis-technology-could-lead-to-discrimination-watchdog-warns-b1035310.html