Scientists create online games to teach people about risks of emotion recognition: Report

The researchers hope that by playing the game, people will be more open to discussing this technology
Last Updated : 07 April 2021, 11:17 IST


Facial recognition technology is available in several places around the world, such as airports, but is at the centre of the debate over possible racial bias and privacy concerns.

Emotion recognition is, at least on paper, a more advanced form of the technology: rather than simply scanning a face against a database to identify a person, it assigns values to a person's facial expression in order to draw a conclusion about their mental state.

The term is relatively new despite the technology's apparently widespread use, so one could be forgiven for not having heard of it at all, yet it is hotly debated among ethicists. Now, a group of social scientists, citizen scientists, and designers has created a game intended to amplify public debate by letting people experience the technology first hand.

The website — called emojify.info — is a platform where people can try out a real-world application of emotion recognition technology through two games: one that explores how emotion recognition systems can fail to understand the context behind an emotion, and another in which a person uses their computer camera to try to trick the technology by making faces.

The researchers hope that by playing the game, people will be more open to discussing this technology.

Talking about the technology, Dr Alexa Hagerty, project lead and researcher at the University of Cambridge, told The Guardian: “It is a form of facial recognition, but it goes farther because rather than just identifying people, it claims to read our emotions, our inner feelings from our faces.”

A video on emojify.info details how emotion recognition can be used in job interviews to assess candidates, at airports to flag security threats, and even in court hearings to gauge a person's guilt.

The video also says that emotion recognition has bias baked into it, much like facial recognition. It cites a study showing that these systems read the faces of black men as angrier than the faces of white men, regardless of their expression.

Hagerty said emotion recognition technology might have some potential benefits, but that these must be weighed against concerns about accuracy and racial bias, as well as whether the technology was even the right tool for a particular job.

“We need to be having a much wider public conversation and deliberation about these technologies,” she told the publication.

“The claim of the people who are developing this technology is that it is reading emotion, but there is lots of really solid science that says that is too simple; it doesn’t work quite like that,” Hagerty told the Guardian, adding that human experience showed it was possible to fake a smile.

“I think we are beginning to realise we are not really ‘users’ of technology, we are citizens in a world being deeply shaped by technology, so we need to have the same kind of democratic, citizen-based input on these technologies as we have on other important things in societies,” she said.

Published 07 April 2021, 09:00 IST
