Amazon’s facial recognition technology can now identify fear
Amazon’s facial recognition technology has learned a new trick. The service, called Rekognition, can now identify fear, adding another emotion to the list it can already detect: happiness, sadness, anger, surprise, disgust, calm, and confusion. You may have experienced a few of those emotions after hearing about this.
To help figure out just how concerned everyone should be, Shankar Narayan of ACLU Washington joined the Candy, Mike and Todd Show to discuss the ramifications.
“We’ve known for a while that the large companies like Amazon and Microsoft … have been building this kind of sentiment analysis into their facial recognition product, of course,” he said. “These products don’t work particularly well even to identify people, and there have already been studies on the affect analysis part of it that show that they don’t work particularly well either, and in fact are less accurate for certain groups like people of color, for example.”
One ACLU test of the facial recognition technology highlighted its inaccuracy when it falsely matched one in five California lawmakers to criminal mug shots.
“We took publicly available mug shot photographs, jail booking photographs … and we compared them to publicly available photos of members of Congress,” he said. “And as noted, there were no actual matches — just to make clear: None of the legislators were actually people who were in the mug shot database — and yet it wrongly identified, in the case of Congress, 28 of them.”
How the police might incorporate emotions in facial recognition
Along with facial expressions, the technology can identify gender and age range. What concerns Narayan is the possibility of a flawed emotion-detection system being used by police in dangerous situations.
“Imagine a situation in which a police officer has a body camera with facial recognition technology incorporated into it, but also this emotion detection that’s going to tell that person purportedly, ‘Am I angry? Do I have the propensity to be dangerous?'” said Narayan. “And that officer will be making a life or death decision to perhaps use deadly force based on an algorithm that’s likely not accurate for me, and in fact, is really based on some very discredited science that has attracted a lot of critique. That’s a scary scenario.”
Despite those concerns, he says such technology is unlikely to be used by Seattle police any time soon.
“Fortunately, in Seattle, we have a very strong surveillance ordinance where something like this would have to go through public approval by the City Council,” he said. “And in fact, the Seattle Police Department actually dropped their face surveillance product that they previously had been using, I think, in part because it would have gone through that high level of scrutiny and maybe not survived it.”
Listen to The Candy, Mike, and Todd Show weekdays from 3-7 p.m. on KIRO Radio, 97.3 FM. Subscribe to the podcast here.