King County Council’s ban of facial recognition technology is ‘misguided’
The King County Council recently passed an ordinance banning government use of facial recognition technology. The measure was controversial within the council itself, and it also drew criticism from a group called the Information Technology & Innovation Foundation.
The foundation is a nonprofit, nonpartisan think tank based in Washington, D.C. According to Vice President Daniel Castro, it studies policies around the world focused on innovation and technology.
So what was it that bothered Castro about what the King County Council did?
“Well, what was surprising to me is that they seemed to be basing their decision on no real evidence,” he said. “So this ban was primarily based on the idea that facial recognition technology is inaccurate and highly biased towards people with darker skin and sometimes women.”
“But when you look at the testing that’s come out from the federal government, the best performing facial recognition algorithms now perform not only better than any human, but they exhibit basically no bias,” he added. “And so to ban the technology because of those concerns is really misguided. And the use and procurement of this technology across the entire county is prohibited. So you can’t even use the technology, for example, to unlock a door.”
The outdated study Castro criticized is a report from the ACLU, specifically the ACLU of California.
“They never released their data, they never released their methods, even when outsiders asked to look at it and validate it,” he said. “And that’s been a motivation for many of the laws that we’ve seen around the country.”
“And the other thing that came out was there were a number of studies that looked at facial analysis,” he explained. “So facial recognition is when you match one photo to another photo, facial analysis is when you look at a photo and you try and decide is this person old or young? Are they male or female? And that technology has been shown to have some racial biases, but facial recognition where you’re matching one photo to another, in the most recent tests, the best performing algorithms show no bias.”
Those studies were done in late 2019.
Castro also argues that facial recognition has been shown to be more reliable and less biased than eyewitness identification.
“Right now, we still of course do facial recognition ourselves,” Castro said. “People aren’t very good at recognizing and matching people. That’s actually the cause of a number of misidentifications in the past. And what this technology does is it allows you to do this at a much more accurate level and more reliably than you could do with a human. Also, of course at a much faster speed when time is of the essence.”
Castro says his foundation has asked the ACLU to release its data and methods, but it has not done so.
“The problem is the ACLU has, of course, long-standing opposition to public cameras and surveillance in general,” he said. “So the fact that facial recognition is being used on some of these images is really secondary for their long-standing opposition.”
“To be clear, there are very good reasons why we might not want police to surveil protests. There are legitimate concerns about policing and racial bias in policing,” he added. “None of my critique here is to take away from that. What is so much more important than enacting these types of bans is enacting oversight, and transparency, and accountability in our police departments and law enforcement agencies so that this type of inconclusive police work can’t happen.”
Listen to Seattle’s Morning News weekday mornings from 5 – 9 a.m. on KIRO Radio, 97.3 FM.