Are AI cameras in Seattle’s Magnolia neighborhood the start of a troubling trend?
It was recently revealed that Seattle’s Magnolia neighborhood is one of 10 neighborhoods in the city using cameras powered by artificial intelligence as a security measure. But is this actually a positive step forward for safety, or is it the start of a troubling trend?
First reported by The Seattle Times, Magnolia is among the neighborhoods in Seattle employing the services of a company called Flock Safety. Flock uses a system of cameras that can do everything from simple surveillance to tracking how often a single vehicle enters and exits a neighborhood, all powered by AI technology.
That’s raised concerns over both the necessity of such a thorough security system and its hazy moral implications.
“Taking pictures of cars going into the neighborhood and tracking who’s coming in is a little much for Magnolia. What are they going to track me doing? Not being able to afford an apartment there?” joked MyNorthwest writer Chason Gordon, filling in as KIRO Nights co-host Monday night.
As for those moral implications, AI surveillance has proven problematic throughout its development. That’s been seen firsthand as Amazon has faced criticism for its facial recognition technology — used by police and Immigration and Customs Enforcement — and its shortcomings in accurately identifying women and darker-skinned individuals.
Those issues can be exacerbated when largely white, affluent neighborhoods like Magnolia begin to adopt security technology of their own.
“It tends to be high income places with very low crime rates that early-adopt this sort of high-tech security technology,” KIRO Nights co-host Aaron Mason pointed out.
This isn’t Seattle’s first go-around with a controversy like this, either: the Seattle Police Department purchased a pair of surveillance drones in 2012. Outcry from the public led to the program’s hasty cancellation before the drones ever got off the ground.
Clarification: Flock’s AI technology employed in Seattle does not use facial recognition, does not keep a database of license plates, and deletes all footage after 30 days.
Listen to KIRO Nights with Gee Scott and Aaron Mason weekdays from 7-10 p.m. on KIRO Radio, 97.3 FM. Subscribe to the podcast here.