ACLU Objects to Amazon’s Facial Recognition Outreach to Police
Facial recognition software is not new, but its use in real time by police departments has raised some eyebrows. Add to those sentiments the fact that Amazon is behind this latest marriage of big tech and government agencies, and privacy watchdogs are concerned, to say the least.
NPR:
Until now, American police have used facial recognition primarily to compare still photos from crime scenes with mug shots. But now Amazon and Orlando are taking it further, by using facial recognition to spot people in real time.
“City of Orlando is a launch partner of ours,” Amazon’s Ranju Das recently told a developer conference in Seoul, South Korea. “They have cameras all over the city. The authorized cameras are then streaming the data … we are a subscriber to the stream, we analyze the video in real time, search against the collection of faces they have.”

In this video presentation, Das is seen saying the system can be set up to notify the city if cameras see a “person of interest,” and it could be used to reconstruct a person’s past movements. He showed the conference a demo of real-time facial recognition using video from a “traffic cam that was provided by the city of Orlando.”
In a written statement, the Orlando Police Department called the Amazon facial recognition system a “pilot program” and said it “will be used in accordance with current and applicable law.”
The statement also says the department “is not using the technology in an investigative capacity or in any public spaces at this time.”

It did not say whether the system has been used that way in the past, or will be in the future. NPR tried to follow up, but OPD said it wasn’t doing interviews on the topic.
Amazon also wouldn’t do an interview with NPR. In a written statement, it pointed out that its visual analytics tools have a wide range of applications beyond policing, and that “[o]ur quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes?”
Amazon’s statement added, “[W]e require our customers to comply with the law and be responsible when using Amazon Rekognition.”
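For the technically curious, the real-time pipeline Das describes, cameras feeding a video stream that Rekognition searches against a face collection, maps roughly onto the publicly documented Rekognition Video stream-processor API. The sketch below is purely illustrative: the collection ID, stream ARNs, role ARN, and processor name are placeholders I made up, not anything Orlando or Amazon has disclosed.

```python
# Hypothetical sketch of a real-time face-search pipeline with Amazon Rekognition Video.
# All names and ARNs below are invented placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# A "collection" holds the face vectors that live video is searched against.
rekognition.create_collection(CollectionId="persons-of-interest")

# The stream processor reads frames from a Kinesis Video Stream and writes
# face-match events to a Kinesis Data Stream for downstream alerting.
rekognition.create_stream_processor(
    Name="city-camera-face-search",
    Input={
        "KinesisVideoStream": {
            "Arn": "arn:aws:kinesisvideo:us-east-1:123456789012:stream/traffic-cam/1234567890123"
        }
    },
    Output={
        "KinesisDataStream": {
            "Arn": "arn:aws:kinesis:us-east-1:123456789012:stream/face-matches"
        }
    },
    Settings={
        "FaceSearch": {
            "CollectionId": "persons-of-interest",
            "FaceMatchThreshold": 85.0,  # minimum similarity score to report a match
        }
    },
    RoleArn="arn:aws:iam::123456789012:role/RekognitionStreamRole",
)

# Start analyzing the live feed; matches then appear on the output data stream.
rekognition.start_stream_processor(Name="city-camera-face-search")
```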
The American Civil Liberties Union has asked Amazon to stop marketing the technology, so far without a response.
The American Civil Liberties Union and other privacy activists are asking Amazon to stop marketing a powerful facial recognition tool to police, saying law enforcement agencies could use the technology to “easily build a system to automate the identification and tracking of anyone.”
The tool, called Rekognition, is already being used by at least one agency — the Washington County Sheriff’s Office in Oregon — to check photographs of unidentified suspects against a database of mug shots from the county jail, which is a common use of such technology around the country.
But privacy advocates have been concerned about expanding the use of facial recognition to body cameras worn by officers or safety and traffic cameras that monitor public areas, allowing police to identify and track people in real time.
The tech giant’s entry into the market could vastly accelerate such developments, the privacy advocates fear, with potentially dire consequences for minorities who are already arrested at disproportionate rates, immigrants who may be in the country illegally or political protesters.

“People should be free to walk down the street without being watched by the government,” the groups wrote in a letter to Amazon on Tuesday. “Facial recognition in American communities threatens this freedom.”
Amazon released Rekognition in late 2016, and the sheriff’s office in Washington County, west of Portland, became one of its first law enforcement agency customers. A year later, deputies were using it about 20 times per day — for example, to identify burglary suspects in store surveillance footage. Last month, the agency adopted policies governing its use, noting that officers in the field can use real-time face recognition to identify suspects who are unwilling or unable to provide their own ID, or if someone’s life is in danger.
“We are not mass-collecting. We are not putting a camera out on a street corner,” said Deputy Jeff Talbot, a spokesman for the sheriff’s office. “We want our local community to be aware of what we’re doing, how we’re using it to solve crimes — what it is and, just as importantly, what it is not.”
It cost the sheriff’s office just $400 to load 305,000 booking photos into the system and $6 per month in fees to continue the service, according to an email obtained by the ACLU under a public records request.
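The Washington County workflow described above corresponds to two documented Rekognition calls: IndexFaces, to load booking photos into a collection, and SearchFacesByImage, to run a probe photo against it. Here is a rough sketch of what that looks like; the bucket, object keys, collection ID, and booking number are invented for illustration and are not the county’s actual setup.

```python
# Illustrative sketch: index a booking photo, then search the collection with a probe image.
# Bucket, key, and collection names are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

COLLECTION_ID = "county-booking-photos"
rekognition.create_collection(CollectionId=COLLECTION_ID)

# Index one booking photo stored in S3; ExternalImageId ties the stored face
# vector back to a record (here, a fictional booking number).
rekognition.index_faces(
    CollectionId=COLLECTION_ID,
    Image={"S3Object": {"Bucket": "example-mugshot-bucket", "Name": "bookings/2017-000123.jpg"}},
    ExternalImageId="booking-2017-000123",
    MaxFaces=1,
)

# Search the collection with a probe image (e.g., a still from store surveillance footage).
response = rekognition.search_faces_by_image(
    CollectionId=COLLECTION_ID,
    Image={"S3Object": {"Bucket": "example-mugshot-bucket", "Name": "probes/store-cam-still.jpg"}},
    FaceMatchThreshold=80.0,
    MaxFaces=5,
)

# Print candidate matches with their similarity scores.
for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])
```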
But police departments are defending their trials of the technology and downplaying what it actually does.
In a statement, a spokesman for the Orlando Police Department, Sgt. Eduardo Bernal, said the city was not using Amazon’s technology to track the location of elected officials in its jurisdiction, nor did it have plans to. He said the department was testing Amazon’s service now, but was not using it in investigations or public spaces.
“We are always looking for new solutions to further our ability to keep the residents and visitors of Orlando safe,” he said.
Early last year, the company began courting the Washington County Sheriff’s Office outside of Portland, Ore., eager to promote how the office was using Amazon’s face recognition service, emails obtained by the A.C.L.U. show. Chris Adzima, a systems analyst in the office, told Amazon officials that he fed about 300,000 images from the county’s mug shot database into Amazon’s system.
Within a week of going live, the system was used to identify and arrest a suspect who stole more than $5,000 from local stores, he said, adding there were no leads before the system identified him. The technology was also cheap, costing just a few dollars a month after a setup fee of around $400.
Mr. Adzima ended up writing a blog post for Amazon about how the sheriff’s office was using Rekognition. He spoke at one of the company’s technical conferences, and local media began reporting on their efforts. After the attention, other law enforcement agencies in Oregon, Arizona and California began to reach out to Washington County to learn more about how it was using Amazon’s system, emails show.
In February of last year, before the publicity wave, Mr. Adzima told an Amazon representative in an email that the county’s lawyer was worried the public might believe “that we are constantly checking faces from everything, kind of a Big Brother vibe.”
“They are concerned that A.C.L.U. might consider this the government getting in bed with big data,” Mr. Adzima said in an email. He did not respond to a request for comment for this article.
Deputy Jeff Talbot, a spokesman for the Washington County Sheriff’s Office, said Amazon’s facial recognition system was not being used for mass surveillance by the office. The office has a policy to only use the technology to identify a suspect in a criminal investigation, he said, and has no plans to use it with footage from body cameras or real-time surveillance systems.
What say you?
Imagine if we could use this technology to build a facial recognition database of all law enforcement officers so as to monitor and track them all in real time, effectively making undercover work impossible.
We often forget that once a tool is developed and released into the wild, anyone can put it to use.
Fine point.
Internet. Facebook. Other social media. Facial Recognition Technology. Google Glass. Bodycams.
My expectation is within 10 years the police will ALWAYS know everyone they’re dealing with, and within 20 everyone will.
And yes, that means “no undercover” for any cop who has ever worn a uniform.
It will be fine. Abuses mostly happen in the shadows, and we’re getting rid of the shadows.
I have to admit this doesn’t bother me all that much. If the cameras are only placed in public places viewing public places, I have no expectation of privacy there, and don’t really object to either people or machines recognizing me.
I want to agree with this but the recent revelations about social media give me pause.
I’ve actually argued that social media posts aren’t much different than spouting off your opinions at a pub, i.e., everyone knows your name, occupation, history, and political leanings so what is the problem?
I was sort of taken aback at the revelation of how big data and algorithms take all this to a new and unfamiliar level.
I wasn’t prepared for how the damage was done, not by government agents learning how I voted, but by the weaponization of our fears and hatreds.
No one expected this, not the ACLU or privacy advocates. They were like generals fighting the last war, manning the Maginot Line of digital privacy, unprepared for the blitzkrieg of social interaction which was guided and exploited by the harvesting of seemingly innocuous data.
So while I am not worried that the LAPD will see me at Starbucks, the harvesting of yet one more piece of data about Chip Daniels, added to my purchases, blog comments at Ordinary Times, web searches, social connections, and the DNA of my relatives gives power over my life to a degree that makes me unsettled.
Because the one big takeaway from our digital life should be that data itself is power. Even the stupid, trivial bits of data can be turned into weapons.
This is very good insight and thought. I think the analogy you draw to the Maginot Line is a really good one. The technology changed so fast most people just had no concept of what the fight was, let alone how to fight it.
I have to admit that the scale and consistency of monitoring matter.
Team Chip.
If you haven’t watched Person of Interest, you seriously should check it out.
Bob Arctor laughs at this problem.
He’s kind of a dick.
That’s it, we’re going off the grid.
I agree with @vikram-bath here. The existence and use of a technology, per se, is rarely if ever the issue. It’s precisely how it’s used that can be troubling. We’ve always had a superb face-recognition system at our disposal–the human eye and brain. So if it’s okay for a human LEO to scan a public space for a criminal suspect by eye, I fail to see how it’s some horrible imposition on privacy to do the same thing using technology.
It’s my belief, which I trust is supported by law and custom, that privacy is NOT equivalent to anonymity. The general right to privacy isn’t about who you are but about what you’re up to. So a FR system that scans public areas and alerts on a legitimate subject of interest is fine. What’s not fine is using such a system to build up a database of the movements and activities of the general populace. It would be no less objectionable if LE were to do so using notes generated by hand; it’s just that the tech makes such a system more feasible.
The existence and use of a technology, per se, is rarely if ever the issue. It’s precisely how it’s used that can be troubling.
I assume that someone like Donald Trump will have it at his disposal when I measure the whole “precisely how it’s used” thing.
It’s like everything else the government and law enforcement are doing with technology. They’re building the architecture of a police state. Building the architecture and actually instituting one are two separate things, but it seems like the proportion of capability to safeguard is hugely lopsided.
Depends on what privacy standard we’re using. 20 years ago? Hell yes. Current? Maybe not.
Everyone in a certain (increasing) age/income range carries a video recorder and computer and puts an absurd amount of “personal” data online. All cops and encounters with cops either are or will be monitored.
Similarly society’s standards for all sorts of other things are changing. BLM’s critics are presumably correct in claiming the vast bulk of police interactions are fine (and we’ve gotten better) and they’re focusing on a tiny sliver of corner cases, but the ability to video record everything means we can focus on those. Similarly I doubt we could get away with deliberately setting hundreds of thousands of civilians on fire today because social media would make what that means vivid on every computer screen.
Since this is all technology driven, in theory the fact that the gov will know far more about us will be more than balanced by us knowing far more about the gov.
Too bad that garage where Philip and Elizabeth kept their wigs was probably bulldozed 12 years ago to make way for Nats Park.
Regardless of how you feel about the use of this tech by the police in general, racial and gender bias is a troubling issue, especially with Amazon’s algorithm, which they either aren’t testing for bias or aren’t revealing the results of.