ACLU Objects to Amazon’s Facial Recognition Outreach to Police


Andrew Donaldson

Born and raised in West Virginia, Andrew has since lived and traveled around the world several times over. Though frequently writing about politics out of a sense of duty and love of country, most of the time he would prefer discussions on history, culture, occasionally nerding on aviation, and his amateur foodie tendencies. He can usually be found misspelling/misusing words on Twitter @four4thefire.


18 Responses

  1. Chip Daniels says:

    Imagine if we could use this technology to build a facial recognition database of all law enforcement officers so as to monitor and track them all in real time, effectively making undercover work impossible.

    We often forget that once a tool is developed and released into the wild, anyone can put it to use.

  2. Vikram Bath says:

    I have to admit this doesn’t bother me all that much. If the cameras are only placed in public places viewing public places, I have no expectation of privacy there, and don’t really object to either people or machines recognizing me.

    • Chip Daniels in reply to Vikram Bath says:

      I want to agree with this but the recent revelations about social media give me pause.

      I’ve actually argued that social media posts aren’t much different from spouting off your opinions at a pub, i.e., everyone knows your name, occupation, history, and political leanings, so what is the problem?

      I was sort of taken aback at the revelation of how big data and algorithms take all this to a new and unfamiliar level.

      I wasn’t prepared for how the damage was done: not by government agents learning how I voted, but by the weaponization of our fears and hatreds.

      No one expected this, not the ACLU or privacy advocates. They were like generals fighting the last war, manning the Maginot Line of digital privacy, unprepared for the blitzkrieg of social interaction that was guided and exploited by the harvesting of seemingly innocuous data.

      So while I am not worried that the LAPD will see me at Starbucks, the harvesting of yet one more piece of data about Chip Daniels, added to my purchases, blog comments at Ordinary Times, web searches, social connections, and the DNA of my relatives, gives power over my life to a degree that unsettles me.

      Because the one big takeaway from our digital life should be that data itself is power. Even stupid, trivial bits of data can be turned into weapons.

  3. Jaybird says:

    If you haven’t watched Person of Interest, you seriously should check it out.

  4. Aaron David says:

    Bob Arctor laughs at this problem.

  5. InMD says:

    That’s it, we’re going off the grid.

  6. Road Scholar says:

    I agree with @vikram-bath here. The existence and use of a technology, per se, is rarely if ever the issue. It’s precisely how it’s used that can be troubling. We’ve always had a superb face-recognition system at our disposal: the human eye and brain. So if it’s okay for a human LEO to scan a public space for a criminal suspect by eye, I fail to see how it’s some horrible imposition on privacy to do the same thing using technology.

    It’s my belief, which I trust is supported by law and custom, that privacy is NOT equivalent to anonymity. The general right to privacy isn’t about who you are but about what you’re up to. So a FR system that scans public areas and alerts on a legitimate subject of interest is fine. What’s not fine is using such a system to build up a database of the movements and activities of the general populace. That would be no less objectionable were LE to do it using notes generated by hand; the tech just makes such a system more feasible.

    • Jaybird in reply to Road Scholar says:

      The existence and use of a technology, per se, is rarely if ever the issue. It’s precisely how it’s used that can be troubling.

      I assume that someone like Donald Trump will have it at his disposal when I measure the whole “precisely how it’s used” thing.

      • InMD in reply to Jaybird says:

        It’s like everything else the government and law enforcement are doing with technology. They’re building the architecture of a police state. Building the architecture and actually instituting one are two separate things, but it seems like the proportion of capability to safeguard is hugely lopsided.

        • Dark Matter in reply to InMD says:

          the proportion of capability to safeguard is hugely lopsided.

          Depends on what privacy standard we’re using. 20 years ago? Hell yes. Current? Maybe not.

          Everyone in a certain (increasing) age/income range carries a video recorder and computer, and puts an absurd amount of “personal” data online. All cops, and encounters with cops, either are or will be monitored.

          Similarly, society’s standards for all sorts of other things are changing. BLM’s critics are presumably correct in claiming that the vast bulk of police interactions are fine (and we’ve gotten better) and that they’re focusing on a tiny sliver of corner cases, but the ability to video record everything means we can focus on those. Similarly, I doubt we could get away with deliberately setting hundreds of thousands of civilians on fire today, because social media would make what that means vivid on every computer screen.

          Since this is all technology-driven, in theory the government knowing far more about us will be more than balanced by our knowing far more about the government.

  7. Kolohe says:

    Too bad that garage where Philip and Elizabeth kept their wigs was probably bulldozed 12 years ago to make way for Nats Park.

  8. Road Scholar says:

    Regardless of how you feel about the use of this tech by the police in general, racial and gender bias is a troubling issue. This is especially true of Amazon’s algorithm, which the company either isn’t testing for bias or isn’t revealing the results for.
