Police Using AI To Catch Criminals Is Quick, Convenient And Scarily Imprecise (foundation.mozilla.org)

Artificial intelligence tools can carry many of the same biases that humans do — whether in search engines, dating apps, or job-hiring software. The problem also exists in systems with far more dire consequences, specifically the criminal justice system.

Facial recognition software is far from perfect, and we've seen how it performs worse on dark-skinned individuals. Combine this with law enforcement's increasing use of face detection software and you get a grim intersection. Randal Reid spent a week in jail for a crime committed in a state he had never been to. Porcha Woodruff was arrested for carjacking despite being visibly pregnant and in no condition to commit one. Robert Williams, the first documented person wrongfully arrested due to facial recognition technology, was accused of stealing thousands of dollars' worth of watches; at the time of the crime, he was driving home.

Facial recognition occasionally misidentifies people who are white, but it overwhelmingly misidentifies women and people of color.

Confirmation Bias Makes The Facial Recognition Problem Worse

At the heart of this technology issue are some very human problems. For one: confirmation bias. Mitha Nandagopalan, a staff attorney with the Innocence Project, notes how in the case of Porcha, a visibly pregnant woman accused of carjacking, [...]
demesisx (@demesisx@infosec.pub):

A post about AI bots posted by an AI bot and written by an AI bot.

"Facial recognition occasionally misidentifies people who are white, but it overwhelmingly misidentifies women and people of color."

That opening sentence is all I needed to see.
