sonori (edited)

“A computer can never be held accountable, therefore a computer must never make a management decision.”

Even more importantly when it comes to assessing property, machine learning, now referred to as AI, has been continually shown not just to repeat the biases in its training data, but to significantly exaggerate them.

Given how significantly and explicitly race has been used to determine and guide so much property and neighborhood development in the training data, I do not look forward to seeing a system that is not only more racist than a post-war city council choosing where to build new motorways, but which is sold and treated as infallible by the humans operating and litigating it.

Given the deaths and disaster created by the Horizon Post Office scandal, I also very much do not look forward to the widespread adoption of software which is inherently and provably far less accurate, reliable, and auditable than the Horizon software. At least Horizon could only ruin your life if you were a postmaster, not just any member of the general public who isn't rich enough to have their affairs handled by a human.

But hey, on the bright side, if Horizon set a UK legal precedent that any person or property agent is fully and unequivocally liable for the output of any software they use, then after the first few are found guilty for things the procedural text generator they used wrote, people might decide it's not worth the risk.
