
Lying, corrupt, anti-American cops are running amok with AI
Jul 26, 2021

Hundreds of thousands of law enforcement agents in the US have the authority to use black-box AI to conduct unethical surveillance, generate evidence, and circumvent our Fourth Amendment protections.

The problem is that black-box AI systems are a goldmine for startups, big tech, and politicians.

And, since the general public is ignorant of what these systems do or how they’re being used, law enforcement agencies have carte blanche to do whatever they want.

And these systems are as easy to use as Netflix or Spotify.

Cops are often offered these systems directly by the vendors as “trials” so they can try them before deciding whether to ask their departments to adopt them at scale.

These include facial recognition systems that don’t work for Black faces, predictive-policing systems that allow cops to blame the over-policing of poor minority communities on the algorithm, and niche services whose only purpose is generating evidence.

Predictive-policing systems are among the most common unethical AI tools used by law enforcement.

But, as we all know, you can’t predict when or where a crime is going to happen.
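To make the circularity concrete, here is a minimal sketch in Python of the feedback loop these systems create. The neighborhoods, crime rates, and arrest counts are hypothetical, and this is not any vendor’s actual model: a predictor trained on arrest records sends patrols wherever arrests have already happened, and because arrests can only happen where officers are patrolling, the “hotspots” end up mirroring past policing rather than actual crime.

```python
# Minimal feedback-loop sketch (hypothetical numbers, not any real system):
# a "predictive" model trained on arrest counts sends patrols to the
# neighborhood with the most past arrests. Arrests can only happen where
# officers patrol, so the predictions mirror patrol history, not crime.
import random

random.seed(42)

true_crime_rate = {"A": 0.10, "B": 0.10}  # identical underlying crime
arrests = {"A": 10, "B": 5}               # A starts with more recorded arrests

for day in range(365):
    # The "prediction": rank neighborhoods by historical arrest counts.
    hotspot = max(arrests, key=arrests.get)
    # Patrols go to the predicted hotspot; an arrest can only occur there.
    if random.random() < true_crime_rate[hotspot]:
        arrests[hotspot] += 1

print(arrests)  # A accumulates arrests all year; B stays frozen at 5
```

Under these assumptions the model never learns that B is identical to A, because it never collects data there; the extra patrols in A are then justified by the very arrest counts those patrols produced.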

Anyone who says these systems can predict crime is obviously operating on faith alone, because nobody can explain why a black-box system generates the output it does, not even the developers who created it.

Simply put: any time an AI system used by law enforcement can, in any way, affect an outcome for a human, it’s probably harmful.

That’s because these are black-box systems that nobody can explain, and that no legal department will defend.

Take AI gunshot detection: gunshots don’t go undetected if there are people around, and a gun fired in any given area of Chicago would be heard by tens of thousands of people.

Those witnesses can describe what they heard, when they heard it, and explain to a jury why they thought what they heard was or was not a gunshot.

If we find out that prosecutors told them to say they heard a gunshot, and they then admit in court that they lied, that’s called perjury, and it’s a crime.

The reason there’s so much unethical cop AI is that it’s incredibly profitable.

You don’t need a predictive algorithm to understand, historically speaking, what happens next.
