Most police departments don't have the resources to sift through all their body-cam footage, so the vast majority of it goes unreviewed. According to Axon, a company...
So fix that. Don’t build an AI to dole out justice against police like some messed-up lottery. This is such a hollow solution in my mind. AI struggles to identify a motorcycle, yet people expect it to identify abuse?
Until you realize that the people who make the final call on whether something the AI flagged actually went too far are the exact same people making that call now. All we’ve succeeded in doing is creating a million-dollar system that makes it look like they’re trying to change.
The people most likely to be abused by police are the least likely to be able or willing to file a formal complaint.
Were it so simple, it would have been fixed decades ago. The difference is that having AI review the footage is actually feasible.