c/ethical-frontiers • the_lee • 20d ago

I finally got my town to pause a facial recognition pilot by showing them the error rates

Our local police in Springfield quietly started a six-month test with ClearView AI last fall, and the public notice was buried in a budget doc. I pulled the company's own white paper showing a 35% higher error rate for people with darker skin and printed fifty copies for the council meeting. After three meetings of pushing, they voted to stop it until they could write proper rules. Has anyone else had to fight a tech rollout with just the vendor's own bad data?
3 comments

taylorshah • 20d ago
A 35% higher error rate still means the system works most of the time. Pausing the whole pilot over one stat feels like letting perfect be the enemy of good. This tech helps find missing people and solve crimes faster. We should be fixing the rules while using it, not stopping completely. Holding up progress over a single metric can do more harm than good.
2
sage308 • 19d ago
That logic falls apart when the errors lead to real harm. We saw a similar rollout get pulled after wrongful detentions. Sometimes you have to stop and fix the thing before it does more damage.
4
phoenixw11 • 19d ago • Most Upvoted
Reminds me of when our library tried self-checkouts that kept flagging kids' books as overdue.
1