San Francisco Says It Will Use AI To Reduce Bias When Charging People With Crimes
San Francisco is announcing a “bias mitigation tool” that uses basic AI techniques to automatically redact information from police reports that could identify a suspect’s race. It’s designed to keep prosecutors from being influenced by racial bias when deciding whether to charge someone with a crime. The tool is scheduled to be implemented on July 1st.
The tool will strip out not only descriptions of race but also descriptors like eye color and hair color, according to the SF district attorney’s office. It also removes names of people, locations, and neighborhoods that might consciously or unconsciously tip off a prosecutor that a suspect is of a certain racial background. Read more on: The Verge
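The redaction described above can be sketched in miniature. The descriptor list, name list, and placeholder below are illustrative assumptions for this sketch; the DA office’s actual tool and its lexicon are not public, and a production system would use far more sophisticated entity recognition than simple pattern matching.

```python
import re

# Hypothetical descriptor list -- illustrative only, not the real tool's lexicon.
DESCRIPTORS = [
    "black", "white", "hispanic", "asian",
    "blue eyes", "brown eyes", "blond hair", "black hair",
]

def redact(report: str, names: list[str]) -> str:
    """Replace race-correlated descriptors and supplied names/places
    with a neutral placeholder, case-insensitively."""
    redacted = report
    for term in DESCRIPTORS + names:
        redacted = re.sub(re.escape(term), "[REDACTED]", redacted,
                          flags=re.IGNORECASE)
    return redacted

print(redact("Suspect is a white male with blue eyes, seen near Bayview.",
             ["Bayview"]))
# → Suspect is a [REDACTED] male with [REDACTED], seen near [REDACTED].
```

In practice, names and neighborhoods would come from a named-entity recognizer rather than a hand-supplied list, which is presumably where the “basic AI techniques” the announcement mentions come in.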