Investigation: crime prediction simply does not work


Police wasted millions of dollars on AI software that turned out to be racist and discriminatory.


Cities across the United States have spent hundreds of millions of dollars on artificial-intelligence software meant to help local police departments predict crime, but very few – in fact, just one department – have shared the results of this experiment.

The results, according to a joint investigation by The Markup and Wired, are shocking: “predictive policing” has failed miserably. Fewer than 100 of the more than 23,600 crime predictions generated by Geolitica, a machine-learning system used by the police department in Plainfield, New Jersey, and reviewed by journalists from the two outlets, turned out to be correct.

That is a success rate of less than half a percent across all predictions generated between 25 February and 18 December 2018.

The Plainfield police department was the only one that agreed to provide information on the matter – the other 38 departments the journalists approached either refused or ignored the request.

The results vary by type of offense: Geolitica correctly predicted about 0.6% of robberies or aggravated assaults and just 0.1% of burglaries, for example.
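For a sense of scale, here is a minimal back-of-the-envelope check of those figures in Python. The counts are the approximate bounds reported above (“fewer than 100” correct out of “more than 23,600” generated), not official totals:

```python
# Rough hit-rate check using the bounds reported by The Markup and Wired.
# These counts are approximations from the article, not official figures.
correct_predictions = 100     # upper bound: "fewer than 100" were correct
total_predictions = 23_600    # lower bound: "more than 23,600" generated

hit_rate = correct_predictions / total_predictions
print(f"Overall hit rate: {hit_rate:.2%}")  # ~0.42%, i.e. under half a percent
```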

The contract cost taxpayers a $20,500 subscription fee for its first annual term, and then $15,500 for a one-year extension. Plainfield PD captain David Guarino told the journalists that because of Geolitica’s poor performance, his officers rarely used the software and the department was going to get rid of it.

Geolitica, formerly known as PredPol, was founded in 2012 as a project of the Los Angeles Police Department and the University of California, Los Angeles. It produced a patented algorithm based on a model originally developed to predict earthquake aftershocks, which was later augmented with AI.
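For the technically curious: the seismology model PredPol drew on is widely reported to be a self-exciting point process, in which each event temporarily raises the expected rate of further events nearby, much as an earthquake triggers aftershocks. The sketch below illustrates that general idea only; the function and all parameter values are hypothetical and are not Geolitica’s actual code:

```python
import math

def intensity(t, past_events, mu=0.1, alpha=0.5, beta=1.0):
    """Conditional intensity of a 1-D self-exciting (Hawkes) process.

    mu    -- baseline event rate (events per unit time)
    alpha -- how strongly each past event boosts the rate
    beta  -- how quickly that boost decays

    All parameter values here are illustrative placeholders.
    """
    # Each past event at time s adds alpha * exp(-beta * (t - s))
    # to the baseline rate mu -- the "aftershock" effect.
    return mu + sum(alpha * math.exp(-beta * (t - s))
                    for s in past_events if s < t)

# Example: three past incidents; the rate is elevated right after them.
print(intensity(t=5.0, past_events=[1.0, 2.5, 4.8]))
```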

In August 2023, another law enforcement software company, SoundThinking, began absorbing Geolitica’s patents, engineers, and customers.

Over the past few years, many media reports have alleged that the software turned out to be ineffective at predicting crime and sometimes even made things worse. The algorithm is reportedly “biased” against people based on their race, ethnicity, and social background, giving officers a free hand for discriminatory treatment and unethical behavior.

Crime prediction has fascinated police officials, software engineers, and psychology researchers since author Philip K. Dick published his 1956 science fiction novella “The Minority Report” – which film director Steven Spielberg turned into a 2002 action drama starring Tom Cruise and Colin Farrell.



