A UK police force is testing an AI system called Soze that could help officers quickly solve long-standing cold cases. According to Sky News, the program, developed in Australia, can compress decades of investigative work into just a few hours.

No detailed figures on the system's accuracy have been published, which is an important concern: AI models are known to sometimes output inaccurate information and even fabricate content outright. The trial is being run by Avon and Somerset Police in southwest England, with Soze scanning and analyzing emails, social media accounts, video, financial statements, and other relevant documents.


Remarkably, the system scanned evidence from 27 "complex" cases in about 30 hours, work estimated to be equivalent to 81 years of human effort. Those figures are encouraging, especially given tight staffing and budget constraints, and the force has high hopes for the project.

Gavin Stephens, chair of the National Police Chiefs' Council, said in an interview: "You might have a cold case that seems impossible to solve because of the vast amount of material; a system like this can take it in and assess it with ease." He appears confident about rolling out these AI tools, though making sure they work properly must come first.

Stephens also mentioned another ongoing AI project: building a database of knives and swords, since many suspects have used such weapons to attack victims. However, the error rates and potential biases of AI in law enforcement cannot be ignored. One model used to predict the likelihood that a suspect would reoffend was found to be inaccurate and biased against Black people, evoking Philip K. Dick's story "The Minority Report" and Steven Spielberg's later film adaptation.

AI facial recognition has also led to wrongful arrests, with members of minority groups disproportionately misidentified as suspects. The U.S. Commission on Civil Rights recently criticized the use of AI in policing. There is a common assumption that machine analysis is inherently accurate, but these systems are built on data collected by humans, which can itself be biased and erroneous. The application of AI in law enforcement therefore still requires caution.

Key Points:

1. 🤖 UK police are testing the Soze AI system to help work through cold cases quickly.

2. ⏱️ Soze scanned 27 cases in 30 hours, equivalent to 81 years of human work.

3. ⚠️ The application of AI in law enforcement requires caution due to the risks of bias and errors.