Case study

Use case

Building an AI Assistant to Halve Police Case File Prep Time

Company

South Yorkshire Police is one of 43 territorial police forces in England and Wales, serving a population of around 1.4 million across South Yorkshire.

https://www.southyorks.police.uk/
Headquarters
Sheffield
Industry
Public Sector

Backed by the Police Science and Technology Fund (STAR Fund), South Yorkshire Police wanted to explore whether an AI assistant could meaningfully help with case preparation and free up officer time for frontline duties.

The Challenge

Officers spend significant time on case file preparation, which involves repetitive administrative tasks that take them away from frontline duties. Preparing case files requires attention to detail and training, but not police powers. That work pulls officers off the street and reduces operational capacity.

At the time of the project, a third of South Yorkshire’s frontline officers were relatively new and still building experience with case preparation. Pairing them with experienced colleagues improves quality, but it does not scale. An AI assistant that could answer questions and support documentation offered a way to reduce that dependency without lowering standards.

No UK police force had yet tested whether large language models could support case preparation without extensive fine-tuning. There were real concerns about accuracy, hallucinations, and whether the models could reliably handle complex, structured police data.

Security was a core question. Could this be done inside existing police systems, without sending victim information or crime data to external services?

Our Solution

We worked directly with frontline officers to understand how case preparation actually fits into their day-to-day workflow.

Three priorities emerged.

Run securely inside police systems

The system operates entirely within the force’s Microsoft 365 Azure tenancy. No case data leaves that environment. We used the National Enabling Programme blueprint to show that an AI assistant could work without requiring special security exceptions.

Choose the right technology for the job

We selected OpenAI's GPT-4.1 nano model as a practical balance between capability and cost. The surrounding tooling is licence-free and open source, so forces are not locked into a single vendor and can change models as the technology evolves.

Work with existing police systems

Police forces use different records systems. We built a configurable data transformation layer so the assistant could work across them, allowing other forces to adapt the approach to their own data structures rather than rebuilding from scratch.
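To make the idea concrete, a configurable transformation layer can be as simple as a per-force mapping from source fields to a common schema. The sketch below is illustrative only: the field names (`CrimeRef`, `MG11Body`, and so on), the schema, and the function names are invented for this example and are not taken from the project's actual codebase.

```python
# Illustrative sketch of a configurable data transformation layer.
# All field and schema names here are hypothetical examples.

COMMON_SCHEMA = ["case_id", "offence", "statement_text", "recorded_at"]

# Per-force configuration: records-system field -> common schema field.
# A different force would supply its own mapping instead of new code.
FORCE_CONFIG = {
    "CrimeRef": "case_id",
    "OffenceDesc": "offence",
    "MG11Body": "statement_text",
    "EntryDate": "recorded_at",
}

def transform(record: dict, config: dict) -> dict:
    """Map one raw records-system row onto the common schema."""
    out = {target: record.get(source) for source, target in config.items()}
    missing = [field for field in COMMON_SCHEMA if out.get(field) is None]
    if missing:
        raise ValueError(f"Record missing required fields: {missing}")
    return out
```

Because the mapping lives in configuration rather than code, adapting the assistant to another force's records system means editing the mapping, not rewriting the pipeline.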

The prototype can summarise case information, identify key people and roles, build timelines of events, flag inconsistencies across documents and witness statements, and support MG11 statement review. Officers can ask questions in natural language, allowing the assistant to support rather than replace officer judgement.
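As a rough illustration of one of these capabilities, an inconsistency check might assemble witness statements into a single structured prompt for the model. The prompt wording, the statement fields, and the function below are assumptions made for this sketch, not the project's actual implementation.

```python
# Hypothetical sketch: framing an inconsistency check across witness
# statements as a single model prompt. Structure and wording are invented.

def build_inconsistency_prompt(statements: list[dict]) -> str:
    """Assemble witness statements into one prompt asking the model to
    flag contradictions between them, citing statement IDs."""
    parts = [
        "Review the following witness statements and list any "
        "inconsistencies between them, citing the statement IDs.\n"
    ]
    for s in statements:
        parts.append(f"Statement {s['id']} ({s['witness']}):\n{s['text']}\n")
    return "\n".join(parts)
```

Keeping the assistant in this question-and-answer shape, where the model surfaces candidate inconsistencies for an officer to verify, is what lets it support rather than replace officer judgement.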

We also built a disclosure log that records what officers asked, the responses they received, and which model version was used. This provides a clear audit trail for governance, review, and future evidential requirements.
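A disclosure log of this kind can be sketched as an append-only record of each interaction. The function name, file format, and fields below are assumptions for illustration; the source describes only what the log captures (question, response, and model version), not how it is stored.

```python
# Illustrative sketch of an append-only disclosure log.
# Storage format (JSON Lines) and field names are assumptions.
import json
import datetime

def log_interaction(path: str, question: str, response: str,
                    model_version: str) -> None:
    """Append one audit entry recording what was asked, what was
    returned, and which model version produced it."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "question": question,
        "response": response,
        "model_version": model_version,
    }
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")
```

An append-only log like this gives reviewers a complete, ordered trail: each line is one interaction, and recording the model version means any answer can later be traced back to the exact system that produced it.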

The Results

The prototype was able to generate summaries, build timelines, assess evidence, and check for inconsistencies across documents and interviews. Early evidence suggests that, with further development, it could halve the time officers spend on case file preparation. The proof of concept showed enough value to justify continued investment.

Governance considerations quickly shaped the product. Questions about accountability and court readiness directly influenced features like the disclosure log, rather than being treated as afterthoughts.

An additional benefit emerged from the data extraction and transformation layers. Police data engineers identified opportunities to reduce duplication across systems, helping move toward a more consistent source of truth and reducing repeated data entry.

How We Built It

This was a multi-organisation collaboration funded through the STAR Fund. South Yorkshire Police defined the problem and provided operational expertise. Fuzzy Labs built the prototype. Sheffield Hallam University supported the research. Microsoft England helped make the connections that got the project moving.

The intellectual property belongs to policing, not to Fuzzy Labs. The tool is theirs to adapt, extend, and share.

Throughout development, we ran pair-programming sessions and regular check-ins with frontline officers. This helped ensure the assistant solved real problems and fitted naturally into existing workflows.

The technology runs entirely within the police Microsoft Azure tenancy. A configurable data transformation layer handles differences between records systems without requiring full rewrites. The disclosure log provides the audit trail needed to support governance.

Why It Matters

This project shows that large language models can support police case preparation without extensive fine-tuning. They can do so securely inside existing systems, using open-source tooling that avoids vendor lock-in.

None of this automates policing itself. What it does is remove friction from the administrative work that surrounds it.

For police forces facing similar pressures, this work offers a practical reference point rather than a theoretical promise.

How It Could Help You

The project is packaged in a Git repository under policing ownership. It is not yet configured for seamless cross-force sharing, but that is the direction of travel. The intellectual property belongs to UK policing.

The prototype demonstrated clear, practical value. The next step is to move from proof of concept to production-ready tooling, with appropriate guardrails, governance, and deeper integration. If you are working on similar challenges in secure or regulated environments, get in touch.