Kevin O’Sullivan, a key figure in digital innovation, spoke at the recent Futurescot Digital Justice and Policing conference in Glasgow about using artificial intelligence to ease the strain on Scotland’s courts. With backlogs piling up due to a flood of digital evidence and heavy administrative tasks, AI offers a practical way to speed up processes and let staff focus on what really counts.
Scotland’s justice system deals with over 40,000 pending criminal trials as of late 2025, and delays can stretch to two years for serious cases. The talk highlighted how targeted AI tools can cut through the red tape without replacing human oversight.
Pressures Mounting in Scotland’s Courts
Scotland’s justice sector faces tough challenges from rising caseloads and outdated systems. Prison overcrowding hit a record 8,430 inmates in October 2025, exceeding capacity by more than 600 people, which ties directly into court delays. The Scottish Courts and Tribunals Service reports a drop in new cases but still struggles with a backlog that peaked at over 43,000 scheduled trials earlier this year.
Administrative loads have spiked as digital evidence from phones, videos, and social media pours in. Teams spend hours reviewing data, summarizing reports, and handling routine paperwork, slowing down everything from arrests to sentencing. Victim Support Scotland notes that waits of up to two years for trials leave people in limbo, adding to the emotional toll on victims and witnesses.
Recent government efforts include allocating nearly one billion pounds for a new prison in Glasgow, set for 2028, but experts say tech like AI must step in now to handle the daily grind. This pressure mirrors trends across the UK, where Crown Court backlogs reached a new high of over 70,000 cases in September 2025, with waits up to four years in some areas.
Two Types of AI Making a Difference
Experts divide AI into knowledge-based and action-based types, each playing a role in easing justice system woes. Knowledge-based AI acts like a smart assistant, pulling from vast data to boost individual work. Tools such as Microsoft Copilot or ChatGPT help draft legal notes or code, leveling the playing field for teams. A Harvard study from earlier this year found these tools narrow gaps between top and average performers by up to 40 percent in complex tasks.
Action-based AI goes further by handling actual work steps. These agents follow rules, process data, transcribe files, and even decide next actions, like triaging requests or generating documents. In justice settings, they can sift through evidence logs or flag issues in statements, freeing staff for deeper analysis.
| AI Type | Key Features | Justice System Benefits |
|---|---|---|
| Knowledge-Based | Advises on tasks, improves reasoning | Speeds up drafting and research, reduces errors in reports |
| Action-Based | Automates workflows, integrates with systems | Handles routine admin, cuts backlog time by processing evidence faster |
This split helps organizations pick the right tool for the job, starting with simple aids before scaling to full automation.
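The action-based pattern described above can be sketched as a small rule-driven triage step. The following is a minimal illustration in Python; the route names are hypothetical, and a keyword lookup stands in for what would be a language-model call in a real deployment:

```python
# Minimal sketch of an action-based triage agent (hypothetical routes,
# not Storm ID's implementation). A real system would call a language
# model to classify the request; a keyword lookup stands in here.

ROUTES = {
    "evidence": "digital-evidence-team",
    "disclosure": "disclosure-unit",
    "witness": "witness-liaison",
}

def triage(request_text: str) -> str:
    """Route an incoming request to a work queue based on its content."""
    text = request_text.lower()
    for keyword, queue in ROUTES.items():
        if keyword in text:
            return queue
    return "general-inbox"  # fall back to human triage

print(triage("New digital evidence uploaded for case 4471"))
```

The key design point is the fallback: anything the agent cannot classify lands back with a person, which is how low-stakes automation keeps humans in the loop.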
Storm ID Leads with Action-Based Solutions
Storm ID, a Scottish tech firm, showcased how action-based AI fits into justice workflows at the October 23 conference in Glasgow. The firm builds custom setups on clouds like Azure, low-code agents via Microsoft Copilot Studio for Microsoft 365 users, and private solutions that keep data secure on-site.
In healthcare and local government projects, these tools categorize emails, fill in details, and route cases; the same patterns translate well to policing and courts. For instance, AI can transcribe interviews or extract key facts from videos, cutting manual review time. Storm ID stresses explainability, ensuring users can see how decisions are formed, which builds trust.
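The transcript-review step can be illustrated with a small indexing sketch. The transcript format here (`[HH:MM] Speaker: text`) is an assumption for illustration, not a format used by Storm ID:

```python
import re

# Hedged sketch: pulling speaker and timestamp entries out of an
# interview transcript so reviewers can jump to key moments rather
# than reading linearly. The line format is a hypothetical example.

LINE = re.compile(r"\[(\d{2}:\d{2})\]\s+(\w+):\s+(.*)")

def index_transcript(transcript: str) -> list[dict]:
    """Return a list of {time, speaker, text} entries for matching lines."""
    entries = []
    for line in transcript.splitlines():
        m = LINE.match(line.strip())
        if m:
            entries.append(
                {"time": m.group(1), "speaker": m.group(2), "text": m.group(3)}
            )
    return entries

sample = """[10:02] Officer: Can you describe the vehicle?
[10:03] Witness: A blue van, parked outside."""
print(index_transcript(sample))
```

Even this trivial index shows the payoff: hours of audio become a searchable table of who said what, when.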
Their work aligns with broader UK moves, like the Ministry of Justice’s July 2025 AI Action Plan, which aims to use tech for faster courts and better outcomes. Police in Scotland plan to deploy AI for spotting crime hotspots, informing officer placements based on trends.
Private AI Emerges as a Secure Choice
Private AI stands out for handling sensitive data without cloud risks. It runs models inside an organization’s network, controlling access, logs, and costs. Upfront setup costs more, but per-task expenses drop sharply for high volumes, making it ideal for justice where privacy laws are strict.
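That cost trade-off can be made concrete with a simple break-even calculation. All figures below are illustrative assumptions, not vendor pricing:

```python
# Illustrative break-even point for private vs cloud-hosted AI.
# Every number here is a hypothetical assumption for the sketch.

private_setup = 250_000.0   # one-off hardware and deployment cost
private_per_task = 0.002    # marginal cost per task once running
cloud_per_task = 0.05       # pay-as-you-go cost per task

def break_even_tasks(setup: float, private_cost: float, cloud_cost: float) -> float:
    """Number of tasks after which private hosting becomes cheaper overall."""
    return setup / (cloud_cost - private_cost)

tasks = break_even_tasks(private_setup, private_per_task, cloud_per_task)
print(f"Private hosting pays off after about {tasks:,.0f} tasks")
```

Under these toy numbers the crossover sits in the millions of tasks, which is exactly the high-volume regime the article describes for a national justice system.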
This approach brings AI to the data, avoiding transfers that could breach rules. In 2025, with data leaks making headlines, private setups gain traction. Storm ID’s demos showed how they maintain full oversight, vital for evidence that could sway trials.
Experts warn of AI pitfalls, like fake citations in court docs seen in UK cases this year, but private systems with built-in checks reduce those dangers. The Bar Council urged quick action in June 2025 to guide lawyers on safe use, emphasizing human verification.
Spotlight on Storm AI Workbench
At the conference, Storm ID demoed the Storm AI Workbench, a self-hosted platform for secure team projects. Users set up workspaces blending public laws with private case files, then query AI for summaries or timelines.
In a policing example, it analyzed statements and transcripts to create executive summaries with time stamps, extracted people and roles, built inconsistency-flagged timelines, and drafted prosecution reports. It even checked evidence against legal standards, noting needs for corroboration.
Humans stay central, reviewing outputs with confidence scores and reasoning logs. This keeps things credible while speeding up prep for cases bogged down in admin.
Key outputs from the demo:
- Concise summaries with time references in 24-hour format.
- Source-linked timelines of events.
- Draft reports assessing proof and disclosure.
- Flagged inconsistencies for quick human checks.
Such tools could slash review times from days to hours, directly tackling backlogs.
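The inconsistency-flagging step can be illustrated with a toy timeline check. The event structure, rule (two sources placing one person in different places at the same time), and confidence score are assumptions for the sketch, not the Workbench's actual logic:

```python
# Toy sketch of timeline inconsistency flagging, with a placeholder
# confidence score attached so a human reviewer can prioritise checks.
# Event structure and scoring are hypothetical, not Workbench internals.

events = [
    {"time": "21:15", "person": "Driver", "place": "petrol station", "source": "CCTV"},
    {"time": "21:15", "person": "Driver", "place": "car park", "source": "witness A"},
    {"time": "21:40", "person": "Driver", "place": "car park", "source": "witness B"},
]

def flag_inconsistencies(events: list[dict]) -> list[dict]:
    """Flag pairs of events that place one person in two places at once."""
    flags = []
    for i, a in enumerate(events):
        for b in events[i + 1:]:
            if (a["time"] == b["time"] and a["person"] == b["person"]
                    and a["place"] != b["place"]):
                flags.append({
                    "time": a["time"],
                    "person": a["person"],
                    "sources": [a["source"], b["source"]],
                    "confidence": 0.9,  # placeholder score for reviewer triage
                })
    return flags

for flag in flag_inconsistencies(events):
    print(flag)
```

The flag does not decide anything; it only surfaces the conflicting sources with a score, leaving the judgment call to the reviewer, which matches the human-in-the-loop model described above.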
Steps to Safely Roll Out AI in Justice
Adopting AI requires careful planning to avoid missteps. Start with narrow, high-impact tasks like summarizing evidence, where results show clear gains. Always keep humans reviewing outputs to catch errors and ensure fairness.
Design systems flexibly, swapping models as tech improves, since the field evolves fast. Pair tech with training and new processes to help teams adapt. Build trust gradually, moving from supervised low-stakes work to more independent use.
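The advice to swap models as technology improves maps to a thin adapter layer in code. A hedged sketch using Python's `typing.Protocol`, with hypothetical class names and a trivial stand-in model:

```python
from typing import Protocol

# Sketch of a swappable model interface: workflow code depends only on
# the interface, so a team can replace the underlying model without
# rewriting workflows. Class names are hypothetical examples.

class Summariser(Protocol):
    def summarise(self, text: str) -> str: ...

class KeywordSummariser:
    """Stand-in model: returns the first sentence as the 'summary'."""
    def summarise(self, text: str) -> str:
        return text.split(".")[0] + "."

def prepare_case_note(model: Summariser, statement: str) -> str:
    # The workflow never names a specific model, only the interface.
    return "SUMMARY: " + model.summarise(statement)

print(prepare_case_note(KeywordSummariser(), "The van was blue. It left at nine."))
```

Swapping in a stronger model later means writing one new class, not reworking every workflow that consumes summaries.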
In Scotland, the Crown Office and Procurator Fiscal Service plans to highlight AI at future events, focusing on ethics and bias training. This mirrors UK-wide pushes for proportionate AI in courts, as outlined in August 2025 strategies to cut delays.
AI’s Role in a Modern Justice Future
AI won’t take over judgments but will enhance them, letting professionals tackle complex issues amid resource crunches. For Scotland’s overwhelmed system, it promises quicker resolutions and fairer access, turning digital evidence mountains into actionable insights.
From automating redaction to aiding risk assessments, these tools drive real change. As the Futurescot event showed, the tech exists; now it’s about smart deployment for a justice system that serves everyone better.
Share your thoughts on AI in courts below, and spread the word if this sparks ideas for reform.
