Government digital services are expanding, and interactions between individuals and the state are increasingly mediated by a range of digital processes. We want to ensure these processes are transparent, fair, effective, and non-discriminatory.
In addition, public bodies in the UK are increasingly using algorithms to make decisions across a vast range of areas.
Our work examines broader questions around establishing a robust legal framework to govern the development and application of algorithms, and how people can seek review and redress when things go wrong.
The Tracking Automated Government register
As part of PLP’s campaign for transparency around automated decision making, we have developed an open register to share everything we know about secretive algorithms currently used by the UK Government.
This database contains detailed information about tools used by public bodies such as the Home Office, the Department for Work and Pensions, and the Metropolitan Police, so users can see everything we know in one place.
Explore the register here to help lift the lid on how these systems work and discover the risks for individuals who are affected by their decisions.
Read more in our 2022-25 strategy
How the DWP has failed to deliver on its promise of transparency, despite committing to publish analysis of bias in its automated systems
PLP’s new report compares transparency requirements in Canada, the USA, France, Japan, and the EU to explore how the UK should regulate AI
9 June 2025 @ 9:00 am – 5:30 pm – Is the law keeping pace with technology to protect people’s rights and uphold transparency and accountability standards? How can public, regulatory and human rights law help ensure technologies are deployed for the benefit of society? The event is kindly hosted by Fieldfisher. You can click here to view Sajan Rai’s illustration for the event in […]