On 9 February 2023, the Public Law Project (PLP) launched a database revealing details of 41 secretive algorithms used by Government to make or inform decisions on a range of sensitive policy areas, including how people are policed, what benefits they receive, and their immigration status.

As of October 2023, the register details 55 automated tools.

Researchers at PLP created the Tracking Automated Government (TAG) register to help increase transparency and accountability of automated decision making systems (ADMs) used by Government.

How it works

  • The TAG register has detailed information on over 40 known ADMs used by Government, including by departments such as the Home Office, the Department for Work and Pensions, the Ministry of Justice and the Ministry of Defence; regulators such as Ofqual; police constabularies, including the Metropolitan Police; and several local authorities.
  • Users can search by public authority to see what tools they are using (that we know about), how the tools work, which factors they rely on, and what is known about their potentially unequal impact.
  • The TAG register flags where there is a risk of discrimination and where no Data Protection Impact Assessment or Equality Impact Assessment has been done, so users can see where there are risks of data misuse or unequal outcomes.

Who is it for?

The register is designed to be useful for lawyers, researchers and journalists wanting to investigate or document the impact of ADMs, and for individuals affected by their decisions. If someone already knows or suspects that an automated system is being used, the TAG register gives them better tools to investigate further and, if necessary, to challenge that system.

Legal Director of the Public Law Project, Ariane Adam said:

“Until recently, people have shared the expectation that important Government decisions affecting them are made by humans. But as the TAG register shows, many are made or informed by computers, some of which are programmed in a way that could exacerbate existing inequality and discrimination, yet so little is known about them.

“We developed the TAG register to lift the lid on what’s going on. The register is open access so that anyone can see whether decisions about them are made by a robot instead of a human, and if they might be discriminatory or unlawful. The Government’s initiative to increase transparency – the Algorithmic Transparency Recording Standard – has explained only six ADMs in current use. Our TAG register details over 40.

“Although ADMs can make Government more efficient and save money, when they process the data of millions of people to do things like stop their benefits for months, or place them on police watch lists, the risk of these systems unlawfully discriminating and worsening existing inequalities is huge.

“The use of ADMs is often only discovered when strange patterns come to light, such as Black people being incorrectly identified by facial recognition, or disabled people being disproportionately targeted for benefit fraud investigations. Individuals are often unable to document these patterns and may be unaware that an ADM lies behind them. It shouldn’t be this way.

“If you don’t know that a system exists, it is impossible to know if there is an unlawful or discriminatory process at work, and you can’t challenge it to put it right. The TAG register flags where there are risks of unlawfulness and, by knowing about that, individuals can hold the state to account when their rights are affected.

“The Government needs to be forthright about how and where ADMs are in use, but it has chosen secrecy by default. That is why we need the TAG register.”