A new report by the Justice and Home Affairs Committee (JHAC) includes recommendations from Public Law Project on how to ensure new technologies are applied fairly and lawfully within the justice system.

We’re pleased that many of our concerns around the use of algorithmic decision making by government bodies are reflected in this report, and we are encouraged to see the JHAC actively engaging with the fair and lawful application of technology in the justice system.

Transparency

We’re encouraged that the JHAC agrees with our emphasis on transparency as the essential first step in understanding and evaluating the use of new technologies in law enforcement. Citing PLP’s evidence that opacity is a “major challenge”, the report acknowledges that “evaluation is not possible without transparency.” Transparency is one of the key principles to be embodied in the new statute the JHAC recommends.

Without transparency, affected individuals cannot hold the state to account or prevent malfeasance. Nor can there be proper debate and consensus-building around the use of new technologies in the public interest.

As the JHAC recognises, the Cabinet Office’s pilot Algorithmic Transparency Standard (ATS) is a step in the right direction. But the next iteration of the ATS must go further. The JHAC calls for an independent body to manage the ATS and enforce mandatory and meaningful provision of information to it. PLP joins this call.

Further, the JHAC notes our suggestion that executable versions of the tools themselves should be disclosed. PLP considers that, as it stands, the level of detail to be provided to the ATS is insufficient. Read our feedback on the ATS.

Accountability

The JHAC notes PLP’s tripartite definition of accountability: “responsibility” (who can be “praised, blamed, and sanctioned”); “answerability” (who can be called to explain decisions); and “sanctionability” (who can be subject to sanctions ranging from “social opprobrium to legal remedies”).

To improve accountability, the report recommends:

  • An independent national body to govern the use of new technologies in the application of the law, established on a statutory basis and with its own budget. As well as being responsible for the ATS, this body would set minimum standards, certify every new technological solution against these standards, and carry out regular audits into their use.
  • Primary legislation which embodies general principles, and which is supported by detailed regulations.

We welcome these recommendations. However, we remain concerned by unsubstantiated claims that new technological tools provide efficiencies. Such claims must be backed up by a sound evidence base, and the use of new technologies requires careful, independent evaluation.

Promoting meaningful engagement with algorithmic outputs

The JHAC notes PLP’s concerns around the potential for automation bias to undermine meaningful human engagement with the outputs of automated decision making (ADM) systems. It acknowledges that “there is a significant and worrying body of evidence that the users of advanced technologies are in many cases failing to engage, in a meaningful way, with the output of automated processes. Outputs may be overrated or misinterpreted, and challenge smothered, with potentially significant adverse consequences for individuals and litigants.”

We endorse the JHAC’s recommendations for research and training to promote meaningful engagement with algorithmic outputs and combat automation bias.

Human rights and the rule of law

While ADM offers potentially significant benefits – it can be fast, cheap, accurate, and consistent – the impact of flawed systems can be devastating. We’re pleased to see these concerns acknowledged in the JHAC’s assessment:

“The use of advanced technologies in the application of the law poses a real and current risk to human rights and to the rule of law. Unless this is acknowledged and addressed, the potential benefits of using advanced technologies may be outweighed by the harm that will occur and the distrust it will create.”

We’re pleased to see many of our recommendations on ensuring automated decision making operates fairly and lawfully in the justice system reflected in this report. While this is a promising development, important steps must now be taken to apply these principles and build a system of automated decision making that is free from discrimination.