Public Law Project has initiated a legal challenge over a secretive and potentially discriminatory algorithm used by the Home Office to target people for ‘sham marriage’ investigations.

The move comes after the Information Tribunal agreed in a separate ruling earlier this month that “there will be some indirect discrimination” and “potential bias” in the algorithmic system the Home Office is using.

Information revealed by PLP’s investigations shows that the Home Office uses an automated ‘triage’ tool to decide whether couples planning to get married should be subject to a ‘sham marriage’ investigation. The triage tool categorises couples as ‘pass’ or ‘fail’.

PLP’s Legal Director Ariane Adam said: “Couples who fail face invasive and unpleasant investigations and can have their permission to marry delayed without even being told that a machine was involved in the decision-making process. Home Office data show that the triage tool fails certain nationalities at disproportionate rates that are inconsistent with their contribution to migration in the UK.

“The information available demonstrates prima facie indirect nationality discrimination, with some nationalities, including Greeks, Bulgarians, Romanians and Albanians, disproportionately failing triage. It also suggests that there is no manual review in every ‘fail’ case. If that is in fact the case, the operation of the tool would be unlawful and would not conform to the Home Office’s own policy. The Home Office’s refusal to be transparent about the triage tool may also violate data protection obligations.”

The legal grounds for challenge, outlined in PLP’s pre-action letter to the Home Office, are that:

  • The outputs of the triage tool appear to indirectly discriminate on the basis of nationality.
  • The Home Office does not appear to have discharged its Public Sector Equality Duty to take steps to eliminate unlawful discrimination and to advance equality of opportunity. The courts have established that this duty is more demanding when using novel digital systems.
  • Home Office secrecy about the system breaches transparency rules under the GDPR.
  • If there is not always a human/manual review of ‘fail’ cases which trigger an investigation, this would:
    • go against published Government policy, and
    • place the Home Secretary in breach of section 48 of the Immigration Act 2014, by delegating to a machine-learning algorithm a decision that is for her, through her officials, to make.

Ariane Adam said: “New technology can achieve greater efficiency, accuracy, and fairness in Government decision-making. But if the computers merely replicate biased information, then all they are doing is making prejudicial and unfair decisions faster than humans can.”

Nationality discrimination

Profiling by nationality is not permitted without ministerial authority. Based on the information publicly available, the Home Office has not sought such authority.

There is no evidence, other than the outcomes of the triage tool itself, that the nationalities disproportionately affected are more likely to be involved in sham marriages. Where there is prima facie discriminatory impact, the Home Office must justify it by demonstrating that it is a proportionate means of achieving a legitimate aim. No publicly available documents demonstrate that such a justification has been made.

Background – lack of transparency

PLP initiated the pre-action stages of a judicial review by sending a pre-action protocol letter to the Home Office on 16 February 2023. It hopes the matter can be resolved without litigation and has asked the Home Office to consider alternative dispute resolution.

This action follows a separate appeal by PLP to the First-tier Tribunal (Information Rights) against the decision of the Information Commissioner’s Office not to require the Home Office to disclose the criteria used by the algorithm.

The appeal was heard on 6 January 2023. Although the Tribunal decided against PLP on disclosure of the criteria, it recognised the potential for bias and noted that the apparent discriminatory effect of the Home Office’s use of the algorithm could be challenged by way of judicial review.

In its judgment, the FTT panel accepted that “there will be some indirect discrimination,” noting that specific nationalities may be more vulnerable to the processes involved.  

PLP will be seeking permission to appeal the decision of the FTT to the Upper Tribunal.

Ariane Adam said: “Without transparency, there can be no accountability. Home Office secrecy is preventing individuals from understanding how decisions that impact one of the most intimate moments of their personal lives are made.

“Without that understanding, they are unable to seek remedy when things go wrong, such as when they are impacted by unlawful discrimination.”

Tracking Automated Government

The Home Office algorithm is one of more than 40 automated decision-making systems (ADMs) detailed on PLP’s Tracking Automated Government (TAG) register, launched this month to improve transparency around automated systems used in public decision-making across a range of sensitive policy areas.

The register details algorithms that affect how people are policed, what benefits they receive, and their immigration status. It features tools people may have heard of, such as those used for A-level results, police facial recognition, and the Gangs Matrix, as well as many others that are not well known.