“The algorithms behind automated decision-making are often highly complex and can be virtually impossible for lay people to understand – so human oversight is crucial. That is why the government’s current proposal to get rid of Article 22 of the General Data Protection Regulation (GDPR) is so concerning.”


Writing in Prospect magazine, PLP Research Fellow Tatiana Kazim unpacks why the plan to scrap Article 22 of the GDPR, part of the Department for Digital, Culture, Media and Sport’s (DCMS) proposed overhaul of data protection law, raises serious concerns around discrimination and access to justice.

Article 22 is essential for ensuring human oversight of public decision-making: without it, potentially life-changing decisions could be made solely by computers. Errors that lead to detrimental outcomes for individuals could go unnoticed, including incorrect or discriminatory data being fed into a system in a way that reinforces existing biases.

Alongside the proposal to remove Article 22, other concerning reforms suggested by DCMS include:

  • Removing the requirement for organisations to undertake Data Protection Impact Assessments 
  • Limiting people’s ability to find out how their data is being used

These would both erode the transparency, accountability and protections currently enshrined in data protection law.

Read the full article

Read PLP’s consultation response on the DCMS’ proposed data protection law overhaul