Published: 1st December 2022

A Public Accounts Committee (PAC) report released this month has said that a lack of transparency over the Department for Work and Pensions' (DWP) use of data analytics risks eroding public trust. Whilst PLP welcomes the report's recognition of the importance of transparency, the PAC's central recommendation that the DWP should submit an annual report to Parliament does not go far enough.

The PAC report follows the publication of the DWP's accounts earlier this year, which disclosed that a machine learning algorithm was being used to flag benefit payments as potentially fraudulent and to stop those payments before they were made. Although the system was a pilot, the DWP planned to roll it out without having published an analysis of its equality impacts and despite concerns about potential discrimination.

PLP's Legal Director Ariane Adam said:

“In the hands of public bodies, automated decision-making (ADM) is a tremendously powerful tool. The number of people subject to automated decision-making systems is ever-growing, and yet the picture as to what systems are in use, how they work, and whether they work reliably, lawfully and fairly, remains obscure and unclear.

“The rules about how these tools are regulated are still being written and so it is vital that Government hears and listens carefully to what the PAC has said – public trust will be lost if the use of ADM is not transparent.

“Whilst that message is welcome, the Committee's central recommendation that the DWP submits an annual report to Parliament falls short of what is required for robust monitoring, evaluation and accountability around the use of ADM.

“As PLP said in evidence to the PAC, the Department should notify individuals when an automated tool has been used to make a decision that affects them, and should publish plans for a redress mechanism for cases where such a tool has contributed to an unlawful or unfair decision.”

Other recommendations PLP made to the Committee included:

- Public sector organisations should have statutory transparency obligations in respect of all automated tools involved in decision-making with potential societal effect.
- It should be compulsory for organisations to monitor the operation of each tool and to update the information disclosed periodically and/or whenever there is a significant change.
- A high-level explanation of how an ADM tool works should be disclosed, including the rules or criteria it uses. Sufficient detail should be provided so that individuals whose rights may be affected can fully understand how the process works.

PLP's written evidence to the PAC inquiry is available here.

Recommendations

In respect of each of its automated tools, we recommend that the Department should disclose:

- the colloquial name of the tool;
- who owns and has responsibility for the algorithmic tool;
- who developed the tool;
- what the tool is used for;
- any criteria or rules applied by the tool (unless unavailable because it is a machine-learning tool);
- an executable version of the tool;
- any data used to train the tool and/or determine its rules or criteria;
- any equality impact assessments (EIAs), data protection impact assessments (DPIAs), or other evaluations;
- risks and mitigations.

PLP recommends that the Committee should ask the Department to clarify which three groups have been subject to the ‘fairness analysis’ and what the results of this analysis were. Once the analysis is complete, the Department should publish the results. Further, the Department should consult with stakeholders – especially those likely to be processed by the new model – before the model is deployed in any new context.

PLP recommends that the Department should publish EIAs and DPIAs in relation to its automated decision-making tools (including automated tools that aid a human decision-maker by providing a recommendation, risk score, or similar).
PLP recommends that the Committee builds on its 2021 recommendation that the Department ‘should do more to understand the impact that both overpayments and underpayments have on claimants and ensure that vulnerable claimants are treated with care when dealing with error on the claim’ to recommend that the Department:

a. improve its data on deductions and protected characteristics;
b. build in proactive consideration of affordability and vulnerability prior to deciding (i) whether to apply deductions and (ii) the rate of recovery;
c. improve the accessibility of financial hardship decisions, suspensions and waivers.

PLP recommends that the Department collects and publishes data on deductions and claimant characteristics (including protected characteristics).

PLP recommends that the Committee request further detail from the Department on what action it is taking to reduce official error overpayments.

PLP recommends that the Department clarify that official error may be a ground for waiver in its own right.