In the context of rushed bills,1 the hollowing out of treaty scrutiny,2 and the lowest ever percentage of Freedom of Information requests being fully granted in the first three months of the current Government,3 how can a ‘periscoping’ method help public law researchers and caseworkers deal with the government’s lack of transparency?

In a panel debate at this year’s Society of Legal Scholars conference, Public Law Project showed how this innovative method can be a helpful tool in tackling the growing transparency gap.4 


Harnessing Nancy Hiemstra’s method of ‘periscoping’,5 PLP has shed light on some of the interactions between people and the state that involve opaque decisions, policies or processes. Periscoping brings together different lenses and methods to challenge opacity. It involves ‘using various prisms and mirrors to refract and reproject emitted light’ and is a form of mixed methodology that specifically seeks to explore obscured state practices.6

Nancy Hiemstra argues that hard-to-access or intentionally obscured spaces, such as those involved in immigration enforcement, can never be completely ‘contained’ within government, allowing ‘flows and leaks’ to emerge from these spaces.7 By piecing together bits of information gained from applying different methods and lenses to an issue or a research site, ‘reflections and refractions’ emerge and allow for a fuller picture to be built up.8

To date, there are three key areas that illustrate how we’ve used this method: Universal Credit deductions, Government use of automated decision-making (ADM) and constitutional reforms by Boris Johnson’s Government. But what exactly has the periscope approach helped to reveal?

Universal Credit Deductions 

45% of Universal Credit (UC) claimants9 and 47% of foodbank users10 are subject to deductions: repayments of debt to the government and to third parties (such as utility companies and landlords). But we wouldn’t know this from the official statistics.

Instead, data on deductions is accessed via parliamentary questions, Freedom of Information Act (FOIA) requests and civil society research, which results in a chaotic medley of information (or lack thereof).11 Where data is scattered, a mix of research methods (metaphorical ‘prisms and mirrors’) can be used to analyse the data that does exist, and to fill the remaining gaps.

The lack of official statistics makes it difficult to demonstrate widespread problems: examples of systemic malfunctions can be easily dismissed as ‘anecdotal’. PLP’s ongoing research addresses this issue by using a quantitative survey of 500 UC claimants to learn about their awareness of measures designed to reduce the hardship of deductions (such as a reduced rate of recovery or a discretionary waiver)12 and their experiences of applying for them. Such evidence can support individual stories and existing statistics in making a case for change.

Conversations with welfare advisors and UC claimants provided examples of the Kafkaesque reality of dealing with government debt: for example, writing to your MP can often be quicker than calling the Debt Management service. They also opened new avenues for inquiry: learning about the high evidential burden placed on claimants applying for a waiver on medical grounds sparked the idea of talking to GPs who support such applications, and of looking into the consistency of the reasons given for rejections. Such conversations also give voice to strong personal – and often traumatic – experiences, which make the findings more relatable and human.


FOIA requests can reveal a lot of unknown nuances: for example, they show what percentage of overpayments recovered by Debt Management is caused by administrative error rather than by claimants’ mistakes (the majority, as it turns out),13 or what kind of debt is more likely to be waived. Interestingly, FOIA responses also help to establish what information is not held: for example, data on the percentage of people with protected characteristics experiencing deductions.14

Similarly, litigation can expose and address the gaps in publicly available information. When a judicial review case is brought, the Government is under specific disclosure requirements, enabling us to learn a lot about relevant policies through client representation. For example, K v SSWP15 revealed that the DWP had (unlawfully) not made a sufficient assessment of how the UC deductions regime would affect people with disabilities, and had failed to disclose its internal guidance. This led to the publication of that guidance and will, we hope, trigger a new impact assessment in the future.

Government use of automated decision-making (ADM) 

In recent years, there has been an increase in the use of artificial intelligence (AI) and algorithms in government decision-making.  

This isn’t inherently a problem, but ADM is not infallible. It carries a risk of bias and discrimination within its operation and application – as well as errors, both from standard statistical inaccuracies and systematic algorithmic flaws.16 PLP focuses on this because the use of automation is often opaque, which is a barrier to understanding whether the use of new technology – and the decisions it helps to make – are fair, lawful, and non-discriminatory.


Public authorities are often not forthcoming about their use of ADM: there is no Government publication of the technology used, or of how it supports decision-making. Furthermore, the law does not require public authorities to notify people when a decision that affects them was reached with the support of (or by) AI or an algorithm. We have found the willingness of Government departments to engage with requests for information to be inconsistent. Unfortunately, alternative options for learning about Government ADM systems are generally limited.

As with Universal Credit deductions, the information PLP holds on government use of ADM has been obtained through FOIA requests, findings obtained through the research of other civil society organisations and academics, network building and litigation. The concealed information about Government ADM use, whilst difficult to obtain, is not completely contained and these methods have led to occasional ‘flows and leaks’. 

This piecemeal information allows ‘reflections and refractions’ of public authority decision-making processes to emerge and, when put together, can help make once-opaque ADM systems and practices more transparent.

Over the past two years, this mixed research methodology has allowed us to piece together snippets of Government disclosure to build our ‘Tracking Automated Government’ (TAG) Register. The TAG Register is an online interactive database that tracks and logs known Government use of algorithms, automated tools, and databases.  

Although we know the 55 tools currently in the register are only the tip of the iceberg, the register demonstrates the value of mixed-methods research, non-traditional research methods, and engagement with a range of stakeholders. The entire database was constructed by PLP from information hard-won through FOIA requests, civil society research, network building and litigation.

The constitutional reform tracker 

In 2021, PLP launched the UK Constitutional Reform Tracker. At this time, the Government was making a series of important constitutional changes, including: changing the balance of power between Parliament, government and the judiciary; making it harder for people to access the courts to obtain accountability; and weakening human rights. This was happening not through a single programme but through a disparate series of proposals, such as the Judicial Review and Courts Bill, the Bill of Rights Bill, the Independent Review of Administrative Law (IRAL), and the Independent Human Rights Act Review. 

This made it difficult to keep track of what was going on, how the changes interrelated, and what the broader constitutional implications were. Taking a periscoping approach, PLP’s tracker did two things. The first was to obtain as full a picture as possible by drawing on as many data sources as we could, including government websites, court judgments, ministerial speeches, primary legislation, secondary legislation, parliamentary motions and debates, and public appointments. By the end of the project, PLP’s database contained 1300 entries describing reforms which changed our constitution.

The second was to make these events and the interconnections between them more understandable by coding each entry with tags or keywords. Users could search, for example, for all policies that had implications for judicial review, freedom of expression or national security, or which related to the ministerial code. This not only identified the constitutional reforms occurring but also helped researchers, the media, and the public understand their importance and implications. The tracker was a way of making 1300 entries and many different data sources understandable and coherent.

Periscoping in public law research 

The metaphor of Hiemstra’s periscope has proven useful for public law research that seeks to improve public decision-making and increase access to justice. By using a range of methods, tactics and networks, and involving both research and casework, Public Law Project has been able to highlight concerning practices – and work to challenge them.

The periscope was used historically during military operations to see one’s surroundings whilst remaining hidden. Researchers today do not need to hide from the government: on the contrary, they have to persistently request the information needed to understand government systems and processes. Sometimes they get the information quickly, sometimes it takes numerous reformulations of FOIA requests, and on other occasions their efforts are entirely fruitless.  

‘Periscoping’ engages a mixture of tools to skilfully circumvent this obscurity, combining newly uncovered information with data that is already available. In doing so, it can tackle the transparency gap that public law researchers and caseworkers increasingly face.




[4] In addition to the work outlined in this blog, Dr Emma Marshall from the University of Exeter also presented her work with Public Law Project on the Exceptional Case Funding scheme, which you can read more about here


[6] Ibid: 332.

[7] Ibid: 331. 

[8] Ibid: 330.



[11] Because of the lack of regular statistics releases, data on deductions is not organised in any way. For example, information on the scale of deductions is expressed in “claims” whilst information on their initial rates (provided in FOIA request responses) is expressed in “households”. The number of applications for the reduction in or suspension of the deductions is not captured. (FOIA Ref: FOI2023/24003 19 April 2023)


[13] Although according to the DWP Accounts 2022-2023 (p. 109) official error and claimant error overpayments both constitute 0.6% of the overall UC spending, overpayments loaded onto Debt Management seem to be mainly official error ones (around 75%), according to Magdalena Caley’s FOIA request, Ref: FOI2021/61616, received on 29 July

[14] FOIA request submitted by Samuel Willis, Ref: FOI2022/58483, received on 27 July

[15] R (K) v Secretary of State for Work and Pensions [2023] EWHC 233 (Admin)

[16] Osonde Osoba and William Welser, An Intelligence in Our Image: The Risks of Bias and Errors in Artificial Intelligence (RAND Corporation 2017) 4, accessed 13 April 2023.