Best Use of Law

Foxglove & Curtis Parfitt-Ford unfair A-Level grading algorithm 

Thousands of students nationwide risked losing university places because of algorithmic grades.

The Campaign

In the summer of 2020, students across the country were prevented from sitting their exams by Covid-19. Ofqual, the UK’s exam regulator, decided an algorithm should determine their grades instead. This resulted in 40% of students’ results being downgraded, and particularly disadvantaged pupils attending schools in less affluent areas. Thousands of students nationwide risked losing university places because of algorithmic grades.

Foxglove teamed up with an 18-year-old student called Curtis Parfitt-Ford, supporting him to bring a judicial review challenging the use of this algorithm. Curtis’s case argued that Ofqual’s algorithm was unlawful: that Ofqual had acted beyond its statutory powers, violated data protection law, and treated students unfairly.

Together, Foxglove and Curtis launched a petition signed by over 250,000 people and a crowdfunder to cover his legal costs.

The case, and the national outcry, ultimately led to a Government U-turn. The algorithm was scrapped and students across the country were finally given their teacher-assessed grades.


The Change

The case made front-page news and was featured on LBC, the Today Programme, the Guardian, BBC News, the Telegraph and more. But Curtis and Foxglove didn’t do this alone: students protested in the streets amid national outcry.

Curtis’s case and the A-Level grading fiasco sparked a much-needed national debate about the use of opaque algorithms in public life: not simply how they are used, but whether they should be used at all.

In the future, say Foxglove, consequential algorithmic systems must be designed and built in a way that is democratically acceptable, and which does not cause chaos in thousands of lives. 


The Future

While this case is over, and it looks like the Government won’t be proposing grades by algorithm in summer 2021, algorithms are being used in the public sector more and more, often with harmful consequences. Foxglove continues to fight to make UK public bodies use data in a way that is open, fair, and legal.

Foxglove’s other work challenging unfair algorithms includes the UK’s first successful judicial review of an algorithmic system: a challenge, brought with partners the Joint Council for the Welfare of Immigrants, to a discriminatory Home Office ‘visa streaming algorithm’.

Foxglove is also focusing on transparency, alongside openDemocracy, around secret contracts entered into by the Government and big technology companies, including Palantir and Amazon, for the ‘NHS Covid-19 datastore’.

Who else was involved?

This case and campaign would not have happened without the brilliant Curtis Parfitt-Ford. Curtis was the A-Level student who brought this legal challenge out of deep concern that the grading algorithm would further ingrain inequality in education, treating him and his peers, especially those at disadvantaged schools, unfairly.
Curtis’s legal team were David Wolfe QC at Matrix Chambers, Estelle Dehon at Cornerstone Barristers, and Ciar McAndrew at Monckton Chambers.    

The 250,000 people who signed Curtis’s petition and the thousands of people who donated to his crowdfunding campaign.

Teachers up and down the country who worked so hard to produce accurate grades for their pupils.