A Levels, Algorithms and GDPR by Nicola Hoskins

Wed 19th Aug 2020

1. The day of A level results is not one that arrives without great trepidation. In 2020, the year of no actual A level exams, it’s fair to say that the trepidation went up a gear. And so the day came, and the theme that emerged was the downgrading of CAGs, or ‘Centre Assessed Grades’, by an algorithm, to the tune of almost 40% of results. Unsurprisingly, there arose a swift reaction and a great deal of anguish.

2. The situation has been rectified to an extent by a policy u-turn announced on Monday afternoon that effectively reinstated CAGs, but it is worth remembering that this applies to A levels only. Some schools and many FE colleges – including those aimed at mature students – instead offer BTEC courses as the gateway to university. At the time of writing, the final awards for these qualifications were similarly affected by the algorithm, but no remedy has yet been proposed.

3. Whilst there are grounds of appeal, they are in truth very narrow. In all cases, it is the centre’s (school’s or college’s) appeal rather than that of the student, and it must be shown that:
a) There has been an administrative error, or
b) There is evidence that the current year group could have been expected to get better results than previous year groups, or
c) There is a ‘valid’ mock result higher than the issued grade.

4. What constitutes a ‘valid’ mock result is not defined, and nor is the process for such an appeal. Centres would submit that a mock is above all a progress check, and mocks may be taken at different points in the year in different centres. Not all centres use formal mocks. Centres also often apply higher grade boundaries in mocks in order to motivate students, which means the marking may not comply with whatever the appeal requirements turn out to be – and it is not clear that it should. None of these factors was within the control of the affected students, and none could have been foreseen by any centre.

5. But perhaps the issue that raises the most eyebrows relates to the second limb – the fact that the algorithm took into account historical performance at the school, i.e. the modified grade was not based solely on an assessment of the student’s own previous performance or CAG. The algorithm also went further, because centres were required to produce a ‘rank’ order for the students in particular subjects, and unless the number of students taking the subject was particularly large, there could be no joint rankings. Students were therefore affected both by the performance of previous cohorts and by the centre’s potentially arbitrary ranking of what were frequently equally competent individuals within a group.
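The mechanism can be sketched in a few lines. The sketch below is a deliberately simplified, hypothetical model (the names, distribution and function are illustrative only – Ofqual’s actual standardisation also adjusted for prior attainment and cohort size), but it shows the essential point: the centre’s historical grade distribution, combined with a strict rank order, determines each student’s grade irrespective of their individual work.

```python
def standardise(rank_order, historical_distribution):
    """Assign grades by reading each student's rank percentile off the
    centre's historical grade distribution.

    rank_order: student names, best first.
    historical_distribution: grade -> fraction of past students at that
    grade, best grade first, e.g. {"A": 0.25, "B": 0.5, "C": 0.25}.
    """
    n = len(rank_order)
    grades = {}
    # Build cumulative percentile boundaries: (upper bound, grade).
    cumulative = 0.0
    boundaries = []
    for grade, share in historical_distribution.items():
        cumulative += share
        boundaries.append((cumulative, grade))
    for i, student in enumerate(rank_order):
        percentile = (i + 0.5) / n  # mid-rank percentile, 0 = top
        for upper, grade in boundaries:
            if percentile <= upper:
                grades[student] = grade
                break
    return grades

cohort = ["Asha", "Ben", "Cara", "Dev"]        # hypothetical rank order
history = {"A": 0.25, "B": 0.5, "C": 0.25}     # hypothetical centre history
print(standardise(cohort, history))
# {'Asha': 'A', 'Ben': 'B', 'Cara': 'B', 'Dev': 'C'}
```

Note that nothing about any individual student enters the calculation except their position in the rank order: a strong cohort at a historically weak centre is capped by the centre’s past results.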

6. Aside from questions of unfairness, irrationality and ultra vires – the operation of the algorithm in practice does not appear to sit well with previous commitments made – there are legitimate GDPR concerns in relation to the approach taken.

7. Modifying CAGs using an algorithm is a data profiling exercise, defined in Article 4(4) GDPR as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person”. This sits alongside the general duties in Article 5 GDPR, namely the duties to process personal data accurately and fairly.

8. But the key provision is found in Article 22: this sets out a prohibition on decisions “based solely” on automated processing, including profiling, which produce legal or similarly significant effects. Recital 75 expands on what this may mean, highlighting risks to rights and freedoms arising from profiling, including where profiling leads to discrimination, loss, or significant economic or social disadvantage.

9. There are exceptions to the prohibition: where the profiling is (a) necessary for entering into, or the performance of, a contract, (b) authorised by law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and legitimate interests, or (c) based on the explicit consent of the data subject.

10. The government’s own previously published guidance suggested that the operation of the algorithm was only part of the profiling operation, and that because human intervention followed – a final review by the centres, who could highlight where further scrutiny should be applied – the decision was not “based solely” on the algorithm. That approach appears to have been abandoned, given that it is now clear that neither the centres nor any other human being at the end of the process had the power to apply any other factor to produce the final outcome.

11. Instead, notwithstanding the policy u-turn, it now seems that the approach is to comply with Article 22(3) and install an appeals process. However, as stated above, this appears to be limited to the centre rather than the individual, and in such circumstances it could produce the result that, if one student were elevated in the rank order, another would have to go down. It is hard to see how the effect of this is not simply to compound the unfairness. It is also the case that the evidential burden of showing that this year’s group of students was different to previous ones is likely to be nigh on impossible to meet, save for situations where something very drastic has happened in the centre, such as a merger.
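The zero-sum nature of a rank-order appeal can be seen in a toy example (the names and grades are hypothetical): because grades attach to fixed positions in the rank order rather than to the students themselves, elevating one student necessarily demotes another.

```python
# Hypothetical four-student cohort: grades belong to fixed rank
# positions, not to the students who occupy them.
positions = ["A", "B", "B", "C"]          # the centre's fixed grade allocation
before = ["Asha", "Ben", "Cara", "Dev"]   # original rank order

# Suppose an appeal elevates Dev above Cara in the rank order:
after = ["Asha", "Ben", "Dev", "Cara"]

print(dict(zip(before, positions)))  # {'Asha': 'A', 'Ben': 'B', 'Cara': 'B', 'Dev': 'C'}
print(dict(zip(after, positions)))   # Dev gains a B, but Cara drops to a C
```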

12. It also appears to be the case that no Data Protection Impact Assessment, as required by Article 35, took place prior to the profiling. These assessments are supposed to inform decision making in relation to projects such as this which have a large and significant impact on individuals.

13. For completeness, it is worth noting that profiling happens frequently in many areas of life. Until recently, it was deployed in Home Office decision making: that process came under legal challenge. Insurance claim assessment is another area, where an algorithm may decide liability, although this is based on explicit consent and is hence Article 22 compliant. The most obvious day-to-day example is in relation to financial contracts and credit scoring, but safeguards such as the requirement to notify an applicant of the specific credit reference agency used when there is a refusal based on that information, and the right to file a statement of correction on that agency’s records, appear in the Consumer Credit Act 1974. These sit alongside the rights of access to data and rectification in the GDPR.

14. In closing, it is to be hoped that the situation facing those still affected will be resolved quickly, either by a similar policy u-turn or by installing an appeals process that is quick, fair and available to any aggrieved student as an individual data subject. It may also be that the debacle will have highlighted the issues around profiling, and shown how a failure to comply with the GDPR, particularly Articles 22 and 35, can result in a deeply flawed outcome that is open to legal challenge as well as damaging to reputations.
