A dystopian story about COVID-19, Artificial Intelligence setting grades and the GDPR

This contribution sheds light on serious data protection concerns arising from the use of Artificial Intelligence (AI) to set the grades of thousands of students whose exams were cancelled due to the COVID-19 outbreak. As demonstrated below, the pandemic challenged the protection of personal data in numerous and sometimes unexpected ways.

Author: Malgorzata Agnieszka Cyndecka


The COVID-19 response must comply with the GDPR

The COVID-19 outbreak has had an unprecedented impact on probably every single aspect of our lives. This includes the right to privacy and data protection. In this respect, a widely discussed issue was the use of location data and contact tracing tools to fight the pandemic. While some EU/EEA Member States encouraged their citizens to use mobile applications to support contact tracing, not all such data-driven solutions complied with the principles governing the processing of personal data as required by the GDPR. This concerned, among others, the principles of purpose limitation, data minimisation and storage limitation. One of the COVID-19 contact tracing apps that were suspended due to data protection concerns was the Norwegian “Smittestopp”. To explain how an effective response to the pandemic may be achieved without compromising data protection standards, the European Data Protection Board (EDPB) issued Guidelines on the use of location data and contact tracing tools in the context of the COVID-19 outbreak.

Another set of specific COVID-19 related guidelines was issued to address the processing of health data for the purpose of scientific research in the fight against the coronavirus. An efficient response to the pandemic requires undisturbed international cooperation, which often implies transfers of health data outside the EU/EEA. As data concerning health qualifies as "sensitive" or "special category" data, the GDPR imposes strict requirements on international transfers of such data. In essence, in the absence of an adequacy decision pursuant to Article 45(3) GDPR or appropriate safeguards pursuant to Article 46 GDPR, public authorities and private entities may rely upon the derogations in Article 49 GDPR. However, those derogations are of an exceptional character.

Both sets of guidelines confirm that the GDPR does not hinder measures taken in the fight against the pandemic, but provides safeguards that protect fundamental human rights and freedoms.

“Your exams were cancelled due to the COVID-19 outbreak”

This post addresses a less expected, but highly controversial, data protection issue triggered by the COVID-19 outbreak, namely the use of AI to set grades. Following a decision to cancel the spring/summer examinations, thousands of students had their exam results awarded through an algorithm. In the EU/EEA, the best-known examples are the “awarding model” applied by the Switzerland-based International Baccalaureate Organisation (IBO), which offers its curriculum in 150 countries around the world, and the algorithm used by the UK official exam regulator, Ofqual, with respect to AS, A-level and GCSE exams.

While the decision to cancel the exams was understandable, the exam results and the manner in which they were decided led to an unprecedented outcry of a global dimension. When A-level grades were announced in England, Wales and Northern Ireland on 13 August, nearly 40% were lower than teachers' assessments. There were similar issues in Scotland. What is more, the downgrading affected state schools and students from the most deprived backgrounds much more than the private sector and the wealthiest students. As Ofqual’s algorithm shows clear signs of bias, which led to unfair and discriminatory exam results, it is currently under review by the UK national statistics regulator, the Office for Statistics Regulation.

As regards the IBO, in early July thousands of the more than 170,000 students who received their results found that their final grades deviated substantially from the grades predicted by their teachers. As a result, many students failed to meet university entry requirements. Importantly, neither the IBO nor Ofqual has properly explained how the final grades were awarded.

Following the unprecedented uproar, Ofqual’s infamous algorithm was abandoned in favour of teachers’ estimates, unless the algorithm had produced a higher grade. Apparently, the IBO proposed a similar solution. It remains to be seen, however, whether this will help the affected students to secure a place at their chosen university. What is indisputable is that the students’ rights under the GDPR have been violated. In this respect, this contribution focuses on the IBO’s “awarding model”.

“Your grades will be set by our awarding model”

According to the IBO, the “awarding model” had three components: 1) student coursework, 2) teacher-delivered predicted grades and 3) school context. The third factor was “based on historical prediction data, and the same school factor was applied to every student in that school for that subject and level.” In other words, the students were partly assessed on the basis of other students’ results from past years (“the school’s own record”).
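
The IBO has not disclosed how the three components were weighted or combined (see below). Purely for illustration, the following minimal Python sketch shows how a single school-level factor derived from historical data could shift every grade in a cohort; the weights, the multiplicative school_factor and the grade boundaries are assumptions made for the sake of the example, not the IBO’s actual method.

# Hypothetical sketch only: the IBO has not published its formula.
# The weights, the school-level adjustment and the grade boundaries
# below are illustrative assumptions, not the actual "awarding model".

GRADE_BOUNDARIES = [(6.5, 7), (5.5, 6), (4.5, 5), (3.5, 4), (2.5, 3), (1.5, 2)]

def award_grade(coursework_mark: float,
                teacher_predicted_grade: float,
                school_factor: float) -> int:
    """Combine the three components described by the IBO: coursework,
    a teacher-predicted grade and a school-level factor derived from
    historical prediction data, applied identically to every student
    in that school for that subject and level."""
    raw_score = 0.5 * coursework_mark + 0.5 * teacher_predicted_grade  # assumed weights
    adjusted = raw_score * school_factor  # same adjustment for the whole cohort
    for boundary, grade in GRADE_BOUNDARIES:
        if adjusted >= boundary:
            return grade
    return 1

# A school factor below 1.0 downgrades an entire cohort:
print(award_grade(coursework_mark=6.0, teacher_predicted_grade=6.0, school_factor=1.0))  # 6
print(award_grade(coursework_mark=6.0, teacher_predicted_grade=6.0, school_factor=0.9))  # 5

Even in such a crude sketch the problem is apparent: the school-level factor is computed from other students’ past results, yet it can move an individual student’s final grade across a boundary.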

As explained below, the application of the “awarding model” and, in particular, its third component are highly problematic under the GDPR.

The material and territorial scope of the GDPR

As provided in Article 2(1) GDPR, the Regulation applies to the processing of “personal data”, i.e. “any information relating to an identified or identifiable natural person”, the “data subject”. The notion of “personal data” is defined in Article 4(1) GDPR. The students’ coursework and grades qualify as “personal data”. They are processed by the IBO, which acts as “controller” since it determines the purposes and means of the processing of the personal data in question (see Article 4(7) GDPR). Under the GDPR, the IBO is responsible for the processing of the students’ personal data.

Importantly, the GDPR requires that every processing of personal data has a legal basis (Articles 5(1) and 6 GDPR). Interestingly, the IBO processed the personal data not only of those students who graduated this year, but also of those who took exams in previous years. Their results determined the “school context”. It remains to be seen what legal ground the IBO relied on in this respect. Moreover, should the IBO argue that it used anonymised data, it should be recalled that “anonymisation” itself qualifies as processing of personal data under the GDPR. This issue was clarified by the Article 29 Working Party in its Opinion on Anonymisation Techniques of 2014. Although the Opinion was issued under the Data Protection Directive of 1995, it remains valid as the definition of “processing” in Article 4(2) GDPR is essentially the same as that in Article 2(b) of the Directive.

Still, given that the IBO has its headquarters in Switzerland, why should it comply with the GDPR? As stipulated in Article 3, the GDPR applies to the processing of personal data of data subjects who are in the EU/EEA by a controller not established in the EU/EEA, where the processing activities are related to the offering of goods or services to such data subjects in the EU/EEA. As regards the students taking the exams in the EU/EEA, the IBO must thus comply with the GDPR.

Was the IBO profiling the students?

Given the information on the “awarding model”, its application amounted to “profiling” the students. As provided in Article 4(4) GDPR, “profiling” means “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.” Given the potential consequences of profiling, the GDPR sets special requirements for such processing of personal data. These concern, among others, transparency and fairness.

Moreover, profiling that is based on automated decision-making within the meaning of Article 22 GDPR is, as a rule, prohibited. Admittedly, there are three exceptions to this rule. An automated decision must be: 1) necessary for the performance of, or entering into, a contract; 2) authorised by EU or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or 3) based on the data subject’s explicit consent. Yet, even if one of the exceptions applies, the GDPR requires measures to be in place to safeguard the data subject’s rights and freedoms and legitimate interests. Such measures should include, as a minimum, a way for the students to obtain human intervention, express their point of view and contest the decision. Moreover, the data subjects have the right to be informed about being subject to automated decision-making, to receive meaningful information about the logic involved and an explanation of the significance and envisaged consequences of the processing.

Notably, Article 22 GDPR concerns decisions “based solely” on automated processing, that is, decisions made without “meaningful human involvement”. If a human being reviews and takes account of other factors in making the final decision, that decision would not be “based solely” on automated processing. Yet, the mere fact that the input factors involve humans (such as the teacher-predicted grades in the IBO’s model) does not exclude the applicability of Article 22: the automated processing consists in calculating the result on the basis of the relevant input.

As one may point out, it was only after the Norwegian Data Protection Authority asked the IBO to provide explanations concerning the “awarding model” in the context of alleged profiling that the IBO elaborated on the teachers’ role in awarding the exam results. Importantly, profiling may include human involvement. Thus, the teachers’ involvement does not change the fact that the IBO was profiling the students.

The violation of the principles of fairness, transparency and accuracy

Following a closer examination of the IBO’s “awarding model”, one must conclude that its application infringed three principles governing the processing of personal data as stipulated in Article 5 GDPR: the principles of fairness, transparency and accuracy.

Under Article 5(1)(a) GDPR, personal data is to be processed fairly. In this respect, the data subject must receive sufficient information. Fairness also depends on whether the processing corresponds with the expectations of the data subjects. In the case of profiling, fairness requires that risks such as discriminatory effects or significant economic or social disadvantage are prevented. In the case of the IBO, the lack of information on how the three factors were weighted, the evaluation of students on the basis of other students’ results and the unexpected disparities between the anticipated and final grades indicate that the principle of fairness was infringed.

As regards the principle of transparency provided for in Article 5(1)(a) GDPR, apart from receiving sufficient information, it is particularly important that data subjects are not taken by surprise at a later point about the ways in which their personal data have been used. Given the reactions of students around the world, the manner in which the model was constructed was definitely neither expected nor comprehensible.

As stipulated in Article 5(1)(d) GDPR, personal data shall be accurate and, where necessary, kept up to date. As confirmed by the CJEU in the Nowak case, the principle of accuracy also applies to the assessment of students’ academic work. Moreover, as provided in Recital 71, “the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or that result in measures having such an effect.” In this respect, the “school context” as one of the three factors opened the door to risks of discrimination and biased results that may have serious consequences for the students’ future education and job opportunities.

What’s the state of play?

The IBO has clearly infringed the GDPR provisions. Pursuant to Article 83(5) GDPR, infringements of the principles governing the processing of personal data (Articles 5, 6, 7 and 9 GDPR) or of the data subjects’ rights (Articles 12-22 GDPR) may result in a fine of up to €20 million or 4% of the firm’s total worldwide annual turnover from the preceding financial year, whichever amount is higher. To the author’s knowledge, however, only the Norwegian Data Protection Authority has thus far requested information from the IBO. It has also notified the IBO of its order to rectify unfairly processed and incorrect personal data. Yet, given the information on complaints lodged with other EU/EEA data protection authorities, this is definitely not the last time we hear about the IBO. One may also expect class actions under the GDPR. In any case, the use of the 2020 “awarding model” has not only damaged the IBO’s reputation, but may also prove quite costly.
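
To illustrate the “whichever is higher” mechanism only, here is a minimal sketch using purely hypothetical turnover figures; nothing below reflects the IBO’s actual finances.

# Illustrates the Article 83(5) GDPR cap only; the turnover figures are hypothetical.
def max_fine_eur(worldwide_annual_turnover_eur: float) -> float:
    """Upper limit of an Article 83(5) fine: EUR 20 million or
    4% of total worldwide annual turnover, whichever is higher."""
    return max(20_000_000, 0.04 * worldwide_annual_turnover_eur)

print(max_fine_eur(300_000_000))  # 4% = EUR 12 million, so the cap is EUR 20 million
print(max_fine_eur(600_000_000))  # 4% = EUR 24 million, so the cap is EUR 24 million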

Most importantly, the IBO case is yet another example of how the use of algorithms and profiling may affect our lives. It has also proved how important it is to have proper data protection legislation in place.

 

Author

Malgorzata Agnieszka Cyndecka, Associate Professor, Faculty of Law, University of Bergen

 

How to cite

Cyndecka, Malgorzata Agnieszka (2020): A dystopian story about COVID-19, Artificial Intelligence setting grades and the GDPR. Blog. EFTA-Studies.org.
