In August 2019, the Swedish DPA fined Skellefteå Municipality, Secondary Education Board 200 000 SEK (approximately 20 000 euros) pursuant to the General Data Protection Regulation (EU) 2016/679 for using facial recognition technology to monitor the attendance of school children.
The Authority has now made a 14-page English translation of the decision available for download on its website.
The facial recognition trial compared images from camera surveillance with pre-registered images of each child’s face, and processed first and last names.
In the preamble, the decision recognised that the General Data Protection Regulation does not contain any derogations for pilot or trial activities.
In summary, the Authority concluded that by using facial recognition via camera to monitor school children’s attendance, the Secondary Education Board (Gymnasienämnden) in the municipality of Skellefteå (Skellefteå kommun) processed personal data that was unnecessary, excessively invasive, and unlawful with regard to:
- Article 5 of the General Data Protection Regulation, by processing personal data in a manner that was more intrusive than necessary and encompassed more personal data than was necessary for the specified purpose (monitoring of attendance);
- Article 9, by processing special category personal data (biometric data) without a valid derogation from the prohibition on the processing of special categories of personal data;
- Articles 35 and 36, by failing to fulfil the requirements for a data protection impact assessment and failing to carry out prior consultation with the Swedish Data Protection Authority.
Perhaps the most significant part of the decision is that it is the first officially documented recognition, in education data processing under the GDPR, that consent fails even though explicit guardians’ consent was requested and opting out was possible. The decision recognised that this was the processing of children’s personal data in a disempowered relationship and environment.
It makes the assessment that consent was not freely given. It is widely recognised that consent cannot be a tick-box exercise, and that any choice must be informed. However, little attention has yet been given in GDPR circles to the power imbalance of relationships, especially for children.
The decision recognised that the relationship that exists between the data subject and the controller, namely the balance of power, is significant in assessing whether a genuine choice exists, and whether or not it can be freely given without detriment. The scope for voluntary consent within the public sphere is limited:
“As regards the school sector, it is clear that the students are in a position of dependence with respect to the school …”
The Education Board had said that consent was the basis for the processing of the facial recognition in attendance monitoring.
With the Data Protection Authority’s assessment that the consent was invalid, the lawful basis for processing fell away.
The importance of necessity
The basis for processing was consent under Article 6(1)(a), not Article 6(1)(e) (‘necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller’), as the ground for processing special category [sensitive] personal data.
However, the same test of necessity was also important in this case. Recital 39 of the GDPR requires that personal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means.
The Swedish Data Protection Authority recognised and noted that, while there is a legal basis for administering student attendance at school, there is no explicit legal basis for performing the task through the processing of special categories of personal data or in any other manner which entails a greater invasion of privacy. Put simply, taking the register via facial recognition did not meet the data protection test of being necessary and proportionate. Less privacy-invasive alternatives are available, and on balance, the rights of the individual outweigh those of the data controller.
While some additional considerations were made under local Swedish data protection law (the Data Protection Act, prop. 2017/18:105 Ny dataskyddslag), even those exceptional provisions were not intended to be applied routinely to everyday tasks.
Considering rights by design
The decision refers to the document provided by the school board, Skellefteå kommun – Framtidens klassrum (Skellefteå municipality – The classroom of the future). The appendix (p. 5) notes that one advantage of facial recognition is that it is easy to register a large group, such as a class, in bulk. The disadvantages mentioned include that it is a technically advanced solution which requires a relatively large number of images of each individual, that the camera must have a free line of sight to all students who are present, and that any headdress or shawl may cause the identification process to fail.
The Board did not submit its data protection impact assessment to the Authority for prior consultation under Article 36. The Authority considered that a number of factors indicated that the processing operations posed a high risk to the rights and freedoms of the individuals concerned, but that these were inadequately addressed, and that the Board had failed to assess the proportionality of the processing in relation to its purposes.
For example, the processing operations involved
a) the use of new technology,
b) special categories of personal data, and
c) a power imbalance between the parties.
As the risk assessment submitted by the Board did not demonstrate an assessment of the relevant risks to the rights and freedoms of the data subjects [and their mitigations], the decision noted that the high risks, for the purposes of Article 36, had not been reduced.
What’s next for the UK
The Swedish Data Protection Authority identifies some important points in perhaps the first significant GDPR ruling in the education sector so far, and much will apply to school data processing in the UK.
What may surprise some is that this decision was not about the distribution of the data, since the data was stored on a local computer without any internet connection. It was not about security, since the computer was kept in a locked cupboard. It was about the fundamentals of basic data protection and children’s rights to privacy in the school environment, under the law.
Processing must meet the tests of necessity. Necessary is not defined by a lay test of convenience.
Processing must be lawful. Consent is rarely going to offer a lawful basis for routine processing in schools; in particular, given the risks to the rights and freedoms of the child when processing biometric data, consent fails to offer satisfactory and adequate lawful grounds for processing, due to the power imbalance.
Data should be accurate, be only the minimum necessary and proportionate, and respect the fundamental rights of the child.
The Swedish DPA fined Skellefteå Municipality’s Secondary Education Board 200 000 SEK (approximately 20 000 euros). Under Article 83(1) of the General Data Protection Regulation, supervisory authorities must ensure that the imposition of administrative fines is effective, proportionate and dissuasive; in this case, the fine is designed to end the infringing processing.
The GDPR, like the data protection law that preceded it, offers a route for data controllers and processors to understand what is lawful, and it demands accountability: they must be able to demonstrate that their processing is.
Whether children in the UK will find that it affords them their due protections now depends on enforcement such as this case.