Preventive processing of biometric data: An unanswered question

June 28, 2021
Spain

It is no longer surprising that technologies can be used to find us and trace our steps, our location or our day-to-day activities. Everyday actions, such as walking into our office, validating our ticket before boarding a public bus, or simply using our mobile devices, make it possible to locate us. The spread of connected devices, together with the increasing sophistication and decreasing cost of accessing this information, has created the need to set limits on the use of location technologies, which are both invasive and a threat to our privacy.

Clearly, we must set some limits on location technologies. However, the matter becomes particularly complicated when dealing with uses aimed at (i) preventing legal infringements or (ii) flagging imminent threats. In other words, the development of personal tracking systems raises the question of how these technologies may be used regarding persons who must stay away from other individuals or certain places. Ideally, we should strike a fair balance between preventing these violations or risks and safeguarding the rights of those being tracked.

The Criminal Chamber of the Barcelona Provincial Court (the “Provincial Court”) decided on this matter in Order 72/2021 of February 15 (the “Order”). The Order set out that biometric data collection systems cannot be used to identify individuals banned by a court order from entering the premises of a well-known supermarket chain (the “Plaintiff” or the “Company”). Specifically, the Provincial Court concluded that the exceptions provided in article 9(2) of the General Data Protection Regulation (“GDPR”) for the processing of special categories of personal data were not met, adding that the Company’s private interest should not prevail over the general interest of the data subjects. Below is a summary of the Order.

Background:

Following a robbery at one of the Plaintiff’s supermarkets, a criminal court in Barcelona banned the offenders from entering the supermarket for two years.

To enforce the judgement, the Company requested authorization from the court to use electronic facial recognition technologies to identify the offenders and prevent them from entering the supermarket. This facial recognition system captures the faces of all customers and identifies, within 0.3 seconds, those who are banned from the supermarket.

In its request, the Company argued that (i) there was no way to enforce the judgement through traditional means, i.e., employees identifying the offenders; and (ii) its request was justified on grounds of public interest and the Company’s legitimate interest in enforcing court decisions issued in proceedings where it was the injured party.

At first instance, Barcelona Criminal Court No. 24 rejected the request.

Appeal:

The Company appealed against the court’s rejection, raising the following arguments:

  • Although the GDPR provides that “biometric data for the purpose of uniquely identifying a natural person” are a special category of data, they can still be used provided appropriate security measures are adopted.
  • In this case, the requested measure does not violate the right to the protection of personal data. Although the system processes biometric data from all customers, it instantly identifies those who have been banned from the supermarket. Therefore, the system does not store biometric data of persons who have not been found guilty of a crime.
  • In the GDPR framework, the lawmaker’s intent is not only to protect personal data, but also to allow for the free movement of personal data in line with technological development.
  • The measure is appropriate, necessary and proportionate: (i) it is appropriate, since it identifies individuals who have been banned from the supermarket by a court decision; (ii) it is necessary, since it is the only measure that addresses and solves the problem, the previous measures having been unsuccessful; and (iii) it is proportionate, because its benefits for the public interest outweigh the costs for the individuals concerned, since there is no general processing of personal biometric data.

The Provincial Court’s decision:

The Provincial Court dismissed these arguments and upheld the Barcelona Criminal Court’s decision, rejecting the requested measure for the following reasons:

a) The measure involves the processing of biometric data:

In this case, the Provincial Court confirmed that the use of facial recognition technologies in video surveillance systems for private security involves the processing of biometric data aimed at uniquely identifying an individual.

The Provincial Court referred to report 36/2020 of the Spanish Data Protection Agency (“AEPD”). While acknowledging that this is a complex matter open to interpretation, the report states that, generally, “biometric data will only qualify as a special category of data if they are subject to technical processing for biometric identification (one-to-many) and not in case of biometric verification/authentication (one-to-one).” To summarize:

  • “Biometric identification” involves the comparison of a person’s facial image against many other templates stored in a database.
  • “Biometric verification/authentication” occurs when specific templates are matched against each other, i.e., when biometric templates expected to belong to the same person are compared to one another.
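
To illustrate the distinction in the abstract (the Order does not describe the Company’s system at this level of technical detail), the minimal Python sketch below assumes that biometric templates are represented as numeric feature vectors and uses a hypothetical similarity threshold; it is not a description of any real facial recognition product:

```python
# Illustrative sketch only: templates, the similarity metric and the threshold
# are assumptions, not details taken from the Order.
from typing import Dict, List, Optional

Template = List[float]  # a biometric template modelled as a feature vector

THRESHOLD = 0.9  # hypothetical decision threshold


def similarity(a: Template, b: Template) -> float:
    """Cosine similarity between two templates (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def identify(probe: Template, database: Dict[str, Template]) -> Optional[str]:
    """Biometric identification (one-to-many): compare the captured template
    against every template stored in a database and return the best match."""
    best_id, best_score = None, 0.0
    for person_id, stored in database.items():
        score = similarity(probe, stored)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= THRESHOLD else None


def verify(probe: Template, claimed: Template) -> bool:
    """Biometric verification/authentication (one-to-one): check whether two
    templates expected to belong to the same person actually match."""
    return similarity(probe, claimed) >= THRESHOLD
```

Under the AEPD’s reading summarized above, it is the one-to-many comparison against a database (the identify pattern) that generally qualifies as processing a special category of data, whereas the one-to-one check (the verify pattern) generally does not.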

In this case, the Provincial Court confirmed that the processing is a biometric identification and not a mere authentication. Therefore, this processing is prohibited under article 9(1) GDPR, and it will only be lawful if any of the circumstances provided in article 9(2) occur.

b) The data processing does not have a legitimate basis:

Under these circumstances, the Provincial Court found that the data processing lacks a legitimate basis:

  • To date, the Plaintiff has not requested the data subjects’ express consent to process their biometric data or to store images of their faces in a database for these purposes. Moreover, considering the uneven bargaining power between the Company and its customers, customers could hardly give valid consent under these circumstances.
  • In the absence of the data subjects’ express consent, the processing must have another sufficiently solid legitimate basis. However, contrary to the Company’s arguments, the Provincial Court found that there is no “public interest,” arguing that the measure is rather aimed at safeguarding the Company’s private interest, e.g., ensuring the security of its facilities.
  • Quoting the AEPD, the Provincial Court states that, for the processing of special categories of data, the law requires “an essential public interest” provided for in a statutory provision that currently does not exist in Spanish law. Therefore, the applicable framework does not suffice to regulate this type of processing: lawmakers must adopt a provision determining when this essential public interest exists and regulating the legitimacy requirements for biometric data processing.

Based on the above, the Provincial Court dismissed the appeal and rejected the requested processing of biometric data. The measure is neither proportionate, necessary nor appropriate, and applying it would violate the rights and freedoms not only of the offenders but also of all the supermarket’s customers.

Authors: Ainhoa Rey and Albert Agustinoy
