Conference organizers rushing to impose facial recognition on attendees in Europe without due diligence on data protection risks, beware: the organizers of the global connectivity industry shindig, Mobile World Congress (MWC), which takes place annually in Barcelona, have been fined €200,000 (~$224,000) by Spain’s data protection watchdog for breaching privacy rules during the 2021 edition.
In an 8-page decision (PDF in Spanish) rejecting an appeal by MWC’s organizer, the GSMA, against the breach finding, the Agencia Española de Protección de Datos (AEPD) concludes that it violated Article 35 of the General Data Protection Regulation (GDPR), which sets out the requirements for conducting a data protection impact assessment (DPIA).
The breach finding relates to the GSMA’s collection of biometric data on visitors to the show, including for a facial recognition system it deployed (called BREEZZ), which let visitors use automated identity verification to access the venue instead of manually showing their ID documents to staff.
Cast your mind back to 2021 and you’ll remember the mobile industry event took place at a time when COVID-19 pandemic-related concerns about attending in-person events were still running high. Not that that stopped MWC’s organizer from holding a physical conference in the summer of that year, months later than the show’s usual slot and in a heavily slimmed-down format, with far fewer exhibitors and attendees than in years past.
In fact, fewer than 20,000 registered attendees went to MWC 2021 in person (17,462 to be exact), according to GSMA disclosures to the AEPD, and of those, only 7,585 actually used the BREEZZ facial recognition system to access the venue. The majority apparently opted for the alternative of manually showing their identity documents. (Since MWC 2021 took place in the middle of the pandemic, the GSMA also offered virtual participation, streaming conference sessions to remote viewers, and no ID checks were required for that kind of attendance.)
Coming back to the GDPR, the regulation requires a DPIA to be proactively conducted in situations where the processing of people’s data poses a high risk to individuals’ rights and freedoms. Facial recognition technology, meanwhile, involves the processing of biometric data, which, when used to identify individuals, is classified as special category data under the GDPR. This means that the use of biometrics for identification inevitably falls into the high-risk category that requires proactive assessment.
This assessment should take into account the necessity and proportionality of the proposed processing, examine the risks, and detail the measures envisaged to address the identified risks. The GDPR puts the emphasis on data controllers conducting a robust and rigorous proactive assessment of high-risk processing. The AEPD’s finding that the GSMA violated Article 35 therefore indicates it failed to demonstrate due diligence in this regard.
In fact, the regulator found the GSMA’s DPIA to be “only nominal,” per the resolution, saying it failed to examine the “substantial aspects” of the data processing and did not assess the risks or the proportionality and necessity of the system it implemented.
“What the resolution concludes is that a (DPIA) that does not take into account the essential elements is ineffective and does not fulfill any purpose,” adds the AEPD, confirming its view that the GSMA’s DPIA did not meet the requirements of the GDPR (Note: this is an automatic translation of the original Spanish text).
More from the AEPD’s resolution:
The DPIA document (of the GSMA) lacks an assessment of the necessity and proportionality of the processing operations in relation to their purpose (the use of facial recognition to access events); an assessment of the risks to the rights and freedoms of data subjects as referred to in Article 35(1) of the GDPR; and the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with the GDPR, taking into account the rights and legitimate interests of data subjects and other affected persons. It merely lists the passport and identity card details it believes are required by the Mossos d’Esquadra (the local police), allegedly in order to link them to the photo taken by the software, which starts the facial recognition process, comparing the person’s identity to facilitate access.
The description of the GSMA’s DPIA in the AEPD’s resolution suggests that the GSMA not only failed to conduct an adequate assessment but also leaned on a security rationale for collecting showgoers’ passports/EU IDs, saying it had been instructed by the Spanish police to institute “strict processes” for verifying participants’ identities.
It also appears that attendees were required to consent to biometric processing of their facial data as part of the ID upload process, with the AEPD noting that the consent information in BREEZZ asked the individual to consent to the use of “biometric data obtained from the photos provided for identification validation purposes in the context of online registration and MWC Barcelona for venue access.”
This is important because the GDPR sets a clear bar for consent to be a valid legal basis, requiring it to be informed, specific (i.e., not bundled) and freely given. Ergo, consent cannot be forced. (Processing sensitive data like facial biometrics faces an even higher bar, requiring explicit consent to be lawful.)
It was the lack of free choice for conference-goers over uploading sensitive biometric data that led to a complaint against the GSMA’s data processing being filed with the AEPD by a would-be attendee of MWC 2021. It’s her complaint that led, a few years later, to the GSMA being sanctioned.
“I couldn’t find any reasonable justification for it,” she explained in a LinkedIn post late last week, in which she made her complaint public and discussed what she said was a disproportionate requirement by the GSMA for MWC participants to upload ID documents. “Their website suggested I could also bring my ID/passport for in-person verification, which I didn’t mind. However, the organizers insisted that unless I upload my passport details I CANNOT attend the live event and would have to participate virtually, which I eventually did.”
Technologist Adam Leon Smith, who co-authored her complaint, also wrote about it in a LinkedIn post, in which he warns: “Facial recognition is very sensitive in public areas and if you really need to use it, get an excellent lawyer and technical team.”
“The AEPD was able to request internal privacy review documents from MWC and found them to be outdated and inadequate. The AEPD’s decision focuses mainly on that,” he also said. “It doesn’t go into other specifics, although I expect MWC will now have to do that risk and impact assessment very carefully.”
While the Spanish regulator’s resolution does not weigh in on whether the GSMA’s legal basis for the biometric processing was valid, Smith suggests this could simply be a consequence of the DPIA being found inadequate; that is, the regulator may have decided a fuller technical review wasn’t worthwhile.
“I wouldn’t be surprised if they refrain from using facial recognition technology,” he suggested of the GSMA. “This kind of application of the technology would fall into the high-risk category under the latest drafts of the EU’s AI Act, meaning they would need some form of conformity assessment by an independent party.”
The GSMA was contacted for comment on the AEPD’s fine, but at the time of writing had not yet responded.
It is worth noting that while the AEPD’s administrative process regarding this complaint is closed with this resolution, the GSMA could still challenge the outcome through a legal appeal to the Audiencia Nacional (Spain’s National High Court).
Zooming out, as Smith points out, the forthcoming pan-EU AI Act will introduce a risk-based framework for regulating applications of AI over the next few years.
The draft version of this legislation proposed by the Commission in 2021 includes restrictions on the use of remote biometrics, such as facial recognition, in public places, which could further narrow the scope for deploying such automated verification checks in the future. (Add to that, parliamentarians have pushed to strengthen the ban on remote biometrics further.) And that’s on top of existing GDPR risks for data processors that take a shoddy approach to risk due diligence (or indeed the hard requirement to have a valid legal basis for such processing of sensitive data).
For its part, the GSMA has continued to provide a facial biometric-based automated ID check option for MWC attendees (both this year and last year) — and continues to require ID document uploads for in-person attendance registration. So it will be interesting to see if it adjusts its privacy statements and/or makes any changes to the registration process for MWC 2024 in light of the GDPR sanction. (And if it continues to offer a biometric-based automated ID check option at the show in the future, it might be a good idea to make sure its technology provider is entirely within the EU.)