Data Protection Weekly 4/2024

Jan 29, 2024

European Union

EDPS: Publication of an Opinion on Regulation to combat migrant smuggling and human trafficking

The European Data Protection Supervisor (EDPS) released an Opinion on the proposed Regulation aimed at enhancing police cooperation against migrant smuggling and human trafficking, which also seeks to strengthen the role of the EU Agency for Law Enforcement Cooperation (Europol) in these efforts. The EDPS outlined concerns regarding four aspects of the proposal: the increased processing of biometric data; cooperation between the European Border and Coast Guard Agency (Frontex) and Europol; the transfer of personal data to non-EU countries; and Europol’s assistance to EU Member States’ authorities. Emphasising the need to balance the fight against illegal migration with data protection, the EDPS called for clear rules and safeguards, especially for the processing of biometric data. The Opinion also addresses the need to delineate Frontex’s role so that it does not become a law enforcement agency, and cautions against relying on derogations for data transfers outside the EU, advocating instead for the use of structural tools for such transfers. Finally, the EDPS highlights the importance of defining how competent authorities within the EU may access and use personal data, with the aim of protecting individuals’ rights in the context of migration control measures. You can read the press release here and download the full Opinion here.

European Commission: Decision establishing the European AI Office

On 24 January 2024, the European Commission announced the creation of the European Artificial Intelligence Office. This new entity will be part of the Directorate-General for Communication Networks, Content and Technology, adhering to its annual management plan while operating in line with Commission internal processes. The establishment of this office aims to complement the existing framework without affecting the powers of national authorities or the Union’s bodies in supervising AI systems, as outlined in the forthcoming regulation on harmonised rules for artificial intelligence. It seeks to provide guidance without duplicating the efforts of relevant Union bodies under sector-specific legislation, thereby ensuring a coordinated approach to AI oversight within the EU. You can read the press release here.

National Authorities

Italy: Garante to participate in GPEN “Privacy Sweep” targeting deceptive design

The Italian data protection authority (Garante) is set to join an international effort coordinated by the Global Privacy Enforcement Network (GPEN) to investigate deceptive design practices on websites and apps. This “Privacy Sweep,” scheduled from 29 January to 2 February, aims to identify and address user interfaces and pathways that steer users into making unintended decisions regarding their personal data, often to their detriment and to the platform’s benefit. The initiative, rooted in the European Data Protection Board’s (EDPB) definition of deceptive design, will evaluate sites and apps for clarity, interface design, and other indicators of manipulative practices. Results may lead to awareness efforts, direct engagement with entities to discuss findings, or even formal inquiries by participating privacy authorities. The GPEN, founded in 2010 on the basis of an OECD recommendation, is a network of privacy regulators focused on the practical aspects of enforcing data protection laws internationally. You can read the press release here (in Italian).

Italy: Garante fines Trento Municipality for AI surveillance projects

The Italian data protection authority (Garante) has imposed a €50,000 fine on the Municipality of Trento for privacy violations in two scientific research projects, Marvel and Protector, aimed at enhancing urban security under the smart cities paradigm. These EU-funded projects involved the use of surveillance cameras, microphones, and social media analysis without a proper legal framework, leading to multiple privacy law breaches. Marvel focused on using AI to automatically detect public safety risks through video and audio surveillance, while Protector targeted the identification of threats to places of worship by analysing hate speech on social networks. The Garante highlighted the municipality’s lack of competence to conduct scientific research, inadequate data anonymisation techniques, and the absence of transparency and of an impact assessment before the projects began. While acknowledging some mitigating factors, the Garante criticised the extensive and invasive data processing practices for posing significant risks to individuals’ rights and freedoms. The authority remains open to dialogue on future AI initiatives that comply with privacy regulations. You can read the press release here (in Italian).

Netherlands: AP part of European collaboration on privacy and personalised ads

The Dutch data protection authority (AP), in collaboration with its Norwegian and German counterparts, has launched a European initiative to scrutinise the privacy implications of personalised advertising. This collective effort seeks to forge a consensus among EU data protection bodies regarding the consent mechanisms online platforms employ for personalised ads. The initiative emerges from concerns over platforms that condition free access on user consent to the processing of personal data for targeted advertising, highlighting that privacy is a fundamental right, not a commodity reserved for those who can afford it. The European Data Protection Board (EDPB) is set to issue a decision on this matter within eight weeks, a move that could significantly influence tech companies’ privacy practices. Central to the debate is the assessment of the ‘pay or okay’ model, in which users must choose between consenting to targeted ads or paying a fee to opt out, against the GDPR’s requirements for freely given consent. This development marks a critical moment for privacy regulation in Europe, with the potential to standardise how consent is sought and ensure privacy rights are equitably upheld across the board. You can read the press release here (in Dutch).

Netherlands: AP highlights significant challenges in education regarding privacy

The Dutch data protection authority (AP) has outlined several challenges facing the education sector in terms of privacy, particularly due to the use of algorithms and artificial intelligence (AI). In its “Sectorbeeld Onderwijs” report, the AP notes that societal issues such as poverty, inequality of opportunity, and safety pose privacy challenges for educational institutions. The AP calls for more precise policies and for a discussion on the desirability of algorithms and AI, and raises concerns over institutions’ control of software use and the clarity around the use of personal data for scientific research. While acknowledging the sector’s efforts to improve compliance with privacy legislation through self-regulation and increased collaboration, the AP emphasises the need for further improvements to ensure foundational compliance. The report underscores the importance of handling students’, parents’, and staff members’ personal data with care, and highlights the need for additional protection for children and young people, given their vulnerable status, in order to foster a free and safe educational environment. You can read the press release here (in Dutch).

Global

Noyb’s survey reveals data protection non-compliance concerns

A recent survey by noyb, conducted among more than 1,000 European data protection professionals to mark Data Protection Day, reveals significant concerns about GDPR compliance within companies. Although the GDPR became applicable in 2018 with the aim of stricter data protection enforcement, 74% of surveyed professionals believe that if data protection authorities (DPAs) were to inspect the average company, they would uncover “relevant violations.” The study highlights a perceived need for more decisive enforcement action from DPAs to ensure compliance. It also sheds light on internal pressures faced by Data Protection Officers (DPOs), with many reporting resistance from sales, marketing, and senior management when trying to implement GDPR compliance measures. The survey suggests that significant fines and the publicising of these penalties could be effective in promoting compliance, whereas current strategies, such as informal negotiations and reliance on guidelines, are seen as less effective. This underscores a broader call for “evidence-based enforcement” to ensure that companies align their data processing practices with GDPR requirements. You can read the full article here.

Fines

France: CNIL fines Amazon France Logistique €32 million over employee monitoring practices

On 27 December 2023, the French data protection authority (CNIL) imposed a €32 million fine on Amazon France Logistique for multiple breaches of the General Data Protection Regulation (GDPR). This followed investigations triggered by press articles and complaints from employees about the company’s practices in its warehouses. Amazon used scanners to record data on employees’ tasks, using this information to monitor quality, productivity, and inactivity. The CNIL identified violations including the collection of data beyond what was necessary for evaluating job performance and the unlawful processing of three specific indicators that tracked employees’ speed and idle times so closely that the monitoring could not be justified by legitimate interest. Amazon was also found not to have minimised the data used for work scheduling and employee assessments, again exceeding what was necessary for performance evaluation. The company further failed to meet the GDPR’s transparency and information obligations and did not ensure the security of personal data in video surveillance, in breach of Articles 12, 13, and 32 of the GDPR. You can read the press release here and the full decision here (in French).

Belgium: APD sanctions Black Tiger Belgium for lack of transparency and unlawful processing

The Belgian data protection authority (APD) has imposed sanctions on Black Tiger Belgium (BTB), previously known as Bisnode Belgium, for multiple breaches of the GDPR. The company, now part of the French Black Tiger Group, was found to have collected and processed personal data for over 15 years without sufficient transparency or a lawful basis. Despite BTB’s claims that it relied on legitimate interest and provided adequate information to data subjects, the APD found that the company had failed to inform data subjects proactively and individually, had inadequately balanced the interests at stake against individuals’ fundamental rights, and had not properly handled data subjects’ access requests. The APD’s decision includes administrative fines totalling €174,640, a temporary ban on processing data until the affected individuals are informed and given a chance to object, and a definitive ban on processing data for B2B Data Quality purposes where data subjects cannot be notified. The decision, which underscores the importance of transparency, lawfulness, and respect for individuals’ rights in data processing activities, can be appealed by the parties involved. You can read the press release here and the full decision here (in Dutch).

Luxembourg: CNPD fines electronic communication company for GDPR breaches

On 5 July 2023, the Luxembourg data protection authority (CNPD) ruled on a violation involving a company offering electronic communication services. The case, initiated by a complaint, revealed that the complainant’s personal data had been unlawfully transferred multiple times to a third party by a processor of the controller, without the data subject being provided the information required under GDPR Article 13. The company was also found to lack appropriate technical and organisational measures to ensure data protection, contravening GDPR Article 24 on controller responsibility. The CNPD issued a formal reprimand, imposed a €1,500 administrative fine on the company, and ordered it to remedy the identified shortcomings by implementing measures to prevent further unauthorised data transfers. You can read the press release here and the full decision here (in French).

Poland: UODO fines district court for GDPR breach

The Polish data protection authority (UODO) imposed a 10,000 PLN (approximately €2,290) fine on the District Court in Kraków for failing to report a personal data breach to the supervisory authority and for not notifying the individuals concerned “without undue delay”. The breach was reported to UODO by the Ministry of Foreign Affairs, which learned of it when a postal operator delivered damaged and incomplete correspondence containing personal data sent by the court. The incident involved the personal data of seven individuals, including sensitive information such as national identification numbers and health data, which posed a high risk to their rights and freedoms. Despite being aware of the breach, the court did not undertake the necessary risk analysis, notify UODO as required by GDPR Article 33(1), or inform the individuals affected as required by Article 34. UODO’s investigation highlighted additional issues, including the court’s data protection officer underestimating the risk level and the prolonged duration of the breach, which lasted 16 months before action was taken. UODO has now ordered the court to notify the four individuals most at risk due to the breach within three days, in compliance with GDPR Article 34(2). This case underscores UODO’s authority to impose fines on courts for breaches related to their administrative activities, rather than their judicial functions. You can read the press release here (in Polish).

UK: ICO reprimands South Tees Hospitals NHS Foundation Trust for data breach

The UK data protection authority (ICO) has issued a reprimand to South Tees Hospitals NHS Foundation Trust following a data breach that led to the disclosure of sensitive information to an unauthorised family member. In November 2022, a Trust employee mistakenly sent an appointment letter intended for a patient’s father to the wrong address, thereby violating the patient’s privacy. The ICO’s investigation concluded that the breach was a result of human error and highlighted a lack of sufficient training and preparation among Trust staff for handling sensitive correspondence. Joanne Stones, Group Manager at the ICO, emphasised the serious and harmful nature of the breach, underscoring the importance of proper training and procedures to prevent such errors. The ICO now expects the Trust to implement new standard operating procedures and enhance staff training to safeguard against future data breaches, ensuring compliance with data protection laws that mandate the security of personal information. You can read the press release here.