Access the full list of the EDPB and WP29 Guidelines here, including consultation versions, now-current versions and redlines between versions.
List of the EDPB Guidelines
May 30th, 2023 | Guidelines | Uncategorized | Privacy
Gateway to Privacy – Our K&L Gates Data Protection Podcast
February 22nd, 2023 | Europe | Podcast | Privacy | World
This program provides timely updates, best practices, and emerging developments in today's data protection, privacy, and security industry.
Opinion 28/2024 on certain data protection aspects related to the processing of personal data in the context of AI models
December 17th, 2024 | Artificial Intelligence | Europe | Privacy
Go to official publication on EDPB website.
Adopted on 17 December 2024.
Executive summary
AI technologies create many opportunities and benefits across a wide range of sectors and social activities.
By protecting the fundamental right to data protection, the GDPR supports these opportunities and promotes other EU fundamental rights, including the right to freedom of thought, expression and information, the right to education, and the freedom to conduct a business. In this way, the GDPR provides a legal framework that encourages responsible innovation.
In this context, taking into account the data protection questions raised by these technologies, the Irish supervisory authority requested the EDPB to issue an opinion on matters of general application pursuant to Article 64(2) GDPR. The request relates to the processing of personal data in the context of the development and deployment phases of Artificial Intelligence (“AI”) models. In more detail, the request asked: (1) when and how an AI model can be considered ‘anonymous’; (2) how controllers can demonstrate the appropriateness of legitimate interest as a legal basis in the development and (3) deployment phases; and (4) what the consequences of the unlawful processing of personal data in the development phase of an AI model are for the subsequent processing or operation of the AI model.
With respect to the first question, the Opinion mentions that claims of an AI model’s anonymity should be assessed by competent SAs on a case-by-case basis, since the EDPB considers that AI models trained with personal data cannot, in all cases, be considered anonymous. For an AI model to be considered anonymous, both (1) the likelihood of direct (including probabilistic) extraction of personal data regarding individuals whose personal data were used to develop the model and (2) the likelihood of obtaining, intentionally or not, such personal data from queries, should be insignificant, taking into account ‘all the means reasonably likely to be used’ by the controller or another person.
To conduct their assessment, SAs should review the documentation provided by the controller to demonstrate the anonymity of the model. In that regard, the Opinion provides a non-prescriptive and non-exhaustive list of methods that may be used by controllers in their demonstration of anonymity, and thus be considered by SAs when assessing a controller’s claim of anonymity. This covers, for instance, the approaches taken by controllers, during the development phase, to prevent or limit the collection of personal data used for training, to reduce their identifiability, to prevent their extraction or to provide assurance regarding state of the art resistance to attacks.
With respect to the second and third questions, the Opinion provides general considerations for SAs to take into account when assessing whether controllers can rely on legitimate interest as an appropriate legal basis for processing conducted in the context of the development and the deployment of AI models.
The Opinion recalls that there is no hierarchy between the legal bases provided by the GDPR, and that it is for controllers to identify the appropriate legal basis for their processing activities. The Opinion then recalls the three-step test that should be conducted when assessing the use of legitimate interest as a legal basis, i.e. (1) identifying the legitimate interest pursued by the controller or a third party; (2) analysing the necessity of the processing for the purposes of the legitimate interest(s) pursued (also referred to as “necessity test”); and (3) assessing that the legitimate interest(s) is (are) not overridden by the interests or fundamental rights and freedoms of the data subjects (also referred to as “balancing test”).
With respect to the first step, the Opinion recalls that an interest may be regarded as legitimate if the following three cumulative criteria are met: the interest (1) is lawful; (2) is clearly and precisely articulated; and (3) is real and present (i.e. not speculative). Such an interest may cover, for instance, developing the service of a conversational agent to assist users (in the development of an AI model), or improving threat detection in an information system (in its deployment).
With respect to the second step, the Opinion recalls that the assessment of necessity entails considering: (1) whether the processing activity will allow for the pursuit of the legitimate interest; and (2) whether there is no less intrusive way of pursuing this interest. When assessing whether the condition of necessity is met, SAs should pay particular attention to the amount of personal data processed and whether it is proportionate to pursue the legitimate interest at stake, also in light of the data minimisation principle.
With respect to the third step, the Opinion recalls that the balancing test should be conducted taking into account the specific circumstances of each case. It then provides an overview of the elements that SAs may take into account when evaluating whether the interest of a controller or a third party is overridden by the interests, fundamental rights and freedoms of data subjects.
As part of the third step, the Opinion highlights specific risks to fundamental rights that may emerge either in the development or the deployment phases of AI models. It also clarifies that the processing of personal data that takes place during the development and deployment phases of AI models may impact data subjects in different ways, which may be positive or negative. To assess such impact, SAs may consider the nature of the data processed by the models, the context of the processing and the possible further consequences of the processing.
The Opinion additionally highlights the role of data subjects’ reasonable expectations in the balancing test. This can be important due to the complexity of the technologies used in AI models and the fact that it may be difficult for data subjects to understand the variety of their potential uses, as well as the different processing activities involved. In this regard, both the information provided to data subjects and the context of the processing may be among the elements to be considered to assess whether data subjects can reasonably expect their personal data to be processed. With regard to the context, this may include: whether or not the personal data was publicly available, the nature of the relationship between the data subject and the controller (and whether a link exists between the two), the nature of the service, the context in which the personal data was collected, the source from which the data was collected (i.e., the website or service where the personal data was collected and the privacy settings they offer), the potential further uses of the model, and whether data subjects are actually aware that their personal data is online at all.
The Opinion also recalls that, when the data subjects’ interests, rights and freedoms seem to override the legitimate interest(s) being pursued by the controller or a third party, the controller may consider introducing mitigating measures to limit the impact of the processing on these data subjects. Mitigating measures should not be confused with the measures that the controller is legally required to adopt anyway to ensure compliance with the GDPR. In addition, the measures should be tailored to the circumstances of the case and the characteristics of the AI model, including its intended use. In this respect, the Opinion provides a non-exhaustive list of examples of mitigating measures in relation to the development phase (also with regard to web scraping) and the deployment phase. Mitigating measures may be subject to rapid evolution and should be tailored to the circumstances of the case. Therefore, it remains for the SAs to assess the appropriateness of the mitigating measures implemented on a case-by-case basis.
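By way of illustration only (the Opinion deliberately avoids prescribing any technical standard), one collection-stage mitigating measure often cited in the web-scraping context is honoring websites' machine-readable opt-out signals such as robots.txt exclusions. A minimal sketch using Python's standard library, with `ExampleTrainingBot` as a hypothetical crawler name:

```python
from urllib.robotparser import RobotFileParser

def may_collect(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if the site's robots.txt permits this agent to fetch the URL.

    Honoring robots.txt exclusions is one example of a collection-stage
    mitigating measure; it does not, by itself, establish GDPR compliance.
    """
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)

# Hypothetical robots.txt excluding an AI-training crawler from profile pages
ROBOTS = """\
User-agent: ExampleTrainingBot
Disallow: /profiles/
"""
```

Respecting such exclusions would be only one element of a broader set of safeguards; on its own it would not satisfy the balancing test described above.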
With respect to the fourth question, the Opinion generally recalls that SAs enjoy discretionary powers to assess the possible infringement(s) and choose appropriate, necessary, and proportionate measures, taking into account the circumstances of each individual case. The Opinion then considers three scenarios.
Under scenario 1, personal data is retained in the AI model (meaning that the model cannot be considered anonymous, as detailed in the first question) and is subsequently processed by the same controller (for instance in the context of the deployment of the model). The Opinion states that whether the development and deployment phases involve separate purposes (thus constituting separate processing activities) and the extent to which the lack of legal basis for the initial processing activity impacts the lawfulness of the subsequent processing, should be assessed on a case-by-case basis, depending on the context of the case.
Under scenario 2, personal data is retained in the model and is processed by another controller in the context of the deployment of the model. In this regard, the Opinion states that SAs should take into account whether the controller deploying the model conducted an appropriate assessment, as part of its accountability obligations to demonstrate compliance with Article 5(1)(a) and Article 6 GDPR, to ascertain that the AI model was not developed by unlawfully processing personal data. This assessment should take into account, for instance, the source of the personal data and whether the processing in the development phase was subject to the finding of an infringement, particularly if it was determined by a SA or a court, and should be more or less detailed depending on the risks raised by the processing in the deployment phase.
Under scenario 3, a controller unlawfully processes personal data to develop the AI model, then ensures that it is anonymised, before the same or another controller initiates another processing of personal data in the context of the deployment. In this regard, the Opinion states that if it can be demonstrated that the subsequent operation of the AI model does not entail the processing of personal data, the EDPB considers that the GDPR would not apply. Hence, the unlawfulness of the initial processing should not impact the subsequent operation of the model. Further, the EDPB considers that, when controllers subsequently process personal data collected during the deployment phase, after the model has been anonymised, the GDPR would apply in relation to these processing operations. In these cases, the Opinion considers that, as regards the GDPR, the lawfulness of the processing carried out in the deployment phase should not be impacted by the unlawfulness of the initial processing.
The European Data Protection Board,
Having regard to Article 63 and Article 64(2) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (hereinafter “GDPR”),
Having regard to the EEA Agreement and in particular to Annex XI and Protocol 37 thereof, as amended by the Decision of the EEA Joint Committee No 154/2018 of 6 July 2018,
Having regard to Article 10 and Article 22 of its Rules of Procedure,
Whereas:
(1) The main role of the European Data Protection Board (hereafter the “Board” or the “EDPB”) is to ensure the consistent application of the GDPR throughout the European Economic Area (“EEA”). Article 64(2) GDPR provides that any supervisory authority (“SA”), the Chair of the Board or the Commission may request that any matter of general application or producing effects in more than one EEA Member State be examined by the Board with a view to obtaining an opinion. The aim of this opinion is to examine a matter of general application or which produces effects in more than one EEA Member State.
(2) The opinion of the Board shall be adopted pursuant to Article 64(3) GDPR in conjunction with Article 10(2) of the EDPB Rules of Procedure within eight weeks from when the Chair and the competent supervisory authority have decided that the file is complete. Upon decision of the Chair, this period may be extended by a further six weeks taking into account the complexity of the subject matter.
Guidelines 02/2024 on Article 48 GDPR
December 2nd, 2024 | Data Transfer | Europe | Guidelines | Privacy
Adopted on 02 December 2024 – For public consultation
EXECUTIVE SUMMARY
Article 48 GDPR provides that: “Any judgment of a court or tribunal and any decision of an administrative authority of a third country requiring a controller or processor to transfer or disclose personal data may only be recognised or enforceable in any manner if based on an international agreement, such as a mutual legal assistance treaty, in force between the requesting third country and the Union or a Member State, without prejudice to other grounds for transfer pursuant to this Chapter”.
The purpose of these guidelines is to clarify the rationale and objective of this article, including its interaction with the other provisions of Chapter V of the GDPR, and to provide practical recommendations for controllers and processors in the EU that may receive requests from third country authorities to disclose or transfer personal data.
The main objective of the provision is to clarify that judgments or decisions from third country authorities cannot automatically and directly be recognised or enforced in an EU Member State, thus underlining legal sovereignty vis-à-vis third country law. As a general rule, recognition and enforceability of foreign judgments and decisions is ensured by applicable international agreements.
Regardless of whether an applicable international agreement exists, if a controller or processor in the EU receives and answers a request from a third country authority for personal data, such data flow is a transfer under the GDPR and must comply with Article 6 and the provisions of Chapter V.
An international agreement may provide for both a legal basis (under Article 6(1)(c) or 6(1)(e)) and a ground for transfer (under Article 46(2)(a)).
In the absence of an international agreement, or if the agreement does not provide for a legal basis under Article 6(1)(c) or 6(1)(e), other legal bases could be considered. Similarly, if there is no international agreement or the agreement does not provide for appropriate safeguards under Article 46(2)(a), other grounds for transfer could apply, including the derogations in Article 49.
C3PO: Competition and Consumer Convergence toward Privacy Operations
November 20th, 2024 | Competition | Conference | Europe | France | Privacy
Speakers:
- Claude-Étienne Armingaud, CIPP/E, Partner, Data Protection Privacy and Security Practice Group Coordinator, K&L Gates
- Shereen Kenyon, Senior Manager Data Privacy and Data Protection Officer, SharkNinja
- Elodie Vandenhende, Deputy Head of the Digital Economy Unit, French Competition Authority
- Jörn Wittmann, Group Privacy Ambassador, Volkswagen AG
Governments around the globe have turned their attention to the power of accumulated data, deploying both competition law powers and legislative initiatives to address it. From the EU’s Digital Markets Act and proposed Data Act, to the UK’s Data Protection and Digital Information Bill, laws addressing competition, privacy and wider data access issues are becoming increasingly intertwined. Privacy and competition regulators, alongside consumer protection agencies and associations, are working more closely together than ever before. The EU Court of Justice has been asked for clarity on how such regulators should interact going forward. In some countries, we are also seeing a testing of the use of competition mechanisms for bringing group actions on privacy issues. In this session, we will discuss the interactions between privacy, consumer protection and competition, and how these are likely to shape compliance tactics, litigation strategies and regulatory interactions going forward.
Navigating the Intersection of Data Scraping and Artificial Intelligence–A Global Data Protection Authorities’ Take
November 18th, 2024 | Artificial Intelligence | Privacy | World
In alignment with ongoing concerns from several European data protection authorities that have published guidelines on data scraping (e.g., the Dutch DPA, the Italian DPA and the UK Information Commissioner’s Office), the Global Privacy Assembly (GPA)’s International Enforcement Cooperation Working Group (IEWG) recently published a Joint statement on data scraping and the protection of privacy (signed by the Canadian, British, Australian, Swiss, Norwegian, Moroccan, Mexican, and Jersey data protection authorities) to provide further input for businesses when considering data scraping.
The statement emphasizes that:
Even publicly accessible data is subject to privacy laws in most jurisdictions – meaning that scraping activities must comply with data protection regulations requiring (i) a lawful basis for data collection and (ii) transparency toward individuals, including obtaining consent where necessary.
Mass data collection can constitute a reportable data breach where it involves unauthorized access to personal data.
Relying on platform terms (e.g., Instagram) for data scraping does not automatically ensure compliance as (i) this contractually authorized use of scraped personal data is not automatically compliant with data protection and artificial intelligence (AI) laws, and (ii) it is difficult to determine whether scraped data is used solely for purposes allowed by the contract terms.
When training AI models, it is critical to adhere not only to privacy regulations but also to emerging AI laws, as privacy regulators increasingly expect AI model transparency and limitations on data processing.
The sensitivity of this topic underscores the close relationship between data protection and the ever-data-hungry artificial intelligence industry.
First published on the K&L Gates Cyber Law Watch blog, in collaboration with Anna Gaentzhirt.
Top 10 operational impacts of the EU AI Act – Regulatory implementation and application alongside EU digital strategy
October 29th, 2024 | Artificial Intelligence | Europe | Legislation
Launched in 2015, the EU’s Digital Single Market Strategy aimed to foster digital harmonization among EU member states and contribute to economic growth, boosting jobs, competition, investment, and innovation in the EU.
The EU AI Act constitutes a fundamental element of this strategy. By adopting the world’s first general-purpose regulation of artificial intelligence, Brussels sent a global message to all stakeholders, in the EU and abroad, that they need to pay attention to the AI discussion happening in Europe.
The EU AI Act strikes a delicate balance between specific provisions (covering generative AI, systemic-risk models, and computing-power thresholds) and its general risk-based approach. To do so, the act includes a tiered implementation over a three-year period and a flexible mechanism to revise some of the more factual elements that would be prone to rapid obsolescence, such as updating the threshold of floating-point operations (FLOPs), a measure of the cumulative compute used in training, above which general-purpose AI models are presumed to have high-impact capabilities. At the same time, the plurality of stakeholders involved in the interpretation of the act, and its interplay with other regulations (adopted, currently in discussion, or yet to come), will require careful monitoring by the impacted players in the AI ecosystem.
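To make the threshold concrete: Article 51(2) of the AI Act presumes high-impact capabilities where cumulative training compute exceeds 10^25 FLOPs. The Act fixes only the number, not an estimation method; the sketch below uses the common (and merely heuristic) approximation of roughly 6 × parameters × training tokens, with illustrative model sizes that are assumptions for the example:

```python
# Heuristic estimate of training compute for a dense transformer:
# FLOPs ~ 6 * N * D (N = parameters, D = training tokens).
# Illustrative only; the AI Act does not mandate any estimation formula.
AI_ACT_FLOP_THRESHOLD = 1e25  # Article 51(2): presumption of high impact

def estimated_training_flops(params: float, tokens: float) -> float:
    return 6.0 * params * tokens

def presumed_high_impact(params: float, tokens: float) -> bool:
    """True if the heuristic estimate exceeds the AI Act's 10^25 FLOP mark."""
    return estimated_training_flops(params, tokens) > AI_ACT_FLOP_THRESHOLD

# A hypothetical 70B-parameter model trained on 15T tokens stays below
# the threshold (~6.3e24 FLOPs); a hypothetical 1.8T-parameter model
# trained on 13T tokens lands well above it (~1.4e26 FLOPs).
```

Because the Commission may update the threshold by delegated act, any such calculation is a snapshot, not a durable compliance determination.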
Cyber Securi-Tea or Coffee – The Data Act or the multiverse of data
October 15th, 2024 | Conference | Europe | IT
As part of our new conference series on digital and cyber issues, we are pleased to invite you to a breakfast at our Paris offices, during which Claude-Etienne Armingaud, CIPP/E (Partner, Data Protection & Technologies) will examine how companies are preparing for compliance with the EU Data Act. A great opportunity to exchange ideas, find inspiration, and connect with professionals in the field!
Places are limited, so please register now via the following link: https://ow.ly/183L50TAWbP.
Guidelines 01/2024 on processing of personal data based on Article 6(1)(f) GDPR
October 8th, 2024 | Europe | Guidelines | Privacy
Access official publication on EDPB website.
EXECUTIVE SUMMARY
These guidelines analyse the criteria set down in Article 6(1)(f) GDPR that controllers must meet to lawfully engage in the processing of personal data that is “necessary for the purposes of the legitimate interests pursued by the controller or by a third party”.
Article 6(1)(f) GDPR is one of the six legal bases for the lawful processing of personal data envisaged by the GDPR. Article 6(1)(f) GDPR should neither be treated as a “last resort” for rare or unexpected situations where other legal bases are deemed not to apply nor should it be automatically chosen or its use unduly extended on the basis of a perception that Article 6(1)(f) GDPR is less constraining than other legal bases.
For processing to be based on Article 6(1)(f) GDPR, three cumulative conditions must be fulfilled:
- First, the pursuit of a legitimate interest by the controller or by a third party;
- Second, the need to process personal data for the purposes of the legitimate interest(s) pursued; and
- Third, the interests or fundamental freedoms and rights of the concerned data subjects do not take precedence over the legitimate interest(s) of the controller or of a third party.
In order to determine whether a given processing of personal data may be based on Article 6(1)(f) GDPR, controllers should carefully assess and document whether these three cumulative conditions are met. This assessment should be done before carrying out the relevant processing operations.
With regard to the condition relating to the pursuit of a legitimate interest, not all interests of the controller or a third party may be deemed legitimate; only those interests that are lawful, precisely articulated and present may be validly invoked to rely on Article 6(1)(f) GDPR as a legal basis. It is also the responsibility of the controller to inform the data subject of the legitimate interests pursued where that processing is based on Article 6(1)(f) GDPR.
With regard to the condition that the processing of personal data be necessary for the purposes of the legitimate interests pursued, it should be ascertained whether the legitimate interests pursued cannot reasonably be achieved just as effectively by other means less restrictive of the fundamental rights and freedoms of data subjects, also taking into account the principles enshrined in Article 5(1) GDPR. If such other means exist, the processing may not be based on Article 6(1)(f) GDPR.
With regard to the condition that the interests or fundamental rights and freedoms of the person concerned by the data processing do not take precedence over the legitimate interests of the controller or of a third party, that condition entails a balancing of the opposing rights and interests at issue which depends in principle on the specific circumstances of the relevant processing. The processing may take place only if the outcome of this balancing exercise is that the legitimate interests being pursued are not overridden by the data subjects’ interests, rights and freedoms.
A proper Article 6(1)(f) GDPR assessment is not a straightforward exercise. Rather, the assessment — and in particular the balancing of opposing interests and rights — requires full consideration of a number of factors, such as the nature and source of the relevant legitimate interest(s), the impact of the processing on the data subject and their reasonable expectations about the processing, and the existence of additional safeguards which could limit undue impact on the data subject. The present guidelines provide guidance on how such an assessment should be carried out in practice, including in a number of specific contexts (e.g., fraud prevention, direct marketing, information security, etc.) where this legal basis may be considered.
The guidelines also explain the relationship that exists between Article 6(1)(f) GDPR and a number of data subject rights under the GDPR.
Guidelines 1/2024 on processing of personal data based on Article 6(1)(f) GDPR
October 8th, 2024 | Europe | Guidelines | Privacy
Version 1.0 – Adopted on 8 October 2024
Lost in transition? – Data Strategy & Opportunities in the New EU Legal Frameworks
October 8th, 2024 | Artificial Intelligence | Conference | Europe | IT | Legislation | Privacy
We kindly invite you to the K&L Gates Legal & Compliance Breakfast on 8 October 2024 in Frankfurt.
Please join us for coffee, tea and croissants, and take away fresh ideas and new momentum for your work on your data strategy.
We will discuss how the Data Act and the AI Act impact a company’s data strategy. How does one reconcile them with each other and with other elements of the legal framework, like GDPR and antitrust laws?
Our keynote speaker will be Claude-Étienne Armingaud, a partner in K&L Gates’ Paris office. He coordinates our European technology and privacy practices and has been building pragmatic legal solutions on both sides of the Atlantic for many years.
We look forward to welcoming you at our Frankfurt office on level 28 of the “Opernturm” tower.
Please register by clicking here.