This panel session will focus on the growing concern over the ethical use of Artificial Intelligence (AI) and its impact on privacy. The panelists will discuss the role of accountability in developing responsible AI practices and the potential risks of AI systems when not properly regulated. They will also explore the importance of transparency and the need for data privacy regulations in the development and deployment of AI technologies. The session will provide insights into best practices for AI governance and how organizations can ensure the ethical use of AI while still benefiting from its potential.

Co-Panelists:

#AI #ArtificialIntelligence #gdpr #ethics #dataprotection #regulation #insights23 #pecb #Privacy #Accountability

Access the full text of the EU AI Act here.

The UK Government has laid adequacy regulations before Parliament that, once in force from 12 October 2023, will permit use of the UK – US “Data Bridge” as a safeguard for personal data transfers from the UK to the US under Article 44 UK GDPR.

The UK – US “Data Bridge,” also known as the UK Extension to the EU – US Data Privacy Framework (Framework), allows UK organisations to transfer personal data to organisations located in the United States that have self-certified their compliance with certain data protection principles and appear on the Data Privacy Framework List. The scheme, administered by the US Department of Commerce, provides a redress mechanism for data subjects in the European Union to enforce their rights under the EU General Data Protection Regulation, both in relation to a participating US organisation’s compliance with the Framework and in relation to US national security agencies’ access to personal data. This new redress mechanism is intended to head off a challenge to the Framework similar to the Schrems II case, which invalidated the Framework’s predecessor, the EU – US Privacy Shield. Even so, the Framework has already been the subject of a short-lived case before the Court of Justice of the EU, and further legal challenges may follow.

Alongside the adequacy regulations, the UK government published an analysis of the US laws governing US national security agencies’ access to the personal data of European data subjects. This analysis effectively completes the international data transfer risk assessment (TRA) that UK organisations have been required to carry out before transferring personal data to the US. UK organisations relying on the other Article 44 UK GDPR safeguards, such as the International Data Transfer Agreement, are also likely to rely on this analysis in place of completing their own TRA.

First publication: K&L Gates Cyber Law Watch Blog, in collaboration with Noirin McFadden

August may be perceived as the month when France shuts down for the summer. Yet, just before the summer ’23 holiday, the French Data Protection Authority (“CNIL”) published several calls to action for the various players of the data ecosystem in general, and in artificial intelligence (AI) in particular, following its 16 May 2023 announcement of an AI action plan:

  • Opening and re-use of publicly accessible data – The CNIL published draft guidance on the use of such data, and all stakeholders are invited to weigh in until 15 October 2023 before its finalisation. While non-binding, this guidance is expected to lead the way on how EU supervisory authorities will approach and enforce the General Data Protection Regulation (“GDPR”) when personal data is scraped from online sources and re-used for other purposes. The draft notably focuses on Article 14 GDPR, the indirect collection of personal data, and the related prior-information requirements. Artificial intelligence is explicitly mentioned by the CNIL in the draft, as such data, which feeds large language models, “undeniably contributes to the development of the digital economy and is at the core of artificial intelligence.” Stakeholders are invited to submit their observations online through the dedicated portal.
  • Artificial Intelligence Sandbox – Following in the footsteps of its connected cameras, EdTech, and eHealth initiatives, the CNIL is launching an AI sandbox call for projects, under which stakeholders involved in AI in connection with public services may apply to receive dedicated assistance from the regulator to co-construct AI systems that comply with data protection and privacy rules.
  • Creation of databases for Artificial Intelligence uses – Open to the broadest possible array of stakeholders (including individuals), this call for contributions notably addresses the specific issues relating to the use of publicly accessible data and aims to inform the CNIL of the various positions at play and of how to balance the GDPR’s requirements (information, legitimate interests, exercise of rights) with data subjects’ expectations. Stakeholders are invited to submit their observations online through the dedicated form (in French – our free English translation is available below). No deadline for submission has been set.

Thrilled to share that I’ve been shortlisted for Privacy Leader: Legal for this year’s PICCASO (Privacy, InfoSec, Culture Change & Awareness Societal Organisation) Awards.

Grateful to the award committee for the recognition and to our K&L Gates #DataProtection team as a whole, a constant source of motivation and fun even in complex moments (especially in Europe cc Ulrike Elteste (Mahlmann) Noirin McFadden Andreas Müller Veronica Muratori Thomas Nietsch Camille Scarparo). Also psyched to be among such a roster of nominees, whether in this category or the others. Whoever gets awarded, it’ll always be a win for #privacy!

Looking forward to celebrating with you all in person in London!

K&L Gates ranked “Recommended” with Claude-Etienne Armingaud.

Source: Leaders League


Once again included in the Best Lawyers in France ranking for Privacy and Data Security Law

Source: Best Lawyers

In this webinar, our lawyers discuss generative artificial intelligence (AI). Fast-paced growth in generative AI is changing the way we work and live, and with such changes come complex issues and uncertainty. We address the legal, policy, and ethical risks, mitigation, and best practices to consider as you develop generative AI products and services, or use generative AI in the operation of your business.

With Annette Becker, Guillermo Christensen, Whitney McCollum, Julie Rizzo, and Mark Wittow

If you were not able to join last Tuesday, you can watch the replay below:

Source: K&L Gates Hub

Access the full text of the EU AI Act here.

Speakers:

  • Zelda Olentia, Senior Product Manager, RadarFirst
  • Claude-Étienne Armingaud, CIPP/E, Partner, Data Protection, Privacy, and Security Practice Group Coordinator, K&L Gates LLP

Air Date: Wednesday 14 June at 1 pm ET / 10 am PT. Replay on demand available here!

Description

Gartner predicts that by the end of 2024, 75% of the world’s population will have its personal data covered under modern privacy regulations. This exponential increase from only 10% global coverage in 2020 raises the stakes for global organizations. The challenge will be to ensure compliance, while safeguarding trust for an unprecedented volume of regulated data.

Join the upcoming live Q&A to learn what’s driving this expansion and how to prepare. You’ll hear from Zelda Olentia, Senior Product Manager at RadarFirst, and special guest Claude-Etienne Armingaud, partner at K&L Gates LLP and coordinator of the firm’s Data Protection, Privacy, and Security practice group.

In this session we will cover:

  • What is driving the expansion of privacy regulation?
  • Where are we on the path towards 75% global coverage?
  • How do you scale privacy operations for international privacy laws quickly and effectively before year-end 2024?

Register Now >>

Version 2.1 dated 24 May 2023 – Go to the official PDF version.

Executive Summary

The European Data Protection Board (EDPB) has adopted these guidelines to harmonise the methodology supervisory authorities use when calculating the amount of a fine. These Guidelines complement the previously adopted Guidelines on the application and setting of administrative fines for the purposes of the Regulation 2016/679 (WP253), which focus on the circumstances in which to impose a fine.

The calculation of the amount of the fine is at the discretion of the supervisory authority, subject to the rules provided for in the GDPR. In that context, the GDPR requires that the amount of the fine shall in each individual case be effective, proportionate and dissuasive (Article 83(1) GDPR). Moreover, when setting the amount of the fine, supervisory authorities shall give due regard to a list of circumstances that refer to features of the infringement (its seriousness) or of the character of the perpetrator (Article 83(2) GDPR). Lastly, the amount of the fine shall not exceed the maximum amounts provided for in Articles 83(4), (5) and (6) GDPR. The quantification of the amount of the fine is therefore based on a specific evaluation carried out in each case, within the parameters provided for by the GDPR.

Taking the abovementioned into account, the EDPB has devised the following methodology, consisting of five steps, for calculating administrative fines for infringements of the GDPR.

Firstly, the processing operations in the case must be identified and the application of Article 83(3) GDPR needs to be evaluated (Chapter 3). Second, the starting point for further calculation of the amount of the fine needs to be identified (Chapter 4). This is done by evaluating the classification of the infringement in the GDPR, evaluating the seriousness of the infringement in light of the circumstances of the case, and evaluating the turnover of the undertaking. The third step is the evaluation of aggravating and mitigating circumstances related to past or present behaviour of the controller/processor and increasing or decreasing the fine accordingly (Chapter 5). The fourth step is identifying the relevant legal maximums for the different infringements. Increases applied in previous or next steps cannot exceed this maximum amount (Chapter 6). Lastly, it needs to be analysed whether the calculated final amount meets the requirements of effectiveness, dissuasiveness and proportionality. The fine can still be adjusted accordingly (Chapter 7), however without exceeding the relevant legal maximum.

Throughout all abovementioned steps, it must be borne in mind that the calculation of a fine is no mere mathematical exercise. Rather, the circumstances of the specific case are the determining factors leading to the final amount, which can – in all cases – be any amount up to and including the legal maximum.
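For readers who prefer to see how those five steps chain together, the sketch below lays out their arithmetic skeleton in Python. It is a minimal illustration only: the function name, parameters, and figures are hypothetical assumptions, the Guidelines prescribe no fixed multipliers, and, as stressed above, the final amount always rests on the supervisory authority's case-by-case assessment rather than on a formula.

```python
# Illustrative skeleton only - the EDPB Guidelines do not define numeric
# multipliers; the names and factors below are hypothetical assumptions.
# Step 1 (Chapter 3), identifying the processing operations and assessing
# Article 83(3) GDPR, is a legal qualification not captured numerically here.

def calculate_fine(starting_amount: float,
                   aggravating_mitigating_factor: float,
                   legal_maximum: float) -> float:
    """Hypothetical sketch of steps 2-5 of the EDPB methodology."""
    # Step 2 (Chapter 4): starting point, already reflecting the
    # classification of the infringement, its seriousness and the
    # undertaking's turnover.
    amount = starting_amount

    # Step 3 (Chapter 5): adjust for aggravating or mitigating
    # circumstances (a factor above 1 increases, below 1 decreases).
    amount *= aggravating_mitigating_factor

    # Step 4 (Chapter 6): cap at the relevant legal maximum under
    # Articles 83(4)-(6) GDPR.
    amount = min(amount, legal_maximum)

    # Step 5 (Chapter 7): a final effectiveness / proportionality /
    # dissuasiveness check may still adjust the amount, but never
    # above the legal maximum (not modelled here).
    return amount


# Example with made-up figures: EUR 500,000 starting amount, a 1.2
# aggravation factor, and a EUR 20 million legal maximum.
if __name__ == "__main__":
    print(calculate_fine(500_000, 1.2, 20_000_000))  # -> 600000.0
```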

These Guidelines and their methodology will remain under constant review by the EDPB.


Access the full list of the EDPB and WP29 Guidelines here, including consultation versions, now-current versions and redlines between versions.