A bit of Jyn Erso to wrap up the week!

New episode of K&L Gates Gateway to Privacy is out, and this time with our first external guest — our dear friend Arya Tripathy joins us with Whitney McCollum and Camille Scarparo for a deep dive into India’s new data protection law, the Digital Personal Data Protection Bill, 2023.

What’s to know, what’s to expect? Listen and find out!

Amid a surge in viral paid-for posts promoting dubious products and services, France has taken a significant step toward regulating influencer communications. Act no. 2023-451 (Influencers Act), which came into effect on 9 June 2023, aims not only to protect consumers but also to support influencers, in order to foster the healthy growth of this ecosystem. France is now the first European Union (EU) country to implement a thorough framework regulating commercial influence.

Background information

Digital influencers have changed the way companies can promote their products and services, from beauty and fashion to technology, notably by blurring the lines between commercial advertising and genuine consumer reviews.

Between 8 and 31 January 2023, the French Ministry of the Economy conducted a public consultation on the influencer ecosystem to evaluate the contemplated regulation, which received overwhelming support from the participating panels.

Key provisions bearing on influencers

General ban on certain communications

The following are explicitly banned from any influencer communication:

  • Cosmetic surgery and procedures;
  • Alternative therapeutic techniques;
  • Nicotine-based products;
  • Non-domestic animal trade;
  • Certain financial services, notably as they pertain to blockchain-based services (e.g., NFTs); and
  • Online gambling and betting.

With regard to the latter, communication remains possible provided that it occurs exclusively on platforms restricted to adults over the age of 18 and is subject to the usual specific disclaimers pertaining to the advertising of such services.

Mandatory labeling

The Influencers Act requires influencers to label:

  • Their promoted posts with the mention “advertisement” or “commercial collaboration” in a clear, legible and identifiable manner, to avoid falling under the misleading commercial practices regime of Art. L. 121-3 of the French Consumer Code (“FCC”).

Influencers failing to comply with this obligation face up to 300,000 euros in fines and up to two years of imprisonment (Art. 5 Influencers Act).

  • The pictures (still or moving) they post that have been:
    • edited to enlarge or refine the general appearance or modify the appearance of the model’s face, which must clearly include a “Retouched images” mention; or
    • generated through artificial intelligence (AI), notably generative AI (gen AI), which must clearly include a “Virtual image” disclaimer.

Influencers failing to comply with this obligation face up to 4,500 euros in fines and up to one year of imprisonment (Art. 5 Influencers Act).

Drop-shipping

In the case of sales of goods through a third party (so-called “drop-shipping” practices), influencers will need to comply with transparency obligations regarding the identity of the supplier, pursuant to Art. L. 221-5 of the FCC, and will bear liability for the legality and availability of the promoted products.

Content moderation and insurance

Influencers based outside the European Economic Area or Switzerland but directing their activities at a French audience are required to appoint a legal representative in the EU and to take out dedicated insurance covering potential damage resulting from their activities.

Key provisions bearing on platforms used by influencers

Further to the entry into force of European Regulation no. 2022/2065 on a Single Market for Digital Services (Digital Services Act or DSA) on 25 August 2023, the Influencers Act amended Act no. 2004-575 of 21 June 2004 for trust in the digital economy (Loi pour la Confiance dans l’Économie Numérique or LCEN), increasing the burden on digital platforms, notably those that allow influencers to conduct their activities.

These platforms now have the obligation to promptly remove any illegal content notified by the “trusted flaggers” introduced under Art. 22 DSA.

Key provisions bearing on brands

The Influencers Act now mandates a written contract between the influencer and the advertised brand, or their respective representatives. This contract, which must imperatively be subject to French law, must include:

  • The identity of the parties, including their domiciliation for tax purposes;
  • The detailed nature of the influence services;
  • The financial compensation or any equivalent advantage resulting from the influence services; and
  • As the case may be, any provision pertaining to intellectual property.

With regard to liability for the influence services, joint and several liability between the brand and the influencer has been implemented, rendering the brand de jure liable for any damage caused to third parties.

Enforcement of the Influencers Act

Just prior to the summer holidays, the French Ministry of the Economy appointed a team of 15 agents responsible for monitoring social networks and responding to complaints.

In parallel, the French Directorate General for Consumer Affairs, Competition and Fraud Prevention (“DGCCRF”) audited fifty influencers in the first quarter of 2023; 60% of the audited influencers were found in breach of the then-current (pre-Influencers Act) misleading commercial practices framework.

These findings led to eighteen injunctions to cease illicit practices and sixteen criminal reports. Against this backdrop, the DGCCRF published a code of conduct for influencers and content creators in July, explaining their duties and obligations in accessible language.

Whether you are a brand considering engaging the services of influencers or an influencer yourself, the K&L Gates Luxury Product & Fashion team remains at your disposal to assist you in complying with the new French framework.

First publication: K&L Gates Fashion Law Watch Blog in collaboration with Camille Scarparo.

Thrilled to share that I’ve been shortlisted for Privacy Leader: Legal for this year’s PICCASO (Privacy, InfoSec, Culture Change & Awareness Societal Organisation) Awards.

Grateful to the award committee for the recognition and to our K&L Gates #DataProtection team as a whole, a constant source of motivation and fun even in complex moments (especially in Europe cc Ulrike Elteste (Mahlmann) Noirin McFadden Andreas Müller Veronica Muratori Thomas Nietsch Camille Scarparo). Also psyched to be among such a roster of nominees, whether in this category or the others. Whoever gets awarded, it’ll always be a win for #privacy!

Looking forward to celebrating with you all in person in London!

On 14 June 2023, the European Parliament (Parliament) plenary voted on its position on the Artificial Intelligence Act (AI Act), which was adopted by a large majority, with 499 votes in favor, 28 against, and 93 abstentions. The newly adopted text (Parliament position) will serve as the Parliament’s negotiating position during the forthcoming interinstitutional negotiations (trilogues) with the Council of the European Union (Council) and the European Commission (Commission).

The members of Parliament (MEPs) proposed several changes to the Commission’s proposal, published on 21 April 2021, including expanding the list of high-risk uses and prohibited AI practices. Specific transparency and safety provisions were also added on foundation models and generative AI systems. MEPs also introduced a definition of AI that is aligned with the definition provided by the Organisation for Economic Co-operation and Development. In addition, the text reinforces natural persons’ (or their groups’) right to file a complaint about AI systems and receive explanations of decisions based on high-risk AI systems that significantly impact their fundamental rights.

Definition

The Parliament position provides that AI, or an AI System, should refer to “a machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions, that influence physical or virtual environments.” This amends the Commission’s proposal, under which an AI System was limited to software acting for human-defined objectives, and now encompasses metaverses through the explicit inclusion of “virtual environments.”

Agreement on the final version of the definition of AI is expected to be reached at the technical level during trilogue negotiations, as it appears to be a noncontentious item.

Another notable inclusion relates to foundation models (Foundation Models), which were not yet in the public eye when the Commission’s proposal was published and are defined as a subset of AI Systems trained on broad data at scale, designed for generality of output, and adaptable to a wide range of distinctive tasks.


Access the full list of the EDPB and WP29 Guidelines here, including consultation versions, now-current versions and redlines between versions.

In this episode, Claude Etienne Armingaud, Eleonora Curreri, and Camille Scarparo introduce a case regarding a U.S. company’s data privacy breach, the consequences a company established outside the EU may face for non-compliance with the GDPR, and the steps companies can take to prevent these situations.

First publication: K&L Gates Hub with Eleonora Curreri & Camille Scarparo

FW: Could you provide an overview of trends in global data flows? To what extent is the business world now unavoidably reliant on the ability to share information instantly over vast distances?

Armingaud: A global economy, with data as its fuel, means that globalised data is unavoidable. This tendency is driven in particular by more and more jurisdictions adopting rules on transfers of personal data. Cross-border data transfer trends can be roughly described as, on the one hand, a Western data protection trend, exemplified by the EU’s General Data Protection Regulation (GDPR), aimed at protecting personal data and restricting transfers, in particular by framing them contractually, and, on the other hand, an Eastern data protectionism trend, such as China’s Personal Information Protection Law (PIPL) and Indonesia’s data protection laws and regulations, aimed at generally restrictive data localisation requirements, which may be linked to a broader concept of data sovereignty.

FW: How would you characterise the risks and complexities involved in cross-border data transfers? Drilling down, what particular factors do organisations need to consider?

Armingaud: Risks pertaining to cross-border data transfers relate to regulatory compliance, ensuring that such transfers are valid in light of the lack of foreseeability since the Schrems II decision. Less obvious, but not negligible, is whether proper information is being given to data subjects regarding data transfers. The French Data Protection Authority (CNIL) recently suspended the use of cookies on such grounds. Organisations also need to consider onward transfers, which require end-to-end visibility by data exporters, and the risk of a shared or joint and several liability qualification under a joint controller relationship between the parties.

FW: How do regulations governing data transfers vary between jurisdictions? To what extent do these variances add additional layers of risk?

Armingaud: Both the Western and Eastern cross-border transfer restriction trends – data protection and data protectionism – are essentially opposed. This divergence of opinion over how to deal with personal data necessarily calls for more complex agreements – which is leading to frustration and incomprehension during negotiations on both sides – or to separate, regional templates, which may lead to potential discrepancies in warranties.

FW: How important is it for organisations to undertake a data transfer risk assessment (TRA)? What steps need to be taken when conducting a TRA to ensure it is effective, up to date and compliant with current regulatory requirements and privacy laws?

Armingaud: Pertaining to the accountability principle, a data transfer risk assessment is mandatory. To quote the European Data Protection Board (EDPB): “Knowing your transfers is an essential first step to fulfil your obligations under the principle of accountability.” Mapping a transfer requires the entity to perform a 360-degree overview of the process, asking and being able to answer questions on who, why, what, how and how long, from initial export to final import of the personal data.

FW: What kinds of tools, such as encryption and containerisation, may be used to protect privileged, sensitive or confidential information being transferred internationally?

Armingaud: To protect personal data, we need to make use of what is referred to under article 32 of the GDPR as technical and organisational measures (TOMs). These are not restricted to technical tools but also cover pure process. In that sense, annex II of the EC Implementing Decision (EU) 2021/914 of 4 June 2021 on standard contractual clauses for the transfer of personal data to third countries provides a set of process-type examples of TOMs, including ‘measures for ensuring data minimisation’, ‘measures for ensuring data quality’ and ‘measures for ensuring limited data retention’. Implementing TOMs requires the controller to carry out a proportionality test based on the underlying personal data and the processing operations. It is, however, sometimes easier, less time consuming and less expensive to set out a maximum level of TOMs regardless of the sensitivity of the processing.

FW: What essential advice would you offer to organisations on establishing an effective international data transfer solution that manages risk and provides an adequate level of protection?

Armingaud: If I were to offer only one word of advice, it would be to ‘document’. Data protection is less about what you are doing and more about why you are doing it. Being prepared and able to justify any action when processing data ensures that either you are doing it right or you have a justified and legitimate answer for it, as per the accountability principle.

FW: Given that the volume of data transferred around the world will only increase, do you expect the associated risks and regulatory regimes to intensify? What key issues are likely to dominate this issue over the coming years?

Armingaud: It is not so much that the volume is increasing, but the sensitivity of the underlying data. There is an increasing frustration within many countries arising from the perceived data wealth being funnelled to the US and generating less value in the country of origin. I would expect to see more data localisation requirements, so protecting individuals against foreign access will, for all intents and purposes, dictate the future evolution of regulations.

Read the full article on Financier Worldwide Magazine

On March 29, 2018, French President Emmanuel Macron announced his plan to turn France into a global leader in AI. This political leadership was subsequently translated into the Villani report on AI, highlighting autonomous vehicles (AVs) as a regulatory case study, and the Idrac report on AVs. Following these reports, the regulatory framework is currently being amended. This presentation will outline the key changes and how they will affect AV developments in France and in the EU.

More information on the Future of Transportation World Conference 2019 website.

More information on the K&L Gates website