Part IV of our series “Regulating AI: The Potential Impact of Global Regulation of Artificial Intelligence” will focus on recent developments in general availability of AI and how generative AI solutions are leading regulators, at a global level, to consider legal frameworks to protect both individuals affected by AI and digital sovereignty.

The program will feature a panel addressing the EU AI Act, on which a preliminary political agreement was reached last December and unanimously approved by the ambassadors of the 27 countries of the European Union on 2 February 2024, prior to its upcoming final votes.

Like the GDPR before it, the EU AI Act will be a trailblazing piece of legislation that will impact companies at a global level.

Our panelists will discuss the consequences of the EU AI Act on companies contemplating the provision of AI solutions in the EU market or leveraging AI in the EU, with a special focus on non-EU companies.

Additional topics in our Regulating AI — The Potential Impact of Global Regulation of Artificial Intelligence series include:  

  • Part I – 13 September 2023 (EU / U.K.) – View Recording
  • Part II – 7 December 2023 (Asia-Pacific Region: China, Hong Kong, Singapore, Japan) – View Recording
  • Part III – 12 December 2023 (United States)

Register here.

Join our session as we explore the implications of the EU AI Act. In this webinar, we’ll:

  • Break down the four levels of AI risk under the AI Act
  • Discuss legal requirements for deployers and providers of AI systems
  • Provide a playbook for deployers and providers to accelerate EU AI Act compliance

Featured speakers

Yücel Hamzaoğlu​

Partner
HHK Legal

Melike Hamzaoğlu

Partner
HHK Legal

Claude-Étienne Armingaud​

Partner
KL Gates

Noshin Khan​

Ethics & Compliance, Associate Director
OneTrust​

Harry Chambers

Senior Privacy Analyst
OneTrust

Register here.

Quoted in Agenda article “New EU AI Rules Will Have Global Impact”:

The scope of the EU AI Act will apply to all companies whose AI systems are used or affect EU-based individuals, according to Claude-Etienne Armingaud, a partner in K&L Gates’ Paris office and a member of the law firm’s technology transactions and sourcing practice group.

Due to its breadth, global companies developing AI systems, most of which are headquartered either in the U.S. or in China, will face two options: “Get in line with the EU AI Act or abstain from the EU market,” Armingaud said.

Some companies threatened to exit the European market after the EU’s General Data Protection Regulation, or GDPR, became effective in 2018, but many didn’t actually follow through, according to Armingaud.

“So, without a doubt, all companies dabbling in AI will need to comply if they truly want to remain global,” he said.

Agenda – New EU AI Rules Will Have Global Impact

It has already been some time since the EU Digital Services Act (Regulation 2022/2065, DSA) was published, and since then, discussions about Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) have dominated the media coverage (see the European Commission’s initial press release here and coverage of VLOPs’/VLOSEs’ petitions against their categorization here and here).

Smaller online service providers tend to forget that they, too, may face new obligations under the DSA from 17 February 2024 onwards, but they would be well advised to comply in order to avoid significant sanctions (e.g., fines of up to 6% of the global annual turnover or periodic penalty payments of up to 5% of the global average daily turnover).

The following paragraphs provide a brief summary of the most relevant content of the DSA and will help online service providers to understand:

  • If and to what extent the DSA applies to them;
  • What specific obligations exist; and
  • What sanctions may be applied in case of breach.

Preface: The DSA – What Was That Again?

The DSA replaces or takes over parts of the EU Directive 2000/31/EC (eCommerce Directive), namely its provisions on liability for third-party content. Due to massive developments in the digital field, the evolving use of online services, and new players entering the market or gaining relevance for our societies and lives, the European Union decided it was time to put in place a more contemporary set of rules taking these new developments, services, and stakeholders into account. Some of the provisions under the eCommerce Directive will continue to apply. Others were more or less copied and pasted from the eCommerce Directive (liability for third-party content). However, the DSA also places a strong focus on transparency, due diligence, and general fairness principles, which results in a broad set of additional obligations with which every covered online service provider needs to comply. As the need for transparency and fairness varies depending on the type of service provider, the DSA follows a tiered approach:

  • All online intermediary services are subject to basic due diligence obligations;
  • Hosting service providers and online search engines need to comply with additional obligations; and
  • Online platforms are subject to another set of additional rules.

For a more detailed analysis of the respective undertakings for each tier, please refer to our comprehensive publication on the DSA available here.

While certain provisions for VLOPs and VLOSEs have already applied since November 2023, the broad set of obligations for all online intermediary services will apply from 17 February 2024 – about time to check again whether you are well prepared.

Sounds a bit chaotic? Agreed, but we will try to shed some light on the new obligations and who needs to do what.

Does the DSA Apply to My Company?

The DSA applies to all providers of online intermediary services, which are defined as essentially all of the below service categories offered to users (and not merely “consumers”) with a habitual residence in the EU.[1] A seat or establishment of the service provider in the European Union is expressly not required.

Mere Conduit

(i.e., the transmission of information provided by a user to a communication network or granting a user access to such a communication network, e.g., the internet);

Caching

(i.e., the automated temporary storage of user information for the purpose of transmitting the information to other users faster on their request);

Hosting

(i.e., the storage of information provided by a user and at the user’s request), including:

Online Platforms

(i.e., a hosting service storing information and disseminating the information to the public on request by a user), including:

Very Large Online Platforms

(i.e., Online Platforms with an average of at least 45 million monthly active users in the EU);

Online Search Engines

(i.e., the intermediary service enabling users to enter a search request and to run searches on the entire internet for any topic and receive a result list in any format),[2] including:

Very Large Online Search Engines

(i.e., Online Search Engines with an average of at least 45 million monthly active users in the EU).
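The tiered categorization above can be sketched as a small classifier. This is purely an illustrative sketch of the logic described in this section (field and function names are our own, not DSA terminology); the 45 million figure is the DSA's VLOP/VLOSE threshold, and each tier adds obligations on top of the more general ones.

```python
# Illustrative sketch only: mapping a service's characteristics to the DSA
# tiers described above. Names and structure are assumptions for clarity.
from dataclasses import dataclass

VLOP_THRESHOLD = 45_000_000  # average monthly active users in the EU


@dataclass
class Service:
    stores_user_content: bool        # hosting service
    disseminates_to_public: bool     # online platform (subset of hosting)
    is_search_engine: bool
    avg_monthly_active_eu_users: int


def applicable_tiers(s: Service) -> list[str]:
    # Basic due diligence obligations apply to every intermediary service.
    tiers = ["intermediary"]
    if s.is_search_engine:
        tiers.append("online search engine")
        if s.avg_monthly_active_eu_users >= VLOP_THRESHOLD:
            tiers.append("VLOSE")
        return tiers
    if s.stores_user_content:
        tiers.append("hosting")
        if s.disseminates_to_public:
            tiers.append("online platform")
            if s.avg_monthly_active_eu_users >= VLOP_THRESHOLD:
                tiers.append("VLOP")
    return tiers
```

A hosting provider that also disseminates user content to the public thus accumulates the intermediary, hosting, and online platform obligation sets, plus the VLOP regime once the user threshold is crossed.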

Okay, Got It – But What Do I Need to Do?

As already pointed out, the specific obligations for online intermediary services depend on which of the above-mentioned services your company provides. Here are the most relevant (albeit nonexhaustive) obligations for each service provider category applying from 17 February 2024, in a nutshell:

General Due Diligence and Transparency Obligations Applying to All Intermediary Services (Including Online Platforms, Hosting, Caching and Mere Conduit Services, Online Search Engines):

Point of Contact (Art 11, 12)

Make single points of contact for authorities and users available to the public, enabling users to communicate easily in electronic form (excluding solely automated tools like chatbots) and to choose the means of communication.

Legal Representative (Art. 13)

Appoint a legal representative who is authorized to respond to enquiries on behalf of the service provider in one EU member state where the services are offered (only required if the service provider has no EU establishment).

Terms and Conditions (Art. 14)

The restrictions on the use of the service in respect of illegal content provided by users must be set out in publicly available terms and conditions, including information regarding the measures applied to block illegal content and the procedural rules for complaint handling.

Transparency Report (Art. 15)

Publish publicly accessible, easily comprehensible, and machine-readable reports on the measures taken to respond to illegal content at least once a year.

Additional Obligations for Hosting Service Providers (Including Online Platforms, but Apparently Not for Online Search Engines)

Notice and Action Mechanism (Art. 16)

Put in place easily accessible and user-friendly mechanisms to allow persons to notify the hosting provider of illegal content and duly handle such notices (where receipt of such a justified notice may give rise to liability for such illegal content pursuant to Art. 6).

Statement of Reasons (Art. 17)

Provide a specific statement of reasons to users that are subject to service restrictions, such as blocking of content, suspension or termination of services, or similar measures in the context of the provision of illegal content, including information about applicable redress mechanisms.

Notification of Criminal Offenses (Art. 18)

Upon suspicion of a criminal offense against the life or safety of persons, report to competent law enforcement authorities.

Additional Obligations Only for Online Platforms (Not Applicable to Micro and Small Enterprises[3])

Complaint Handling System (Art. 20)

Inform persons submitting take-down notices through the notice and action mechanism about the results of their notice by way of an effective internal complaint-handling mechanism, and handle any incoming complaints in a timely, nondiscriminatory manner by qualified personnel.

Out-of-Court Dispute Settlement (Art. 21)

Offer persons who have submitted take-down notices through the notice and action mechanism and who are subject to a decision under Art. 20 the option to refer any dispute related to that decision to an out-of-court settlement body certified by an EU member state, and engage in the dispute resolution process before such body.

Trusted Flaggers (Art. 22)

Notices by trusted flaggers (status to be awarded by EU member states) in their respective field of expertise must be handled with priority.

Suspension of Services in Cases of Misuse (Art. 23)

Upon prior warning, temporarily suspend access to services for users who have frequently and manifestly provided illegal content or have issued frequently and manifestly unfounded notifications under Art. 16 and 20.

Transparency Reporting (Art. 24)

In addition to the reports under Art. 15, Online Platform providers need to include information regarding out-of-court dispute settlement proceedings (Art. 21) and service suspensions (Art. 23) in reports published every six months. Starting 17 February 2024, Online Platform providers must also publish and submit to the competent member state authority information about their average number of monthly active users in the European Union, in order to assess whether they qualify as a VLOP.

Online Interface Design (Art. 25)

Websites and other online interfaces must be designed in a non-manipulative and non-deceiving manner.

Advertising (Art. 26)

When presenting ads to users of an Online Platform, the provider must inform the user in real time that the content is an advertisement, identify the person on whose behalf the ad is presented or who has paid for it, and disclose the main criteria based on which the presented ad was selected.

Recommender Systems (Art. 27)

Online Platforms using tools to recommend specific content to users need to inform users about the main parameters for the content selection and how these can be changed or influenced.

Protection of Minors (Art. 28)

Online Platforms accessible to minors must put in place proportionate measures to ensure a high level of privacy, safety, and security of minors when using the service; advertising based on profiling (as defined by the General Data Protection Regulation) may not be presented where the provider is aware with reasonable certainty that the user is a minor.

Additional Obligations for Online Platforms Enabling Consumers to Enter Into Distance Sales Contracts with Traders (Not Applicable to Micro and Small Enterprises):

Traceability of Traders (Art. 30)

Prior to allowing the use of their services, providers need to collect certain information from traders communicating with or offering goods or services to EU users on the Online Platform (the legal name, contact details, a copy of an identification document, payment account details, trade register data, and a self-certification to offer only products compliant with applicable EU laws) and verify such information against freely accessible sources.

Compliance by Design (Art. 31)

The Online Platform needs to be designed in a manner that enables traders to comply with their statutory information, compliance, and product safety duties. Providers must also assess, applying best efforts, whether traders have complied with their respective obligations before permitting them to offer goods on the Online Platform (including random checks in official, freely accessible, and machine-readable online databases or online interfaces as to whether the products or services offered by traders have been identified as illegal).

Notification Obligation (Art. 32)

If an Online Platform provider becomes aware that a certain product sold via its Online Platform is illegal, the identified purchasers of this product must be informed of the illegality of the product, the identity of the trader, and any available means of redress.

As regards the transparency reporting obligations on content moderation applicable to intermediary service providers, the European Commission has published a draft act setting out the mandatory templates for these transparency reports here, on which feedback and comments can be submitted until 24 January 2024.

VLOPs and VLOSEs (Art. 33–43)

As of this date, only very few Online Platforms and search engines have been identified by the European Commission as VLOPs and VLOSEs, and we thus refrain from providing further information in this regard here.

Ouch – But What If I Do Not Comply?

The competent supervisory authorities, to be designated by each EU member state by 17 February 2024 (Digital Services Coordinators), have investigative and corrective powers, including the power to impose administrative fines for breaches of the obligations under the DSA of up to 6% of the annual worldwide turnover of the provider of the intermediary service, as well as periodic penalty payments of up to 5% of the average daily worldwide turnover. The experience with fines under the EU General Data Protection Regulation, which established a similar legal regime, indicates that, after an initial period of uncertainty, fines in the five- to seven-digit range may be realistic for smaller and medium-sized enterprises. Whether fines under the DSA will develop similarly remains, of course, to be seen.
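To make the sanction ceilings above concrete, here is a minimal arithmetic sketch. The percentages are those stated in this section; the function names, the 365-day averaging, and the sample turnover figure are our own illustrative assumptions, and actual fines would of course be set case by case well below these ceilings.

```python
# Illustrative sketch of the DSA sanction ceilings described above:
# up to 6% of annual worldwide turnover (administrative fine) and up to
# 5% of average daily worldwide turnover (periodic penalty payment).

def max_dsa_fine(annual_worldwide_turnover: float) -> float:
    """Upper bound for an administrative fine under the DSA."""
    return 0.06 * annual_worldwide_turnover


def max_daily_penalty(annual_worldwide_turnover: float) -> float:
    """Upper bound for a periodic penalty payment, per day
    (assuming a simple 365-day average for daily turnover)."""
    avg_daily_turnover = annual_worldwide_turnover / 365
    return 0.05 * avg_daily_turnover


# Hypothetical provider with EUR 500 million annual worldwide turnover:
fine_cap = max_dsa_fine(500_000_000)       # ceiling of EUR 30 million
daily_cap = max_daily_penalty(500_000_000)  # roughly EUR 68,500 per day
```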

Our EU Data Protection and IT team is available to assist you in preparing your compliance with the DSA. We are an international law firm with European offices in Belgium, France, Luxembourg, Germany, Italy, and the United Kingdom. Our lawyers regularly advise on technology and media law, privacy law, consumer protection and product safety laws, and antitrust law.

First Publication: K&L Gates Hub with Thomas Nietsch, Giovanni Campi, Veronica Muratori & Andreas Müller

References
1 An “offering” of such services shall require either a material number of users in the European Union or any of its member states in relation to its population or an active orientation of the services to the European Union or its member states.
2 While the definition of “intermediary service” does not expressly mention search engines, the definition of “online search engines” includes them as intermediary services.
3 Micro enterprises have fewer than 10 employees and an annual turnover or balance sheet below €2 million; small enterprises have fewer than 50 employees and an annual turnover or balance sheet below €10 million.

On 18 October 2023, the French National Assembly voted in favour of a law aiming to secure and regulate the digital space (“Loi visant à sécuriser et réguler l’espace numérique” or “SREN”), otherwise called the “Sorare Act.” This new development marks a first step towards the establishment of a regulatory framework dedicated to games integrating non-fungible tokens (NFTs) and monetisation models based on digital assets.

These new provisions are aimed at the creation of a new category of games under French law called games with monetisable digital objects (“jeux à objets numériques monétisables” or “JONUM”). This new regime will enter into force ‘on an experimental basis and for a period of three years’ from the promulgation of the law and will authorise Web3 games with monetisable digital objects (including NFTs).

The Sorare Act defines JONUMs as “game elements, which only confer on players one or more rights associated with the game, and which may be transferred, directly or indirectly, for consideration to third parties,” while excluding digital assets covered by 2° of Article L. 54-10-1 of the French Monetary and Financial Code.

France is one of the first jurisdictions in the world to create a specific regime for companies using NFTs as part of their games, and the objective is to provide certainty to the industry.

Please reach out to our team if you need further information on this new development. 

First publication: K&L Gates Hub, in collaboration with Lucas Nicolet-Serra

Amidst a sudden increase in paid-for posts going viral for dubious products and services, France has taken a significant step toward the regulation of influencer communication. Act no. 2023-451 (Influencers Act), which came into effect on 9 June 2023, aims not only to protect consumers but also to support influencers, in order to foster the healthy growth of this ecosystem. France is now the first European Union (EU) country to implement a thorough framework regulating commercial influence.

Background information

Digital influencers have changed the way companies can promote their products and services, from beauty and fashion to technology, notably by blurring the lines between commercial advertising and genuine consumer reviews.

Between 8 and 31 January 2023, the French Ministry of the Economy conducted a public consultation on the influencer ecosystem to evaluate the contemplated regulation, which received overwhelming support from the panels.

Key provisions bearing on influencers

General ban on certain communications

The following communications are explicitly banned from any influencer communication:

  • Cosmetic surgery and procedures;
  • Alternative therapeutic techniques;
  • Nicotine-based products;
  • Non-domestic animal trade;
  • Certain financial services, notably as they pertain to blockchain-based services (e.g., NFTs); and
  • Online gambling and betting.

With regard to the latter, the communication remains possible provided that it occurs exclusively on platforms restricted to adults over the age of 18 and subject to the usual specific disclaimer pertaining to the advertising of such services.

Mandatory labeling

The Influencers Act requires influencers to label:

  • Their promoted posts, with the mention “advertisement” or “commercial collaboration” in a clear, legible, and identifiable manner, to avoid falling under misleading commercial practices further to Art. L. 121-3 of the French Consumer Code (“FCC”).

Influencers failing to comply with this obligation face up to 300,000 euros in fines and up to two years of imprisonment (Art. 5 Influencers Act).

  • The pictures (still or moving) they post which have been:
    • edited to enlarge or refine the general appearance or modify the appearance of the model’s face, which must clearly include the “Retouched images” mention; or
    • generated through artificial intelligence (AI), notably generative AI (gen AI), which must clearly include a “Virtual image” disclaimer.

Influencers failing to comply with this obligation face up to 4,500 euros in fines and up to one year of imprisonment (Art. 5 Influencers Act).

Drop-shipping

In case of sales of goods through a third party (so-called “drop-shipping” practices), influencers will need to abide by obligations of transparency about the identity of the supplier, pursuant to Art. L. 221-5 of the FCC and will bear the liability relating to the legality and availability of the promoted products.

Content moderation and insurance

Influencers based outside of the European Economic Area or Switzerland but directing their activities to a French audience are required to appoint a legal representative in the EU, as well as to subscribe to a dedicated insurance covering the potential damage resulting from their activities.

Key provisions bearing on platforms used by influencers

Further to the entry into force of European Regulation no. 2022/2065 on a Single Market for Digital Services (Digital Services Act or DSA) on 25 August 2023, the Influencers Act amended Act no. 2004-575 of 21 June 2004 for trust in the digital economy (Loi pour la Confiance dans l’Économie Numérique or LCEN), increasing the burden on digital platforms, notably those platforms which allow influencers to conduct their activities.

These platforms now have the obligation to promptly remove any illegal content notified by the “trusted flaggers” introduced under Art. 22 DSA.

Key provisions bearing on brands

The Influencers Act now mandates a written contract between the influencer and the advertised brands, or their respective representatives. This contract, which must imperatively be subject to French law, must include:

  • The identity of the parties, including their domiciliation for tax purposes;
  • The detailed nature of the influence services;
  • The financial compensation or any equivalent advantage resulting from the influence services; and
  • As the case may be, any provision pertaining to intellectual property.

With regard to liability for the influence services, joint and several liability between the brand and the influencer has been implemented, rendering the brand de jure liable for any damage caused to third parties.

Enforcement of the Influencers Act

Just prior to the summer holidays, the French Ministry of the Economy appointed a team of 15 agents responsible for monitoring social networks and responding to complaints.

In parallel, the French Directorate General for Consumer Affairs, Competition and Fraud Prevention (“DGCCRF”) audited fifty influencers in the first quarter of 2023, which resulted in 60% of the audited influencers being found in breach of the then-current (and pre-Influencers Act) misleading commercial practices framework.

These findings led to eighteen injunctions to cease illicit practices and sixteen criminal reports. Against this background, the DGCCRF published a code of conduct for influencers and content creators in July, explaining their duties and obligations in accessible language.

Whether you are a brand considering hiring the services of influencers or an influencer yourself, the K&L Gates Luxury Product & Fashion team remains at your disposal to assist you in your compliance with the new French framework.

First publication: K&L Gates Fashion Law Watch Blog in collaboration with Camille Scarparo.

August may be perceived as the month when France shuts down for the summer. Yet, just before the summer ’23 holiday, the French Data Protection Authority (“CNIL”) published several calls to action for the various players of the data ecosystem in general and in artificial intelligence (AI) in particular, following its 16 May 2023 announcement of an AI action plan:

  • Opening and re-use of publicly accessible data – The CNIL published draft guidance on such data usage, and all stakeholders are invited to weigh in until 15 October 2023 before its finalization. While non-binding, this guidance is expected to lead the way on how EU Supervisory Authorities will apprehend and enforce the General Data Protection Regulation (“GDPR”) when personal data is scraped from online sources and subsequently re-used. It notably focuses on Art. 14 GDPR, the indirect collection of personal data, and specific prior information requirements. Artificial intelligence is explicitly mentioned by the CNIL in the draft, as such data, which feeds large language models, “undeniably contributes to the development of the digital economy and is at the core of artificial intelligence.” Stakeholders are invited to submit their observations online through the dedicated portal.
  • Artificial Intelligence Sandbox – Following in the footsteps of its connected cameras, EdTech & eHealth initiatives, the CNIL is launching an AI sandbox call for projects, where stakeholders involved in AI in connection with public services may apply to receive dedicated assistance by the regulator to co-construct AI systems complying with data protection and privacy rules.
  • Creation of databases for Artificial Intelligence uses – Open to the broadest possible array of stakeholders (including individuals), this call for contributions notably addresses the specific issue relating to the use of publicly accessible data and aims at informing the CNIL of the various positions at play and how to balance the GDPR’s requirements (information, legitimate interests, exercise of rights) with data subjects’ expectations. Stakeholders are invited to submit their observations online through the dedicated form (in French; our free English translation is available below). No deadline for submission has been set.

On 14 June 2023, the European Parliament (Parliament) plenary voted on its position on the Artificial Intelligence Act (AI Act), which was adopted by a large majority, with 499 votes in favor, 28 against, and 93 abstentions. The newly adopted text (Parliament position) will serve as the Parliament’s negotiating position during the forthcoming interinstitutional negotiations (trilogues) with the Council of the European Union (Council) and the European Commission (Commission).

The members of Parliament (MEPs) proposed several changes to the Commission’s proposal, published on 21 April 2021, including expanding the list of high-risk uses and prohibited AI practices. Specific transparency and safety provisions were also added on foundation models and generative AI systems. MEPs also introduced a definition of AI that is aligned with the definition provided by the Organisation for Economic Co-operation and Development. In addition, the text reinforces natural persons’ (or their groups’) right to file a complaint about AI systems and receive explanations of decisions based on high-risk AI systems that significantly impact their fundamental rights.

Definition

The Parliament position provides that AI, or an AI System, should refer to “a machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions, that influence physical or virtual environments.” This amends the Commission’s proposal, where an AI System was solely limited to software acting for human-defined objectives and now encompasses the metaverses through the explicit inclusion of “virtual environments.”

Agreement on the final version of the definition of AI is expected to be found at the technical level during trilogue negotiations, as it does appear to be a noncontentious item.

Another notable inclusion relates to foundation models (Foundation Models), which were not yet in the public eye when the Commission’s proposal was published and which were defined as a subset of AI System “trained on broad data at scale, is designed for generality of output, and can be adapted to a wide range of distinctive tasks.”


On 27 October 2022, the Digital Services Act (DSA) was published in the EU Official Journal as Regulation (EU) 2022/2065, with the aim to fully harmonize the rules on the safety of online services and the dissemination of illegal content online. The Digital Services Act will require online intermediaries to amend their terms of service, to better handle complaints, and to increase their transparency, especially with respect to advertising.
