Part IV of our series “Regulating AI: The Potential Impact of Global Regulation of Artificial Intelligence” will focus on recent developments in the general availability of AI and on how generative AI solutions are leading regulators, at a global level, to consider legal frameworks to protect both individuals affected by AI and digital sovereignty.

The program will feature a panel addressing the EU AI Act, on which a preliminary political agreement was reached last December and which was unanimously approved by the ambassadors of the 27 member states of the European Union on 2 February 2024, ahead of its upcoming final votes.

Like the GDPR before it, the EU AI Act will be a trailblazing piece of legislation that will impact companies at a global level.

Our panelists will discuss the consequences of the EU AI Act on companies contemplating the provision of AI solutions in the EU market or leveraging AI in the EU, with a special focus on non-EU companies.

Additional topics in our Regulating AI — The Potential Impact of Global Regulation of Artificial Intelligence series include:  

  • Part I – 13 September 2023 (EU / U.K.) – View Recording
  • Part II – 7 December 2023 (Asia-Pacific Region: China, Hong Kong, Singapore, Japan) – View Recording
  • Part III – 12 December 2023 (United States)

Register or watch the replay here.

The Information Commissioner’s Office (ICO) recently launched a consultation series on how data protection laws should apply to the development and use of generative AI models (“Gen AI”). In the coming months, the ICO will publish further views on how to interpret specific requirements of UK GDPR and Part 2 of the DPA 2018 in relation to Gen AI. This first part of the consultation focusses on whether it is lawful to train Gen AI on personal data scraped from the web. The consultation seeks feedback from stakeholders with an interest in Gen AI.

As outlined by the ICO, web scraping will involve the collection and processing of personal data, which may not have been placed online directly by the data subjects themselves. To comply with the UK GDPR, Gen AI developers would need to ensure there is a valid lawful basis for their processing under UK GDPR, as well as comply with the relevant information requirements pertaining to indirect personal data collection.

For the first part of the consultation series, the ICO published a policy position on the lawful basis for training Gen AI models on web-scraped data, which can be found here. More specifically, this consultation focusses on the ‘legitimate interest’ lawful basis under art. 6(1)(f) UK GDPR and the ‘three-part’ test that a data controller must pass to meet the legitimate interest basis (a so-called Legitimate Interest Assessment). The ICO has considered various actions that Gen AI developers could take to meet this three-part legitimate interest test to guarantee that the collection of training data through web scraping, i.e. the processing of data, is compliant with the principles of the UK GDPR. The ICO would now like to hear from relevant stakeholders on their views of the proposed regulatory approach and the impact it would have on their organisation. A link to the survey can be found here.

The deadline to submit a response is 1 March 2024.

First publication: K&L Gates Cyber Law Watch blog with Sophie Verstraeten

Join our session as we explore the implications of the EU AI Act in this webinar.

Featured speakers

Yücel Hamzaoğlu

Partner
HHK Legal

Melike Hamzaoğlu

Partner
HHK Legal

Claude-Étienne Armingaud

Partner
K&L Gates

Noshin Khan

Ethics & Compliance, Associate Director
OneTrust

Harry Chambers

Senior Privacy Analyst
OneTrust

Register here.

Quoted in Agenda article “New EU AI Rules Will Have Global Impact”:

The scope of the EU AI Act will apply to all companies whose AI systems are used or affect EU-based individuals, according to Claude-Etienne Armingaud, a partner in K&L Gates’ Paris office and a member of the law firm’s technology transactions and sourcing practice group.

Due to its breadth, global companies developing AI systems, most of which are headquartered either in the U.S. or in China, will face two options: “Get in line with the EU AI Act or abstain from the EU market,” Armingaud said.

Some companies threatened to exit the European market after the EU’s General Data Protection Regulation, or GDPR, became effective in 2018, but many didn’t actually follow through, according to Armingaud.

“So, without a doubt, all companies dabbling in AI will need to comply if they truly want to remain global,” he said.

Agenda – New EU AI Rules Will Have Global Impact

It has been some time since the EU Digital Services Act (Regulation 2022/2065, DSA) was published, and since then, discussions about Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) have dominated the media coverage (see the initial press release of the European Commission here and coverage of petitions by VLOPs/VLOSEs against their categorization as such here and here).

Smaller online service providers tend to forget that they, too, may face new obligations under the DSA from 17 February 2024 onwards, but they would be well advised to comply in order to avoid significant sanctions (e.g., fines of up to 6% of the global annual turnover or periodic penalty payments of up to 5% of the global average daily turnover).

The following paragraphs provide a brief summary of the most relevant content of the DSA and will help online service providers to understand:

  • If and to what extent the DSA applies to them;
  • What specific obligations exist; and
  • What sanctions may be applied in case of breach.

Preface: The DSA – What Was That Again?

The DSA replaces or takes over parts of the EU Directive 2000/31/EC (eCommerce Directive), namely its provisions on liability for third-party content. Due to massive developments in the digital field, the evolving use of online services, and new players entering the market or gaining relevance for our societies and lives, the European Union decided it was time to put in place a more contemporary set of rules taking into account these new developments, services, and stakeholders. Some of the provisions under the eCommerce Directive will continue to apply. Others were more or less copied and pasted from the eCommerce Directive (liability for third-party content). However, the DSA also places a strong focus on transparency, due diligence, and general fairness principles, which results in a broad set of additional obligations with which every covered online service provider needs to comply. As the need for transparency and fairness varies depending on the types of service providers, the DSA follows a tiered approach:

  • All online intermediary services are subject to basic due diligence obligations;
  • Hosting service providers and online search engines need to comply with additional obligations; and
  • Online platforms are subject to another set of additional rules.

For a more detailed analysis of the respective undertakings for each tier, please refer to our comprehensive publication on the DSA available here.

While certain provisions for VLOPs and VLOSEs have already applied since November 2023, the broad set of obligations for the basic tier of online intermediary services will apply from 17 February 2024—about time to check again whether you are well prepared.

Sounds a bit chaotic? Agreed, but we will try to shed some light on the new obligations and on who needs to do what.

Does the DSA Apply to My Company?

The DSA applies to all providers of online intermediary services, which are defined as basically all of the below service categories offered to users (and not merely “consumers”) with a habitual residence in the EU.[1] A seat or establishment of the service provider in the European Union is expressly not required.

Mere Conduit 

(i.e., the transmission of information provided by a user to a communication network or granting a user access to such a communication network, e.g., the internet);

Caching

 (i.e., the automated temporary storage of user information for the purpose of transmitting the information to other users more quickly at their request);

Hosting

 (i.e., the storage of information provided by a user and at the user’s request):

Online Platforms

(i.e., a hosting service storing information and disseminating the information to the public on request by a user);

Very Large Online Platforms 

(i.e., Online Platforms with an average of at least 45 million monthly active users);


Online Search Engine

(i.e., the intermediary service enabling users to enter a search request and to run searches across the entire internet on any topic and receive a result list in any format)[2]:

Very Large Online Search Engines

(i.e., Online Search Engines with an average of at least 45 million monthly active users).

Okay, Got It—But What Do I Need to Do?

As already pointed out, the specific obligations for online intermediary services depend on which of the above-mentioned services your company provides. Here are the most relevant (albeit nonexhaustive) obligations for each service provider category applying from 17 February 2024, in a nutshell:

General Due Diligence and Transparency Obligations Applying to All Intermediary Services (Including Online Platforms, Hosting, Caching and Mere Conduit Services, Online Search Engines):

Point of Contact (Art. 11, 12)

Make single points of contact for authorities and users available to the public, enabling users to communicate easily in electronic form (excluding solely automated tools like chatbots) and to choose the means of communication.

Legal Representative (Art. 13)

Appoint a legal representative who is authorized to respond to enquiries on behalf of the service provider in one EU member state where the services are offered (only if the service provider has no EU establishment).

Terms and Conditions (Art. 14)

The restrictions on the use of the service in respect of illegal content provided by users must be set out in publicly available terms and conditions, including information regarding measures applied to block illegal content and procedural rules for complaint handling.

Transparency Report (Art. 15)

Publication of publicly accessible, easily comprehensible, and machine-readable reports on measures taken to respond to illegal content at least once a year.

Additional Obligations for Hosting Service Providers (Including Online Platforms, but Apparently Not for Online Search Engines)

Notice and Action Mechanism (Art. 16)

Put in place easily accessible and user-friendly mechanisms to allow persons to notify the hosting provider of illegal content and duly handle such notices (where receipt of such a justified notice may give rise to liability for the illegal content pursuant to Art. 6).

Statement of Reasons (Art. 17)

Provide a specific statement of reasons to users who are subject to service restrictions such as blocking of content, suspension or termination of services, or similar measures in the context of the provision of illegal content, including information about applicable redress mechanisms.

Notification of Criminal Offenses (Art. 18)

Upon suspicion of a criminal offense against the life or safety of persons, report it to the competent law enforcement authorities.

Additional Obligations Only for Online Platforms (Not Applicable to Micro and Small Enterprises[3])

Complaint Handling System (Art. 20)

Inform persons submitting take-down notices through the notice and action mechanism about the results of their notice by way of an effective internal complaint-handling mechanism, and handle any incoming complaints in a timely, nondiscriminatory manner by qualified personnel.

Out-of-Court Dispute Settlement (Art. 21)

Offer persons who have submitted take-down notices through the notice and action mechanism and who are subject to a decision under Art. 20 access to an out-of-court settlement body certified by an EU member state to resolve any dispute related to the decision, and engage in the dispute resolution process before such body.

Trusted Flaggers (Art. 22)

Notices by trusted flaggers (status to be awarded by EU member states) in their respective field of expertise must be handled with priority.

Suspension of Services in Cases of Misuse (Art. 23)

Upon prior warning, temporarily suspend access to services for users who have frequently and manifestly provided illegal content or have frequently issued manifestly unfounded notifications under Art. 16 and 20.

Transparency Reporting (Art. 24)

In addition to the reporting under Art. 15, Online Platform providers need to include in their reports information regarding out-of-court dispute settlement proceedings (Art. 21) and service suspensions (Art. 23). Starting 17 February 2024, Online Platform providers must also publish every six months, and submit to the competent member state authority, information about their average number of monthly active users in the European Union to assess whether they qualify as a VLOP.

Online Interface Design (Art. 25)

Websites and other online interfaces must be designed in a non-manipulative and non-deceptive manner.

Advertising (Art. 26)

When presenting ads to users of an Online Platform, the provider must, in real time, inform the user that the content is advertising, identify the person on whose behalf the ad is presented or who has paid for it, and indicate the main criteria based on which the presented ad was selected.

Recommender Systems (Art. 27)

Online Platforms using tools to recommend specific content to users need to inform users about the main parameters for the content selection and how these can be changed or influenced.

Protection of Minors (Art. 28)

Online Platforms accessible to minors must put in place proportionate measures to ensure a high level of privacy, safety, and security of minors when using the service; advertising based on profiling (as defined by the General Data Protection Regulation) may not be presented where the data to build the profile does, with reasonable certainty, relate to a minor.

Additional Obligations for Online Platforms Enabling Consumers to Enter Into Distance Sales Contracts with Traders (Not Applicable to Micro and Small Enterprises):

Traceability of Traders (Art. 30)

Prior to allowing the use of the services, providers need to collect certain information from traders communicating with or offering goods or services to EU users on the Online Platform (the trader’s legal name, contact details, a copy of an identification document, payment account details, trade register data, and a self-certification to offer only products compliant with applicable EU laws) and verify such information against accessible sources.

Compliance by Design (Art. 31)

The Online Platform needs to be designed in a manner that enables traders to comply with their statutory information, compliance, and product safety information duties, and the provider must assess, applying best efforts, whether traders have complied with their respective obligations prior to permitting them to offer goods on the Online Platform (including random checks in official, freely accessible, and machine-readable online databases or online interfaces to determine whether the products or services offered by traders have been identified as illegal).

Notification Obligation (Art. 32)

If an Online Platform provider becomes aware that a certain product sold via its Online Platform is illegal, the identified purchasers of this product must be notified that they purchased an illegal product and informed of the identity of the trader and of any available means of redress.

As regards the transparency reporting obligations on content moderation applicable to intermediary service providers, the European Commission has published a draft act setting out the mandatory templates for these transparency reports here, on which feedback and comments can be submitted until 24 January 2024.

VLOPs and VLOSEs (Art. 33–43)

To date, only very few Online Platforms and search engines have been identified by the European Commission as VLOPs and VLOSEs, and we thus refrain from providing further information in this regard here.

Ouch—But What If I Do Not Comply?

The competent supervisory authorities to be designated by each of the EU member states by 17 February 2024 (Digital Services Coordinators) have investigative and corrective powers, including the power to impose administrative fines of up to 6% of the annual worldwide turnover of the provider of the intermediary service in case of breach of the obligations under the DSA, as well as periodic penalty payments of up to 5% of the average daily worldwide turnover. The experience with fines under the EU General Data Protection Regulation, which established a similar legal regime, indicates that, after an initial period of uncertainty, fines in the five- to seven-digit range may be realistic for small and medium-sized enterprises. Whether fines under the DSA will develop similarly remains, of course, to be seen.

Our EU Data Protection and IT team is available to assist you in preparing your compliance with the DSA. We are an international law firm with European offices in Brussels, France, Luxembourg, Germany, Italy, and the United Kingdom. Our lawyers regularly advise on technology and media law, privacy law, consumer protection and product safety laws, and antitrust law.

First Publication: K&L Gates Hub with Thomas Nietsch, Giovanni Campi, Veronica Muratori & Andreas Müller

References

[1] An “offering” of such services shall require either a material number of users in the European Union or any of its member states in relation to its population, or an active orientation of the services towards the European Union or its member states.
[2] While the definition of “intermediary service” does not expressly mention search engines, the definition of “online search engines” includes them as intermediary services.
[3] Micro enterprises have fewer than 10 employees and an annual turnover or balance sheet below €2 million; small enterprises have fewer than 50 employees and an annual turnover or balance sheet below €10 million.

New ranking in Who’s Who Data 2024 as Recommended in the Data Privacy & Protection and Information Technology categories.

On 18 October 2023, the French National Assembly voted in favour of a law aiming to secure and regulate the digital space (“Loi visant à sécuriser et réguler l’espace numérique” or “SREN”), otherwise called the “Sorare Act.” This new development marks a first step towards the establishment of a regulatory framework dedicated to games integrating non-fungible tokens (NFTs) and monetisation models based on digital assets.

These new provisions are aimed at the creation of a new category of games under French law called games with monetisable digital objects (“jeux à objets numériques monétisables” or “JONUM”). This new regime will enter into force ‘on an experimental basis and for a period of three years’ from the promulgation of the law and will authorise Web3 games with monetisable digital objects (including NFTs).

The Sorare Act defines JONUMs as “game elements, which only confer on players one or more rights associated with the game, and which may be transferred, directly or indirectly, for consideration to third parties,” while excluding digital assets covered by 2° of Article L. 54-10-1 of the French Monetary and Financial Code.

France is one of the first jurisdictions in the world to create a specific regime for companies using NFTs as part of their games, and the objective is to provide certainty to the industry.

Please reach out to our team if you need further information on this new development. 

First publication: K&L Gates Hub, in collaboration with Lucas Nicolet-Serra

This panel session will focus on the growing concern over the ethical use of Artificial Intelligence (AI) and its impact on privacy. The panelists will discuss the role of accountability in developing responsible AI practices and the potential risks of AI systems when not properly regulated. They will also explore the importance of transparency and the need for data privacy regulations in the development and deployment of AI technologies. The session will provide insights into best practices for AI governance and how organizations can ensure the ethical use of AI while still benefiting from its potential.

Co-Panelists:

#AI #ArtificialIntelligence #gdpr #ethics #dataprotection #regulation #insights23 #pecb #Privacy #Accountability

Amidst a sudden increase in paid-for posts that went viral for dubious products and services, France has taken a significant step toward the regulation of influencer communication. Act no. 2023-451 (Influencers Act), which came into effect on 9 June 2023, aims not only to protect consumers but also to support influencers, in order to foster the healthy growth of this ecosystem. France is now the first European Union (EU) country to implement a thorough framework regulating commercial influence.

Background information

Digital influencers have changed the way companies can promote their products and services, from beauty and fashion to technology, notably by blurring the lines between commercial advertising and genuine consumer reviews.

Between 8 and 31 January 2023, the French Ministry of the Economy conducted a public consultation on the influencer ecosystem to evaluate the contemplated regulation, which received overwhelming support from the panels.

Key provisions bearing on influencers

General ban on certain communications

The following communications are explicitly banned from any influencer communication:

  • Cosmetic surgery and procedures;
  • Alternative therapeutic techniques;
  • Nicotine-based products;
  • Non-domestic animal trade;
  • Certain financial services, notably as they pertain to blockchain-based services (e.g. NFTs); and
  • Online gambling and betting.

With regard to the latter, the communication remains possible provided that it occurs exclusively on platforms restricted to adults over the age of 18 and subject to the usual specific disclaimer pertaining to the advertising of such services.

Mandatory labeling

The Influencers Act requires influencers to label:

  • Their promoted posts with the mention “advertisement” or “commercial collaboration” in a clear, legible and identifiable manner to avoid falling under misleading commercial practices further to Art. L. 121-3 of the French Consumer Code (“FCC”).

Influencers failing to comply with this obligation face up to 300,000 euros in fines and up to two years of imprisonment (Art. 5 Influencers Act).

  • The pictures (still or moving) they post which have been:
    • edited to enlarge or refine the general appearance or modify the appearance of the model’s face, which must clearly include the “Retouched images” mention; or
    • generated through artificial intelligence (AI), notably generative AI (Gen AI), which must clearly include a “Virtual image” disclaimer.

Influencers failing to comply with this obligation face up to 4,500 euros in fines and up to one year of imprisonment (Art. 5 Influencers Act).

Drop-shipping

In case of sales of goods through a third party (so-called “drop-shipping” practices), influencers will need to abide by transparency obligations regarding the identity of the supplier, pursuant to Art. L. 221-5 of the FCC, and will bear the liability relating to the legality and availability of the promoted products.

Content moderation and insurance

Influencers based outside of the European Economic Area or Switzerland but directing their activities to a French audience are required to appoint a legal representative in the EU, as well as to take out dedicated insurance covering the potential damage resulting from their activities.

Key provisions bearing on platforms used by influencers

Further to the entry into force of the European Regulation no. 2022/2065 on a Single Market for Digital Services (Digital Services Act or DSA) on 25 August 2023, the Influencers Act amended the Act no. 2004-575 of 21 June 2004 for trust in the digital economy (Loi pour la Confiance dans l’Économie Numérique or LCEN), increasing the burden on digital platforms, notably those platforms which allow influencers to conduct their activities.

These platforms now have the obligation to promptly remove any illegal content notified to them by the “trusted flaggers” introduced under Art. 22 DSA.

Key provisions bearing on brands

The Influencers Act now mandates a written contract between the influencer and the advertised brands, or their respective representatives. This contract, which must imperatively be subject to French law, must include:

  • The identity of the parties, including their domiciliation for tax purposes;
  • The detailed nature of the influence services;
  • The financial compensation or any equivalent advantage resulting from the influence services; and
  • As the case may be, any provision pertaining to intellectual property.

With regard to liability for the influence services, joint and several liability between the brand and the influencer has been implemented, rendering the brand de jure liable for any damage caused to third parties.

Enforcement of the Influencers Act

Just prior to the summer holidays, the French Ministry of the Economy appointed a team of 15 agents responsible for monitoring social networks and responding to complaints.

In parallel, the French Directorate General for Consumer Affairs, Competition and Fraud Prevention (“DGCCRF”) audited fifty influencers in the first quarter of 2023, which resulted in 60% of the audited influencers being found in breach of the then-current (and pre-Influencers Act) misleading commercial practices framework.

These findings led to eighteen injunctions to cease illicit practices and sixteen criminal reports. Against this background, the DGCCRF published a code of conduct for influencers and content creators in July, explaining their duties and obligations in accessible language.

Whether you are a brand considering hiring the services of influencers or an influencer yourself, the K&L Gates Luxury Product & Fashion team remains at your disposal to assist you in your compliance with the new French framework.

First publication: K&L Gates Fashion Law Watch Blog in collaboration with Camille Scarparo.

This series of webinars will address the potential impacts of artificial intelligence (AI) regulations on business across the globe. Recent developments in the general availability of AI and generative AI solutions are leading regulators, at a global level, to consider legal frameworks to protect both individuals affected by AI and digital sovereignty. Our panelists will address these potential regulatory developments, as well as the expected timeline for these changes, region by region.

Our first panel will feature a discussion focused on current and future regulatory requirements on the AI industry throughout the EU and the UK. With the language of the EU’s AI Act heading into its trilogue, it is even more important for stakeholders to understand the EU’s approach and prepare for the potential impact of this regulation in Europe, the UK, and beyond. The panelists will address key questions, such as:

  • What new undertakings will be bearing on the stakeholders in this industry?
  • Will government regulation be “technology neutral”?
  • Could the various frameworks lead to conflicts for local compliance efforts?
  • Will a requirement for an AI system to explain its thinking or provide substantive sources for all results have a deleterious impact on its ability to “think” independently?  
  • Is it too late for stakeholders to have a say in these expected frameworks?

Speakers:

Claude-Étienne Armingaud | PARTNER | PARIS

Giovanni Campi | POLICY DIRECTOR | BRUSSELS

Jennifer Marsh | PARTNER | LONDON

Register here: K&L Gates Website

Watch the recording here.