Part IV of our series “Regulating AI: The Potential Impact of Global Regulation of Artificial Intelligence” will focus on recent developments in general availability of AI and how generative AI solutions are leading regulators, at a global level, to consider legal frameworks to protect both individuals affected by AI and digital sovereignty.

The program will feature a panel addressing the EU AI Act, on which a preliminary political agreement was reached last December and unanimously approved by the ambassadors of the 27 countries of the European Union on 2 February 2024, prior to its upcoming final votes.

Like the GDPR before it, the EU AI Act will be a trailblazing piece of legislation that will impact companies at a global level.

Our panelists will discuss the consequences of the EU AI Act on companies contemplating the provision of AI solutions in the EU market or leveraging AI in the EU, with a special focus on non-EU companies.

Additional topics in our Regulating AI — The Potential Impact of Global Regulation of Artificial Intelligence series include:  

  • Part I – 13 September 2023 (EU / U.K.) – View Recording
  • Part II – 7 December 2023 (Asia-Pacific Region: China, Hong Kong, Singapore, Japan) – View Recording
  • Part III – 12 December 2023 (United States)

Register or watch the replay here.

The Information Commissioner’s Office (ICO) recently launched a consultation series on how data protection laws should apply to the development and use of generative AI models (“Gen AI”). In the coming months, the ICO will publish further views on how to interpret specific requirements of UK GDPR and Part 2 of the DPA 2018 in relation to Gen AI. This first part of the consultation focusses on whether it is lawful to train Gen AI on personal data scraped from the web. The consultation seeks feedback from stakeholders with an interest in Gen AI.

As outlined by the ICO, web scraping involves the collection and processing of personal data, which may not have been placed online directly by the data subjects themselves. To comply with the UK GDPR, Gen AI developers need to ensure there is a valid lawful basis for their processing, as well as comply with the relevant information requirements pertaining to the indirect collection of personal data.

For the first part of the consultation series, the ICO published a policy position on the lawful basis for training Gen AI models on web-scraped data, which can be found here. More specifically, this consultation focusses on the ‘legitimate interest’ lawful basis under art. 6(1)(f) UK GDPR and the ‘three-part’ test that a data controller must pass to rely on it (a so-called Legitimate Interest Assessment). The ICO has considered various actions that Gen AI developers could take to meet this three-part test and ensure that the collection of training data through web scraping, i.e. the processing of data, is compliant with the principles of the UK GDPR. The ICO would now like to hear from relevant stakeholders on their views of the proposed regulatory approach and the impact it would have on their organisation. A link to the survey can be found here.

The deadline to submit a response is 1 March 2024.

First publication: K&L Gates Cyber Law Watch blog with Sophie Verstraeten

Join our session as we explore the implications of the EU AI Act. In this webinar, we’ll:

Featured speakers

Yücel Hamzaoğlu

Partner
HHK Legal

Melike Hamzaoğlu

Partner
HHK Legal

Claude-Étienne Armingaud

Partner
K&L Gates

Noshin Khan

Ethics & Compliance, Associate Director
OneTrust

Harry Chambers

Senior Privacy Analyst
OneTrust

Register here.

Quoted in Agenda article “New EU AI Rules Will Have Global Impact”:

The EU AI Act will apply to all companies whose AI systems are used by or affect EU-based individuals, according to Claude-Etienne Armingaud, a partner in K&L Gates’ Paris office and a member of the law firm’s technology transactions and sourcing practice group.

Due to its breadth, global companies developing AI systems, most of which are headquartered either in the U.S. or in China, will face two options: “Get in line with the EU AI Act or abstain from the EU market,” Armingaud said.

Some companies threatened to exit the European market after the EU’s General Data Protection Regulation, or GDPR, became effective in 2018, but many didn’t actually follow through, according to Armingaud.

“So, without a doubt, all companies dabbling in AI will need to comply if they truly want to remain global,” he said.

Agenda – New EU AI Rules Will Have Global Impact

The EU Digital Services Act (Regulation 2022/2065, DSA) was published some time ago, and since then, the discussions about Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) have dominated the media coverage (see the European Commission’s initial press release here and coverage of petitions against categorization as VLOPs/VLOSEs here and here).

Smaller online service providers tend to forget that they may also face new obligations under the DSA from 17 February 2024 onwards, but they would be well advised to comply in order to avoid significant sanctions (e.g., fines of up to 6% of the global annual turnover or periodic penalty payments of up to 5% of the global average daily turnover).

The following paragraphs provide a brief summary of the most relevant content of the DSA and will help online service providers to understand:

  • Whether and to what extent the DSA applies to them;
  • What specific obligations exist; and
  • What sanctions may be applied in case of breach.

Preface: The DSA – What Was That Again?

The DSA replaces or takes over parts of the EU Directive 2000/31/EC (eCommerce Directive), namely its provisions on liability for third-party content. Due to massive developments in the digital field, evolving use of online services, and new players entering the market or gaining relevance for our societies and lives, the European Union decided it was time to put in place a more contemporary set of rules that takes these new developments, services, and stakeholders into account. Some of the provisions under the eCommerce Directive will continue to apply. Others were more or less copied and pasted from the eCommerce Directive (liability for third-party content). However, the DSA also places a strong focus on transparency, due diligence, and general fairness principles, resulting in a broad set of additional obligations with which every covered online service provider needs to comply. As the need for transparency and fairness varies depending on the type of service provider, the DSA follows a tiered approach:

  • All online intermediary services are subject to basic due diligence obligations;
  • Hosting service providers and online search engines need to comply with additional obligations; and
  • Online platforms are subject to another set of additional rules.

For a more detailed analysis of the respective undertakings for each tier, please refer to our comprehensive publication on the DSA available here.

While certain provisions for VLOPs and VLOSEs have already applied since November 2023, the broad set of obligations for the basic tier of online intermediary services will apply from 17 February 2024—high time to check again whether you are well prepared.

Sounds a bit chaotic? Agreed, but we will try to shed some light on the new obligations and who needs to do what.

Does the DSA Apply to My Company?

The DSA applies to all providers of online intermediary services, which essentially cover all of the below service categories offered to users (and not merely “consumers”) with a habitual residence in the EU.[1] A seat or establishment of the service provider in the European Union is expressly not required.

Mere Conduit 

(i.e., the transmission in a communication network of information provided by a user, or the granting of access to such a communication network, e.g., the internet);

Caching

 (i.e., the automated temporary storage of user information for the purpose of transmitting the information to other users faster on their request);

Hosting

 (i.e., the storage of information provided by a user and at the user’s request):

Online Platforms

(i.e., a hosting service storing information and disseminating the information to the public on request by a user);

Very Large Online Platforms 

(i.e., Online Platforms with an average of at least 45 million monthly active users);

Online Search Engine

(i.e., an intermediary service enabling users to enter a search query, run searches on the entire internet on any topic, and receive a list of results in any format)[2]:

Very Large Online Search Engines

(i.e., Online Search Engines with an average of at least 45 million monthly active users).

OKAY, GOT IT—BUT WHAT DO I NEED TO DO?

As already pointed out, the specific obligations for online intermediary services depend on which of the above-mentioned services your company provides. Here are the most relevant (albeit nonexhaustive) obligations for each service provider category applying from 17 February 2024, in a nutshell:

General Due Diligence and Transparency Obligations Applying to All Intermediary Services (Including Online Platforms, Hosting, Caching and Mere Conduit Services, Online Search Engines):

Point of Contact (Art. 11, 12)

Make single points of contact for authorities and users available to the public, enabling users to communicate easily in electronic form (which may not rely solely on automated tools such as chatbots) and to choose the means of communication.

Legal Representative (Art. 13)

Appoint a legal representative who is authorized to respond to enquiries on behalf of the service provider in one of the EU member states where the services are offered (only if the service provider has no EU establishment).

Terms and Conditions (Art. 14)

The restrictions on the use of the service in respect of illegal content provided by users must be set out in publicly available terms and conditions, including information regarding measures applied to block illegal content and procedural rules for complaint handling.

Transparency Report (Art. 15)

Publication of publicly accessible, easily comprehensible, and machine-readable reports on measures taken to respond to illegal content at least once a year.
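
For orientation only, the sketch below shows one possible way such a machine-readable report could be structured as JSON. The field names are our own assumptions for illustration and do not reflect the mandatory templates proposed by the European Commission (discussed further below).

```typescript
// Illustrative only: a hypothetical shape for a machine-readable Art. 15 report.
// The field names below are assumptions, not the Commission's official template.
interface ContentModerationReport {
  reportingPeriod: { from: string; to: string };                 // ISO 8601 dates
  ordersFromAuthorities: { memberState: string; count: number }[]; // orders received from member state authorities
  noticesReceived: number;          // notices submitted under Art. 16 (hosting services only)
  actionsTaken: {
    contentRemovals: number;
    accountSuspensions: number;
    automatedDetection: number;     // actions initiated by the provider's own detection tools
  };
  medianHandlingTimeHours: number;  // median time to act on notices
  complaintsReceived: number;       // internal complaint handling (Art. 20, platforms only)
}

// Example instance that could be published as JSON alongside the annual report.
const report2024: ContentModerationReport = {
  reportingPeriod: { from: "2023-01-01", to: "2023-12-31" },
  ordersFromAuthorities: [{ memberState: "DE", count: 12 }],
  noticesReceived: 4821,
  actionsTaken: { contentRemovals: 3100, accountSuspensions: 45, automatedDetection: 1200 },
  medianHandlingTimeHours: 36,
  complaintsReceived: 210,
};

console.log(JSON.stringify(report2024, null, 2));
```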

Additional Obligations for Hosting Service Providers (Including Online Platforms, but Apparently Not for Online Search Engines)

Notice and Action Mechanism (Art. 16)

Put in place easily accessible and user-friendly mechanisms to allow persons to notify the hosting provider of illegal content and duly handle such notices (where receipt of such a justified notice may give rise to liability for the illegal content pursuant to Art. 6).

Statement of Reasons (Art. 17)

Provide a specific statement of reasons to users who are subject to service restrictions such as blocking of content, suspension or termination of services, or similar measures in the context of the provision of illegal content, including information about applicable redress mechanisms.

Notification of Criminal Offenses (Art. 18)

Upon suspicion of a criminal offense against the life or safety of persons, report it to the competent law enforcement authorities.

Additional Obligations Only for Online Platforms (Not Applicable to Micro and Small Enterprises[3])

Complaint Handling System (Art. 20)

Inform persons submitting take-down notices through the notice and action mechanism about the outcome of their notice by way of an effective internal complaint-handling mechanism, and handle any incoming complaints in a timely, nondiscriminatory manner by qualified personnel.

Out-of-Court Dispute Settlement (Art. 21)

Offer persons who have submitted take-down notices through the notice and action mechanism and are subject to a decision under Art. 20 access to an out-of-court dispute settlement body certified by an EU member state to resolve any dispute related to the decision, and engage in the dispute resolution process before such body.

Trusted Flaggers (Art. 22)

Notices by trusted flaggers (status to be awarded by EU member states) in their respective field of expertise must be handled with priority.

Suspension of Services in Cases of Misuse (Art. 23)

Upon prior warning, temporarily suspend access to services for users who frequently provide manifestly illegal content or frequently submit manifestly unfounded notices or complaints under Art. 16 and 20.

Transparency Reporting (Art. 24)

In addition to the Art. 15 requirements, Online Platform providers need to include in their reports information regarding out-of-court dispute settlement proceedings (Art. 21) and service suspensions (Art. 23). Starting 17 February 2024, Online Platform providers must also publish, at least every six months, information about the number of average monthly active users in the European Union and submit it to the competent member state authority, in order to assess whether they qualify as a VLOP.

Online Interface Design (Art. 25)

Websites and other online interfaces must be designed in a non-manipulative and non-deceptive manner.

Advertising (Art. 26)

When presenting ads to users of an Online Platform, the provider must inform the user, in real time, that the content is an advertisement, identify the person on whose behalf the ad is presented or who has paid for it, and indicate the main criteria based on which the ad was selected.

Recommender Systems (Art. 27)

Online Platforms using tools to recommend specific content to users need to inform users about the main parameters for the content selection and how these can be changed or influenced.

Protection of Minors (Art. 28)

Online Platforms accessible to minors must put in place proportionate measures to ensure a high level of privacy, safety, and security of minors when using the service; advertising based on profiling (as defined by the General Data Protection Regulation) may not be presented where the data to build the profile does, with reasonable certainty, relate to a minor.

Additional Obligations for Online Platforms Enabling Consumers to Enter Into Distance Sales Contracts with Traders (Not Applicable to Micro and Small Enterprises):

Traceability of Traders (Art. 30)

Prior to allowing the use of the services, providers need to collect certain information from traders communicating with or offering goods or services to EU users on the Online Platform (the trader’s legal name, contact details, a copy of an identification document, payment account details, trade register data, and a self-certification to offer only products compliant with applicable EU laws) and verify such information against accessible sources.
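
As a rough illustration of these Art. 30 data points, a trader-onboarding record could be modelled along the following lines. This is a sketch only; the field names and the gating check are assumptions, not terminology or logic prescribed by the DSA.

```typescript
// Hypothetical trader-onboarding record mirroring the Art. 30 data points.
// Field names are illustrative assumptions, not DSA-mandated terminology.
interface TraderRecord {
  legalName: string;
  contactDetails: { address: string; email: string; phone: string };
  idDocumentCopy: string;                  // e.g., a reference to the stored copy
  paymentAccount: string;                  // payment account details
  tradeRegisterEntry?: string;             // trade register data, where applicable
  euComplianceSelfCertification: boolean;  // self-certification to offer only compliant products
  verifiedAt?: string;                     // date the data was checked against accessible sources
}

// A provider would block a listing until the record is complete and verified.
function mayOfferGoods(trader: TraderRecord): boolean {
  return Boolean(
    trader.legalName &&
    trader.contactDetails.email &&
    trader.idDocumentCopy &&
    trader.paymentAccount &&
    trader.euComplianceSelfCertification &&
    trader.verifiedAt
  );
}
```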

Compliance by Design (Art. 31)

The Online Platform needs to be designed in a manner that enables traders to comply with their statutory information, compliance, and product safety information duties. Prior to permitting a trader to offer goods on the Online Platform, the provider must assess, applying best efforts, whether the trader has complied with these obligations (including random checks in official, freely accessible, and machine-readable online databases or online interfaces as to whether the products or services offered by traders have been identified as illegal).

Notification Obligation (Art. 32)

If an Online Platform provider becomes aware that a certain product sold via its Online Platform is illegal, the identified purchasers of this product must be notified of the illegality of the product, the identity of the trader, and any available means of redress.

As regards the transparency reporting obligations on content moderation applicable to intermediary service providers, the European Commission has published a draft act setting out the mandatory templates for these transparency reports here, on which feedback and comments can be submitted until 24 January 2024.

VLOPs and VLOSEs (Art. 33–43)

To date, only a few Online Platforms and search engines have been identified by the European Commission as VLOPs and VLOSEs, and we thus refrain from providing further information in this regard here.

OUCH—BUT WHAT IF I DO NOT COMPLY?

The competent supervisory authorities to be designated by each EU member state by 17 February 2024 (Digital Services Coordinators) have investigative and corrective powers, including the power to impose administrative fines of up to 6% of the annual worldwide turnover of the provider of the intermediary service for breaches of the obligations under the DSA, as well as periodic penalty payments of up to 5% of the average daily worldwide turnover. The experience with fines under the EU General Data Protection Regulation, which established a similar legal regime, indicates that, after an initial period of uncertainty, fines in the five- to seven-digit range may be realistic for smaller and medium-sized enterprises. Whether fines under the DSA will develop similarly remains, of course, to be seen.
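
For a sense of scale, the short calculation below applies the two ceilings to an invented turnover figure; the numbers are purely hypothetical.

```typescript
// Hypothetical illustration of the DSA sanction ceilings.
// The turnover figure is invented for the example.
const annualWorldwideTurnover = 50_000_000;                     // €50 million per year
const averageDailyWorldwideTurnover = annualWorldwideTurnover / 365;

const maxFine = 0.06 * annualWorldwideTurnover;                 // up to 6% of annual turnover
const maxDailyPenalty = 0.05 * averageDailyWorldwideTurnover;   // up to 5% of average daily turnover

console.log(`Maximum administrative fine: €${maxFine.toLocaleString("en-GB")}`);                          // €3,000,000
console.log(`Maximum periodic penalty per day: €${Math.round(maxDailyPenalty).toLocaleString("en-GB")}`); // ≈ €6,849
```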

Our EU Data Protection and IT team is available to assist you in preparing your compliance with the DSA. We are an international law firm with European offices in Belgium, France, Luxembourg, Germany, Italy, and the United Kingdom. Our lawyers regularly advise on technology and media law, privacy law, consumer protection and product safety laws, and antitrust law.

First Publication: K&L Gates Hub with Thomas Nietsch, Giovanni Campi, Veronica Muratori & Andreas Müller

References
[1] An “offering” of such services requires either a material number of users in the European Union or in any of its member states in relation to its population, or an active orientation of the services towards the European Union or its member states.
[2] While the definition of “intermediary service” does not expressly mention search engines, the definition of “online search engines” includes them as intermediary services.
[3] Micro enterprises have fewer than 10 employees and an annual turnover or balance sheet below €2 million; small enterprises have fewer than 50 employees and an annual turnover or balance sheet below €10 million.

The UK’s Information Commissioner’s Office (the “ICO”) has recently sent warnings to the UK’s most visited websites to inform them that they may face enforcement action if they do not make changes to their cookie banners to ensure compliance with UK data protection law. For example, some websites warned by the ICO do not provide their users with a fair choice on tracking for personalised advertising. This position aligns with the EU’s stance, notably in France (see prior Alert here).

The ICO’s actions are part of a larger commitment to ensure individuals’ privacy rights are upheld by companies active in the online advertising industry. Publishers receiving a warning have only 30 days to amend their websites in line with the UK GDPR. As a further incentive for publishers to become compliant, the ICO has also warned that it will publish in January the details of those websites that have not made the requested changes. Such publicity may be even less welcome than the potentially large fines associated with breaches of the data protection framework.

The statement made by the ICO highlights once again the importance for companies of reviewing how cookies are used on their websites and how their cookie banners, along with the cookie consent management solution, are displayed. To be compliant, websites must make it as easy as possible for users to reject all advertising cookies. Personalised advertising can be compliant as long as it is based on the user’s consent. If users reject all advertising cookies, websites can only show general adverts that are not tailored to the users’ browsing history. Consequently, websites should display a cookie banner that makes it as easy for users to reject cookies as it is for them to accept them.
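
By way of illustration, a first-layer banner could expose both choices symmetrically, along the lines of the minimal sketch below. It assumes a simple in-house consent store and invented category names rather than any particular consent management platform.

```typescript
// Minimal sketch of a first-layer cookie choice with symmetric accept/reject paths.
// The storage mechanism and category names are assumptions for illustration only.
type ConsentState = { necessary: true; advertising: boolean; analytics: boolean };

function saveConsent(state: ConsentState): void {
  // Persist the choice; strictly necessary cookies remain allowed in all cases.
  document.cookie = `consent=${encodeURIComponent(JSON.stringify(state))}; max-age=15552000; path=/`;
}

// Both options take exactly one click and sit on the same layer of the banner.
function acceptAll(): void {
  saveConsent({ necessary: true, advertising: true, analytics: true });
}

function rejectAll(): void {
  saveConsent({ necessary: true, advertising: false, analytics: false });
  // Only general, non-personalised adverts should be served from here on.
}
```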

The ICO’s guidance in relation to cookie banners can be found here, which may need to be further updated with the newly presented Data Protection and Digital Information Bill.

First publication: Cyber Law Watch Blog with Sophie Verstraeten

Post-Brexit, businesses have needed to rethink how they approach demonstrating compliance with a host of regulations, managing international data transfers, and building trust with data subjects. Having to comply with the UK GDPR and prepare for other data protection bills, all while continuing to comply with the EU GDPR as well as a host of global regulations, means businesses might look to certification as a common system for adequacy and a one-stop shop when addressing the overlaps and, more crucially, closing the gaps in their privacy compliance programs.

Featured speakers:

  • Noshin Khan, Senior Compliance Counsel, Ethics Center of Excellence, OneTrust 
  • Claude-Étienne Armingaud, Partner, K&L Gates

Register here.

The UK Government has laid adequacy regulations before Parliament that, once in force from 12 October 2023, will permit use of the UK – US “Data Bridge” as a safeguard for personal data transfers from the UK to the US under Article 44 UK GDPR.

The UK – US “Data Bridge,” AKA the UK Extension to the EU – US Data Privacy Framework (Framework), allows UK organisations to transfer personal data to organisations located in the United States that have self-certified their compliance with certain data protection principles and appear on the Data Privacy Framework List. This scheme, administered by the US Department of Commerce, provides a redress mechanism for data subjects in the European Union to enforce their rights under the EU General Data Protection Regulation, in relation to a participating US organisation’s compliance with the Framework, and to US national security agencies’ access to personal data. This new redress mechanism attempts to prevent a challenge to the Framework similar to the Schrems II case, which invalidated the Framework’s predecessor EU – US Privacy Shield. Despite this, the Framework has already been the subject of a short-lived case at the Court of Justice of the EU, and there may be more legal challenges.

Alongside the adequacy regulations, the UK government published an analysis of the US laws relating to US national security agencies’ access to the personal data of European data subjects. This analysis effectively completes the international data transfer risk assessment (TRA) that UK organisations have been required to carry out before transferring personal data to the US. UK organisations relying on the other Article 44 UK GDPR safeguards, such as the International Data Transfer Agreement, will likely also be able to rely on this analysis in place of completing a TRA.

First publication: K&L Gates Cyber Law Watch Blog in collaboration with Noirin McFadden

In this webinar, our lawyers discuss generative artificial intelligence (AI). Fast paced growth in generative AI is changing the way we work and live. With such changes come complex issues and uncertainty. We will address the legal, policy and ethical risks, mitigation, and best practices to consider as you develop generative AI products and services, or use generative AI in the operation of your business.

With Annette Becker, Guillermo Christensen, Whitney McCollum, Julie Rizzo, and Mark Wittow

If you were not able to join last Tuesday, you can watch the replay below:

Source: K&L Gates Hub

On 14 June 2023, the European Parliament (Parliament) plenary voted on its position on the Artificial Intelligence Act (AI Act), which was adopted by a large majority, with 499 votes in favor, 28 against, and 93 abstentions. The newly adopted text (Parliament position) will serve as the Parliament’s negotiating position during the forthcoming interinstitutional negotiations (trilogues) with the Council of the European Union (Council) and the European Commission (Commission).

The members of Parliament (MEPs) proposed several changes to the Commission’s proposal, published on 21 April 2021, including expanding the list of high-risk uses and prohibited AI practices. Specific transparency and safety provisions were also added on foundation models and generative AI systems. MEPs also introduced a definition of AI that is aligned with the definition provided by the Organisation for Economic Co-operation and Development. In addition, the text reinforces natural persons’ (or their groups’) right to file a complaint about AI systems and receive explanations of decisions based on high-risk AI systems that significantly impact their fundamental rights.

Definition

The Parliament position provides that AI, or an AI System, should refer to “a machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions, that influence physical or virtual environments.” This amends the Commission’s proposal, under which an AI System was limited to software acting for human-defined objectives, and now encompasses the metaverses through the explicit inclusion of “virtual environments.”

Agreement on the final version of the definition of AI is expected to be found at the technical level during trilogue negotiations, as it does appear to be a noncontentious item.

Another notable inclusion relates to foundation models (Foundation Models), which were not yet in the public eye when the Commission’s proposal was published and are defined as a subset of AI Systems that are trained on broad data at scale, are designed for generality of output, and can be adapted to a wide range of distinctive tasks.
