ADCG’s Explainer – EU’s “Digital Services Act”

On November 1, the European Union (EU) published the details of the Digital Services Act (DSA). According to this article from Wilson Sonsini (WS), the DSA “complements” the Digital Markets Act (DMA), which took effect on November 1, 2022. Together, the two acts will regulate digital service providers both within and outside the EU.

WS states that the DSA will operate alongside existing pieces of legislation that “impose moderation and transparency requirements” on the provision of digital services, including the review of online content for terrorist activity, “political advertising,” and illegal activities.

The DSA applies to Intermediary Service Providers, which include:

  • Conduit Providers: organizations whose service consists of the transmission “in a communication network of information provided by a recipient of the service, or the provision of access to a communication network.”

  • Caching Service Providers: organizations whose service consists of the transmission “in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request.”

  • Hosting Service Providers: organizations that store information “provided by, and at the request of, a recipient of the service.”

The application of the provisions of the DSA will depend on the “nature and size” of an organization. Intermediary service providers will be subject to the following obligations:

  • Comply with any order issued by a national judicial or administrative authority requiring the provider to remove content or to provide information relating to service recipients.

  • Implement and enforce terms and conditions (T&Cs) that are “sufficiently detailed” and cover algorithmic decision-making and the DSA compliance procedures the organization follows when using those algorithms. If the services will be used primarily by children, the T&Cs must be clear and comprehensible to that age group. Additionally, consumers must be notified whenever “significant” changes are made to the T&Cs.

  • Publish a DSA-compliant content moderation report at least annually in a “machine-readable format” that is easily accessible to the public.

  • Hosting service providers must implement a mechanism for users to report illegal content on online platforms, as well as a process for reviewing received notices, taking appropriate action in response, and notifying the user of the action taken in response to their report.

However, according to WS, the “most burdensome” obligations are related “to content moderation, online advertising, and trader transparency.” For example, online platform providers must also:

  • Avoid engaging in targeted advertising based on a consumer’s sensitive data or the data of children.

  • Maintain clear information on the advertisements they elect to display on their platform, including the details of the transaction and the variables that will be utilized to determine who views the advertisement.

  • Where the online platform uses “fully or partially automated systems to recommend content” to consumers, the T&Cs must describe the recommendation process, including the most significant criteria used to determine what information is presented and the ways in which a consumer can modify these recommendation parameters.

  • Refrain from designing or organizing their online interface in a way that manipulates consumer behavior, such as by prominently displaying certain information or choices when users are prompted to make a decision.

  • Provide consumers with a compliant mechanism to challenge algorithmic decisions relating to the content they receive or to their accounts, and issue warnings where a user of an online platform has repeatedly provided manifestly illegal content or otherwise misused the platform.

  • Maintain Know Your Customer requirements to collect trader information before permitting traders to utilize the provided service.

  • Notify consumers directly or, where that is not possible, provide public notice if an online marketplace is offering illegal products or services.

  • Design their online interface to allow compliance with DSA obligations and provide consumers with a clear identification of the products and services offered by the platform.

Additionally, WS provided that providers of online platforms or search engines that reach more than 45 million consumers will be designated as “very large online platforms” (VLOPs) and “very large online search engines” (VLOSEs) by the European Commission (EC) and will be “subject to the highest degree of regulation.” In addition to the obligations outlined above, VLOPs and VLOSEs must:

  • Create an accessible advertisement search function on their user interface that can be used to identify the content of an advertisement, how long the advertisement was displayed, whether it was targeted at a specific group of people, and the criteria used to exclude groups from receiving the targeted advertisement.

  • Where applicable, offer consumers recommender systems that are not based on their personal data profile.

The DSA establishes a European Board for Digital Services (EBDS) to enforce its provisions in a consistent manner across all covered entities. Companies that are not established in the EU but intend to continue offering their online services to EU consumers or to companies established in the EU will be required to designate a “legal representative in the EU,” who will ensure compliance with the DSA and can be held individually liable for any noncompliance by the organization. According to The Paypers, the inclusion of the representative requirement “has so far been accepted positively by European players as it would ensure a level playing field within the Single Market.”

The Board will impose fines for noncompliance, which WS states “will reach a maximum of six percent of a company’s annual worldwide turnover.” The Paypers notes that an issue has been raised amongst industry participants that the “threat of significant fines for non-compliance might lead to preventive removal of content, which might otherwise be considered legal, putting companies in the uncomfortable position of risking fines under the Digital Services Act or being criticized for violating freedom of expression by censorship.”

Covered entities will have until February 17, 2024, to reach compliance with the DSA. However, for VLOPs and VLOSEs, the DSA will apply four months after their designation by the EC, which The Paypers states “could take place as early as in the first half of 2023.”

* * * * * * *

This week’s news alerts include: Senators’ letter addressing concerns about the FTC’s proposed rulemaking on Commercial Surveillance and Data Security, a Joint Advisory warning the healthcare sector of the latest extortion scheme, and Texas’s lawsuit against Google over biometric data collection. Click here to read.

This week’s breach report covers breaches of the following companies: Dropbox, Medibank Private Ltd., Morrison Products Inc., St. Luke’s Health and Vodafone. Click here to find out more.

To browse through our previously published articles and news alerts, please visit our website, and don’t forget to subscribe to receive free weekly Data and Cyber Governance news and Breach Reports directly to your email.

ADCG’s podcast returns this week. In our newest episode (to be released Thursday), two incredible guests, Gary Corn and Jamil Jaffer, join our host, Jody Westby, to discuss Cyber Command, its role and jurisdiction, what it can do in cyber conflict situations, and how it may help the private sector when under nation-state attacks.

Gary Corn is director of the Technology, Law & Security Program at American University’s Washington College of Law and a former career military officer whose last position was Staff Judge Advocate (General Counsel) to U.S. Cyber Command.

Jamil N. Jaffer is the Founder and Executive Director of the National Security Institute, and an Assistant Professor of Law and Director of the National Security Law & Policy Program and the nation’s first Cyber, Intelligence, and National Security LLM at the Antonin Scalia Law School at George Mason University.

Episodes can be enjoyed on many platforms including Spotify and Apple Podcasts. Our most recently released episodes:

79 | Understanding 5G Cybersecurity Issues (with guest Carlos Solari)

78 | The Nexus Between Privacy, Cybersecurity & National Security (with guest Corey Simpson)

77 | Privacy & Cybersecurity Whistleblowers: A New Trend? (with guest Andrew Grosso)

Don’t forget to subscribe!
