With data protection, security and other governance, risk and compliance responsibilities growing increasingly complex and demanding, some businesses might not have much time to consider their obligations around content moderation.
However, the EU Digital Services Act (DSA), the final text of which was published on Wednesday, may change this. The DSA imposes substantial new obligations on many online services—including cloud providers, marketplaces and some websites that allow users to publish content.
Every organisation should consider whether the DSA’s broad definition of an “intermediary service” applies to them, paying particular attention to whether their website constitutes an “online platform”. If so, a barrage of new legal and compliance obligations may be in store.
Who is covered by the DSA?
The DSA covers “intermediary services”, which include “mere conduit” services, “caching” services, and “hosting” services (including “online platforms”, more on which below).
The main focus of the Act is “very large online platforms” (VLOPs) and “very large online search engines” (VLOSEs).
VLOPs and VLOSEs are services with at least 45 million average monthly active users in the EU, equivalent to roughly 10% of the EU’s population.
However, the DSA’s scope extends well beyond big tech, and an “intermediary service” can be an organisation of any size. The following types of companies are specifically mentioned either in the text or in the Commission’s guidance:
- Internet service providers (ISPs)
- Cloud services
- Publicly available messaging services
- Marketplaces
- Social networks
- Content-sharing platforms
- App stores
- Travel and accommodation platforms
An important part of the DSA is about shielding certain intermediary services from liability for user-generated content under certain circumstances. However, the Act also comes with a lot of compliance obligations attached.
What about websites with comments sections?
There is some ambiguity about the extent to which the DSA also covers entities such as websites or blogs with comments sections.
Figuring out whether such a service could be covered by the DSA requires an examination of the law’s definition of an “online platform” at Article 3(i):
‘online platform’ means a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation
There’s no question that a social media platform falls under this definition. But what about a website that allows users to leave comments under an article?
Let’s consider the key elements of the “online platform” definition. An “online platform” is:
- A hosting service that:
  - Stores and disseminates information to the public
  - At the request of a recipient
- Unless the above functionality:
  - Is a minor and purely ancillary feature of another service, or
  - A minor functionality of the principal service
The remaining elements of the definition are anti-avoidance provisions, capturing services that might attempt to escape the definition, and are not relevant for our purposes.
To determine whether a given website falls under this definition, it is necessary to establish the “principal service” of the website.
For a news website, the “principal service” is delivering news. A comments section under a news article is probably best described as a “minor functionality” of this principal service.
This is made explicit in Recital 13 of the DSA, which states that “the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.”
In the case of a wiki or a forum, by contrast, the main service itself enables users to contribute and edit content for dissemination to the public. These sorts of websites would likely fall under the definition of an “online platform”.
Other types of online services might be a grey area, and case law or further guidance could help clarify the definition in future.
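For illustration only, the definitional test described above can be sketched as a small decision function. The field names and the boolean framing are my own simplification—the real legal test turns on facts and context, not flags:

```python
# Illustrative sketch of the DSA Art. 3(i) "online platform" test.
# Field names are hypothetical simplifications, not legal terms of art.
from dataclasses import dataclass

@dataclass
class Service:
    stores_and_disseminates_to_public: bool   # at the request of a recipient
    feature_is_minor_and_ancillary: bool      # relative to the principal service
    integration_circumvents_regulation: bool  # anti-avoidance element

def is_online_platform(s: Service) -> bool:
    """Rough mapping of the Art. 3(i) definition onto a yes/no check."""
    if not s.stores_and_disseminates_to_public:
        return False  # not a hosting service disseminating to the public
    if s.feature_is_minor_and_ancillary and not s.integration_circumvents_regulation:
        return False  # e.g. a news site's comments section (Recital 13)
    return True  # e.g. a social network, forum or wiki

news_site = Service(True, True, False)
forum = Service(True, False, False)
print(is_online_platform(news_site))  # False
print(is_online_platform(forum))      # True
```

The second branch encodes the carve-out in Recital 13: a comments section that is ancillary to the main news-publishing service does not, on its own, make the site an online platform.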
What are the main obligations on online platforms under the DSA?
Obligations under the DSA vary according to the type of service in question and, in some cases, the number of users or the employee count and turnover.
Some of the obligations on online platforms—regardless of company size, turnover or user base—include the following (this list is not comprehensive):
- Points of contact for Member States’ authorities, the Commission and the Board (Art. 11)
  - Designate a single point of contact to communicate with the authorities
  - Publish whatever information is necessary to ensure the easy identification of and communication with the single point of contact
- Points of contact for recipients of the service (Art. 12)
  - Designate a single point of contact to communicate with recipients of the service
  - Publish whatever information is necessary to ensure the easy identification of and communication with the single point of contact by recipients of the service
- Legal representatives (Art. 13)
  - If based outside the EU, designate someone to act as the service’s legal representative in the EU. This representative must be capable of being held liable for non-compliance with the DSA.
- Terms and conditions (Art. 14)
  - Publish “terms and conditions” that clearly explain the service’s content moderation policies
  - Inform recipients of the service about any changes to the terms and conditions
  - If the service is directed at, or predominantly used by, minors, explain any restrictions to the service aimed at minors
  - Apply any restrictions of service under the terms and conditions in a “diligent, objective and proportionate manner” that respects EU fundamental rights
- Transparency reporting obligations for providers of intermediary services (Art. 15)
  - Publish content moderation reports at least annually
- Notice and action mechanisms (Art. 16)
  - Put easy-to-access, user-friendly mechanisms in place to enable individuals to notify them of the presence of potentially illegal content
  - Take a decision about the content in a “timely, diligent, non-arbitrary and objective manner”
  - Confirm receipt of the notice and provide notice of the decision to the individual
- Statement of reasons (Art. 17)
  - Provide a “statement of reasons” to any affected recipients of the service following any restrictions of potentially illegal content or content that violates the service’s terms and conditions
- Notification of suspicions of criminal offences (Art. 18)
  - Inform law enforcement about any information giving rise to a suspicion of a criminal offence involving a “threat to the life or safety of a person or persons”
The above obligations apply to all online platforms (and remember, this list is a summary and is not comprehensive). The obligations listed above under Arts. 11-15 apply to all intermediary services (not just online platforms).
Online platforms that are not “micro or small enterprises”, i.e. online platforms with 50 or more employees or an annual turnover above €10 million, have further obligations in addition to those above.
What are the penalties for non-compliance with the DSA?
Compliance with the DSA will be overseen by a Digital Services Coordinator designated by each EU member state. These bodies will have the power to issue orders and penalties under the law.
Fines under the DSA for non-compliance with one or more of the law’s obligations will be a maximum of 6% of annual worldwide turnover.
Fines for supplying or failing to rectify “incorrect, incomplete or misleading information” will be a maximum of 1% of annual worldwide turnover.
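To make the two caps concrete, here is a quick calculation for a hypothetical company (the turnover figure is invented purely for illustration):

```python
# Illustrative only: maximum DSA fine caps applied to a hypothetical
# company with 200 million euros of annual worldwide turnover.
turnover = 200_000_000  # hypothetical figure, in euros

max_fine_noncompliance = 0.06 * turnover    # 6% cap for breaching obligations
max_fine_misleading_info = 0.01 * turnover  # 1% cap for incorrect/misleading information

print(f"{max_fine_noncompliance:,.0f}")    # 12,000,000
print(f"{max_fine_misleading_info:,.0f}")  # 2,000,000
```

These figures are upper limits: the regulation caps fines at these percentages, and actual penalties would be set by the relevant authority based on the circumstances.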