EU Agrees the Digital Services Act
16.05.2022

The Digital Services Act, agreed in April between the Parliament, the Commission, and the Council of the EU, is the first regulation in the world to set rules for removing illegal online content, increasing the transparency of how online services function, and improving the safety of internet users. Implementing the act in the European Union will be a challenge due to its wide scope, which requires greater involvement of the EU administration in the ongoing oversight of platforms. At the same time, the EU hopes that the act will become a model for similar laws in other parts of the world.


The agreement reached on 22 April is another negotiation success, after the Digital Markets Act (DMA), in the field of online platform regulation. The European Commission (EC) presented the draft regulation in December 2020, and the trilogue negotiations (between the Council, the Commission, and the Parliament to reach a compromise) lasted less than four months (from January to April this year). The Digital Services Act (DSA) is expected to enter into force in January 2024 at the latest, and European small and medium-sized enterprises (SMEs) will have more time than other companies to adapt to the new regulations.

Assumptions of the Digital Services Act

The DSA mainly regulates online platforms and other internet service providers operating in the EU, such as marketplaces (e.g., Amazon, Allegro), social media, and search engines.

The obligations envisaged by the regulation depend on the size of the company: the larger the entity, the more extensive the list. Very large online platforms (VLOPs) and very large online search engines (VLOSEs) are categories covering companies with more than 45 million monthly active users in the EU. Although a detailed method of counting users has not yet been agreed, these categories include the largest U.S.-based platforms (Google, YouTube, Amazon, Apple, Meta). VLOPs and VLOSEs will have to continuously analyse and minimise “systemic risks”, such as the spread of illegal and harmful content (e.g., disinformation) and the manipulation of user behaviour. A mandatory annual external audit will assess these activities and their compliance with the provisions of the DSA. VLOPs will also have to provide supervisory authorities (national regulators and the European Commission) and researchers with access to the data and algorithms needed for a detailed assessment. In addition, the regulation provides for a crisis mechanism under which, in the event of a threat to public safety or health (such as the Russian invasion of Ukraine or the COVID-19 pandemic), the EC will be able to compel VLOPs to take specific actions for up to three months, such as removing content that repeats harmful disinformation or user accounts that incite dangerous behaviour.

Under the DSA, all platforms must implement a transparent system for monitoring and removing illegal content, products, and services. The mechanism used must be understandable to users, and platforms must respond promptly to user reports. In addition, platforms will have to continuously monitor the effectiveness of their own content management systems in combating illegal and harmful content and breaches of privacy. The DSA obliges platforms to disclose how the algorithms that generate personalised content for users operate (e.g., the news feed on Facebook) and to enable users to adjust their parameters to their individual needs.

An important aspect of the DSA is increasing the safety of internet users. There will be a ban on “dark patterns”, interface designs that manipulate user behaviour (e.g., pop-up windows that offer a choice of options but present them in such a way that the user almost always selects the one desired by the designer). Behavioural advertising, that is, advertising based on users' patterns of behaviour and interests, will be banned with respect to minors. The responsibility of online platforms for the companies selling products or services through them will also increase.

Entities that do not comply with the provisions of the DSA risk penalties from the European Commission of up to 6% of their global annual turnover.

Challenges Related to the Implementation of the DSA

The unclear division of competences between the European Commission and the Member States may prove to be a challenge. The DSA provides for extensive prerogatives for the EC over the largest platforms, while the rest are to be overseen by the Member States. The DSA introduces the position of Digital Services Coordinator in each Member State, the authority responsible for enforcing the regulation at the national level. In addition, the national coordinators will form an advisory body to the European Commission, which in some cases will decide together with the Commission on, for example, triggering the crisis mechanism that forces specific actions on VLOPs and VLOSEs. The DSA does not define clear rules of cooperation between institutions at the national and European levels, especially where their competences may overlap (e.g., in the area regulated by the General Data Protection Regulation, or GDPR). This creates a potential risk of litigation and of weaker enforcement of the regulation. At the same time, the European Commission anticipates the need to employ about 200 people to supervise the largest internet platforms. It wants these costs to be borne by the VLOPs in the form of a “supervisory fee” of up to 0.05% of annual turnover.

Another challenge will be inconsistency in how online platforms interpret the DSA's provisions, which may lead to legal disputes or to a lack of visible effects of the act, for example, in the case of the obligation to introduce mechanisms to combat harmful content and disinformation. The DSA could thus turn out to be a regulation with the right assumptions but difficult to apply in practice because some aspects of the functioning of the digital economy cannot be precisely defined.

Impact on Global Regulations

The functioning of the DSA will have ramifications beyond the borders of the EU. Global tech companies operating in different markets may decide it is more cost-effective to implement one consistent content monitoring and takedown strategy and to apply the relatively strict EU rules as the benchmark in other countries.

The EU also hopes that, as was the case with the GDPR, third countries will adopt similar regulations on their own initiative. Currently, the United Kingdom is developing legislation with a similar scope in its Online Safety Bill. In turn, in the United States, President Joe Biden in this year's State of the Union address called for stronger protection of minors online, which may contribute to the introduction of regulations limiting behavioural advertising or the manipulation of user behaviour. Proposals for such legislation have already appeared in Congress.

Conclusions

The EU is a pioneer in regulating online platforms. However, effective enforcement of the DSA provisions against the largest platforms will require not only greater financial and personnel resources on the part of the EC but also close coordination with the Member States and EU institutions, including the European Data Protection Board. An ongoing informal dialogue with regulated companies is also essential to prevent possible breaches of the DSA and, once they occur, to increase the effectiveness and speed of the response.

From the point of view of both Poland and the EU as a whole, it would be beneficial for more third countries to introduce similar regulations in order to combat disinformation and hate speech more effectively, especially content generated on global platforms by accounts (often fictitious) from authoritarian countries. As the example of the Russian invasion of Ukraine shows, the fight against the spread of illegal and harmful (false) content was effective only in a limited geographical area (mainly Europe). The EU could advocate for such solutions through dedicated bilateral forums on digital cooperation, such as the recently established Transatlantic (2021) and EU-India (2022) Trade and Technology Councils. At least partial diffusion of the DSA outside the EU is possible because of the central idea behind the regulation, which is to increase the safety of online users and prevent the spread of harmful content. This cannot be openly criticised by other countries.