U.S. and EU Clash over Regulation of Digital Content Moderation

No. 37, 21.03.2025

The dispute between the European Commission (EC) and U.S.-based digital platforms over compliance with the Digital Services Act (DSA) is part of a broader difference in the EU and U.S. approaches to technology regulation. The U.S. administration opposes regulations that constrain U.S. companies, arguing that regulation stifles growth and innovation. Republicans cite freedom of speech, which they understand expansively. The EU, on the other hand, while respecting freedom of expression, sees content moderation as protecting users. Growing tensions in this area could lead to a temporary suspension of U.S. social media platforms within the European Union.


Regulating Content Moderation in the EU

The DSA was adopted by the EU in October 2022 and, due to its complexity, was implemented in phases, with full application from 17 February 2024. It was intended to respond to two types of threats arising from the operation of Very Large Online Platforms (VLOPs), defined as platforms reaching at least 45 million users in the EU. One is the dissemination of illegal content (such as incitement to violence and hate speech) and the other is the sale of illegal goods and services. Given the wide reach of VLOPs (e.g., Meta, X, but also the Germany-based Zalando or China-based TikTok), there is a high risk of illegal and harmful content being disseminated on them, which is why the EU has put them under special scrutiny. The DSA is intended to make platforms more accountable for content moderation and to ensure the transparency of algorithms and online advertising. The regulation requires VLOPs to remove illegal content more effectively and to better protect users’ rights, and it allows EU supervisory authorities to impose penalties in case of violations.

Regulating the moderation of digital content in the EU serves as a benchmark for the introduction of similar policies worldwide (the so-called Brussels effect), which in turn strengthens the EU’s global position. Its influence is all the more effective in the digital arena because VLOPs operate globally, making it easier for other countries to replicate EU regulations; examples include the UK’s Online Safety Act and similar Australian legislation. From the companies’ perspective, applying uniform rules and procedures across different markets is often more efficient in both productivity and cost, despite the loss of potential revenue that illegal content may have generated, which also favours the widespread adoption of DSA-style rules.

DSA in Practice

The EC, which oversees compliance with the DSA, can impose a penalty of up to 6% of a service provider’s global turnover if it has no or insufficient mechanisms to remove illegal content. The penalties therefore do not relate to illegal content appearing on the platform, but to the failure to implement mechanisms to counter its publication. The EC is currently pursuing cases against two U.S. companies—X and Meta. In the case of Meta, the EC is investigating whether the company takes sufficient measures to protect minors. In the case of X, it is examining both whether the platform counteracts the dissemination of illegal content and the effectiveness of its efforts to combat the manipulation of information on the platform (e.g., during elections).

Election Security

The DSA includes the concept of “systemic risk”, which is considered to comprise, among other things, threats to democratic processes, particularly elections. The EC requires VLOPs to take measures to mitigate these risks. The DSA is thus intended to safeguard elections against external interference. An example is the 2024 European Parliament elections, when the EC monitored the reactions of VLOPs to posts containing false voting information or attempts to discourage voters. Meta, for example, published additional text with links to official sources under posts about the elections. The DSA also contributed to a resilience test of public institutions and digital platforms ahead of Germany’s federal election in February this year, testing their effectiveness in responding to a potential crisis.

The DSA does not directly regulate disinformation or FIMI (foreign information manipulation and interference, that is, external intrusion into the information space), treating them not as illegal content but as harmful. However, the DSA allows their reach to be limited indirectly: platforms are obliged (e.g., in the context of elections) to provide users with access to verified information. Nevertheless, the EC, like Poland, treats disinformation as a key security threat. This gives rise to disputes with the U.S., all the more so given that the mechanisms for access to verified information provided by U.S. VLOPs are unsatisfactory to the EC. This is particularly evident in the case of X and its “Community Notes”, which, while replacing traditional content verification, are becoming less and less popular with users. Artificial intelligence systems (e.g., Grok) digest huge amounts of online content to provide answers but do not yet appear able to reliably distinguish truth from falsehood, among other problems. In an analysis of 10 leading generative AI chatbots, the news-rating organisation NewsGuard found that they repeated false narratives promoted by the Russian Pravda portal network 33% of the time.

The Trump Administration’s Response to the DSA

The current U.S. administration regards European content moderation rules as censorship and an unjustified imposition of additional obligations on companies. Vice President J.D. Vance stressed in Munich in February this year that such regulations are contrary to freedom of expression. His claim, however, is a manipulation of the facts, as the only content removed under the DSA is illegal content, such as incitement to violence or posts promoting the sale of illegal goods. In his statement, he also criticised the decision of the Constitutional Court of Romania to annul the presidential election over undocumented, and thus illegal, campaign financing (including with the use of TikTok; the candidate in question had a network of accounts supporting his candidacy but reported “$0” in campaign funding), arguing that EU regulations allow the EC to “annul” elections, even though the legal basis for the court’s decision came directly from national law.

Among Trump’s first executive orders was one titled “Restoring Freedom of Speech and Ending Federal Censorship”, which shows that he shares Vance’s position. The administration has also issued a memorandum on “defending U.S. companies from foreign extortion, unfair fines and penalties”, which targets the DSA directly. The memorandum provides that if other countries impose additional charges (including penalties) on U.S. companies, the U.S. will take the retaliatory actions necessary to mitigate the alleged harm. However, such actions could prove detrimental to the VLOPs themselves, as the European market is crucial to them.

Trump’s victory has encouraged U.S. technology companies to challenge the DSA. Shortly before his inauguration, Meta unveiled a marked turnaround in its content moderation policy, announcing the abandonment of its existing fact-checking system. In its place, Meta plans to introduce a system of community notes similar to that on X. This system poses a risk, as it is relatively easy to use bots to manipulate it so that incorrect information appears in the notes. What is more, as the X example shows, after a certain period the notes lose popularity and cease to serve their purpose. Meta’s decision is a signal of resistance to EU regulation and a manifestation of a new direction in the moderation of digital content.

Conclusions and Perspectives

Recent statements by U.S. government officials make clear that they judge content moderation to be a threat to freedom of speech and to the free operation of U.S. online platforms. The common goal of the administration and U.S. technology companies can be considered to be weakening the effectiveness of the DSA, and therefore reducing the need to comply with its rules. VLOPs cannot withdraw from the European market altogether, as it generates between 30% and 50% of their revenues. However, it is possible that the dispute could escalate and that the EC could impose record fines on Meta and X. If the fines are not paid and the DSA rules continue to be ignored, the Commission could temporarily suspend the platforms’ operations in the EU. In response, Trump would likely impose additional tariffs on the EU or use another form of retaliation that is difficult to predict. Further fragmentation of the internet, with the emergence of “European” and “American” (or even entirely separate) versions of social media platforms with different moderation rules, is also a future possibility.

There is a risk that the dispute between the U.S. and the EU will lead to concessions from the EU that weaken the DSA. For Poland, it is important to maintain the current interpretation of the DSA, as a more liberal reading of the rules could weaken the EU vis-à-vis technology platforms. In Poland, the threat of disinformation remains high due to the intense, aggressive activities of Russia and Belarus. Information security is also one of the priorities of the Polish presidency of the EU Council. It is important, both during and beyond the presidency, to act at the EU level in ways that strengthen this security, including through regulation. It is also worth considering international cooperation with countries that are likewise in dispute with American social media platforms, such as Brazil.