
DSA – The European Commission’s new supervisory powers over large platforms and search engines

by Caterina Bo


The agreement reached on the Digital Markets Act and the Digital Services Act in March and April 2022 marked the conclusion of a legislative path initiated by the European Commission in December 2020 and completed in a remarkably short time, given the impact these two pieces of legislation will have on the Union’s digital market.

As stated in the Communication entitled ‘2030 Digital Compass: the European way for the Digital Decade’, the Commission’s aim with these regulations was to promote a level playing field in digital markets, a secure cyberspace and the protection of fundamental rights online.

The Digital Services Act (Regulation (EU) 2022/2065, or ‘DSA’), in particular, aims to make online intermediary services – including content-sharing platforms and search engines – responsible for illegal activities carried out through them. The goal is to give concrete effect to the principle that what is illegal in the analogue world is also illegal in the digital world, and must be detected and punished with equal effectiveness.

Online platforms and large search engines

In order to achieve the goal of a secure web, the new regulatory architecture of the DSA starts from an observation: in addition to the type of service offered, the size of the service providers also matters.

The Regulation therefore provides for four categories of obligations of increasing intensity, graduated on the basis, on the one hand, of the degree of involvement of the service provider in the activities of end users and, on the other, of the size of the services themselves, in particular the number of active users within the Union. More specifically, platforms and search engines with an average of 45 million or more monthly active users in the Union are designated as ‘very large’ platforms and search engines[1].

The aim is to align the economic revenues of large online platforms with the responsibilities that come with being the dominant market players. In the words of EU Commissioner Thierry Breton, the EU intends to be the first market globally to turn the page on the era of ‘too big to care’ platforms.

Online platforms as regulated entities: the European Commission’s new powers

The obligations introduced for large platforms and search engines are specifically aimed at laying down clear rules for the management of online debate and advertising, with the goal of removing these from the discretion, and thus the arbitrary power, of the companies that administer them. The underlying reasoning is that, although they formally remain private entities whose purpose is to generate profit for their shareholders, in reality they host public debate, as well as the personal interactions of individual EU citizens.

Articles 34 to 43 of the DSA thus provide for a number of obligations additional to those imposed on smaller platforms and other types of online intermediary services. These include the obligation to assess the distortions and risks that content recommendation and moderation systems may entail, and to propose measures mitigating those risks, if necessary by modifying their algorithms and the functioning of the platforms themselves.

As a preliminary step, moreover, all platforms and search engines operating in the Union had to report to the Commission, by February 2023, the number of their monthly active users in the EU, so that they could be designated as ‘large’ or not. In this regard, it is interesting to note that complications have already arisen in the implementation of the rule, as some platforms merely declared that they did not reach the threshold of relevant users (45 million) instead of reporting the actual number[2].

The obligations imposed are matched by far-reaching supervisory and enforcement powers in the hands of the European Commission, modelled on the powers attributed to central banks with respect to credit institutions.

The Regulation provides, for instance, that the Commission may impose measures on platforms and search engines to prevent or counteract a ‘serious threat to public security or public health in the Union or in significant parts of it’ where such a crisis is fuelled or generated by the misuse of those intermediaries. It is not difficult to see here a reference to the recent spread of fake news and incitement to crime during the Covid-19 pandemic.

Furthermore, the obligation to make information on targeted advertising served through platforms or search engines easily accessible to users is matched by the Commission’s power to request access to the data needed to monitor compliance with the obligations, to order inspections, to obtain explanations of how content recommendation algorithms work and to impose fines. The aim is to prevent recommendation algorithms from carrying out de facto discrimination prohibited by EU law, or from targeting ‘vulnerable’ subjects such as minors (targeted advertising aimed at minors is, in fact, expressly prohibited).

A no less important aspect in the overall assessment of the new legislation is that the cost of setting up the system of obligations, and even of supervision by the competent authorities, will fall entirely on the regulated entities: the DSA expressly provides that these activities are to be financed by the platforms and search engines themselves. To this must be added the costs arising from the need to devote considerable resources to the design, supervision and implementation of control measures, as well as to handling the requests for information that will come from the Commission, national authorities and independent auditing bodies.

Conclusions

The DSA will become fully applicable as of February 2024. It is already clear, however, that the Union intends to take a firm stance on the regulation of so-called Big Tech, recognising its public relevance and thus the need for it to be regulated in all aspects of its operation that affect the exercise of European citizens’ fundamental rights.

However, since these are private-law entities with their main registered offices in non-EU countries (the United States, but also China), one has to wonder whether the considerable economic burden that such regulation places on platforms and search engines is justified by the profit they can make from the European market.

What is certain is that the European Commission does not seem willing to accept a standard of protection that does not fully guarantee the fundamental rights protected by the Union. From this perspective, if this leads to a decrease in the number of large players on the European market and to the emergence of a more diversified landscape of platforms and search engines, so much the better.


References:

[1] Regulation (EU) 2022/2065, Article 33.

[2] See in this respect the speech by Commissioner Thierry Breton at the Annual Conference of the European Commission: https://ec.europa.eu/commission/presscorner/detail/en/SPEECH_23_1761.


Author:

Caterina Bo
