BRUSSELS. The European Parliament has given its final approval to the Digital Services Act (DSA) and the Digital Markets Act (DMA), which aim to address the social and economic effects of the technology sector by establishing clear rules for how digital services operate and are provided in the EU. To counter the spread of illegal content, online disinformation and other risks to society, the DSA sets out clear obligations for digital service providers, such as social media platforms and online marketplaces. These obligations are proportionate to the size of the platforms and the risks they pose.
The new obligations include: measures to counter illegal content online, with a requirement for platforms to react quickly while respecting fundamental rights such as freedom of expression and data protection; stronger traceability of and checks on traders in online marketplaces to ensure the safety of products and services, including random checks on whether illegal content reappears; and greater transparency and accountability for platforms, for example through clear information on content moderation and on the algorithms used to suggest content (so-called recommender systems). Users will be able to contest content moderation decisions. There is also a ban on misleading practices and on certain types of targeted advertising, such as advertising aimed at minors or based on sensitive data. In addition, so-called “dark patterns” and other deceptive practices designed to manipulate users’ choices will be prohibited.
Very large online platforms and search engines (those with 45 million or more monthly users), which present the highest risk, will have to comply with stricter obligations enforced by the Commission. These include preventing systemic risks (such as the dissemination of illegal content and adverse effects on fundamental rights, electoral processes, gender-based violence and mental health) and undergoing independent audits. These platforms will also have to offer users the option of not receiving recommendations based on profiling, and will have to give authorized authorities and researchers access to their data and algorithms. The Digital Markets Act, for its part, establishes obligations for large online platforms acting as “gatekeepers” (platforms whose dominant online position makes them difficult for consumers to avoid), to ensure a fairer business environment and more services for consumers.
To prevent unfair business practices, companies designated as gatekeepers will have to allow third parties to interoperate with their services. This means that smaller platforms will be able to request that dominant messaging services let their users exchange messages, voice messages or files across messaging apps, giving users wider choice and avoiding the so-called “lock-in” effect, i.e., being tied to a single app or platform. Gatekeepers will also have to allow business users to access the data they generate on the gatekeeper’s platform, so that they can promote their own offers and conclude contracts with their customers outside the gatekeeper’s platform.
Gatekeepers will no longer be able to rank their own products or services more favorably than those of other market operators (self-preferencing); prevent users from easily uninstalling pre-installed software or apps, or from using third-party applications and app stores; or process users’ personal data for targeted advertising without their explicit consent. To ensure that the new provisions of the Digital Markets Act are implemented correctly and keep pace with the continuing evolution of the digital sector, the Commission may carry out market investigations. If a gatekeeper does not comply with the rules, the Commission can impose fines of up to 10% of its total worldwide turnover in the preceding financial year, rising to 20% in the event of repeated non-compliance.