
Pornhub, Stripchat, XVideos under EU scrutiny: here’s why and the consequences

by admin

Life is about to get harder for the best-known porn sites, and easier for the vulnerable people who, in one way or another, suffer the effects of those sites.
These are the consequences of a European Commission decision of 20 December. Under the Digital Services Act, the Commission designated three aptly named companies as "very large online platforms", with the obligations that status entails: Pornhub, Stripchat and XVideos.
All three services exceed the threshold of 45 million average monthly users in the EU.
Under the Digital Services Act, all online platforms and search engines (except very small ones) must comply with a set of obligations by 17 February 2024. Two of those obligations are especially relevant for porn platforms: control of illegal content and protection of minors.
On top of those baseline obligations, Pornhub, XVideos and Stripchat will have to take specific measures to empower and protect users online, including minors, and to duly assess and mitigate any systemic risks arising from their services.

Illegal content

The Commission points in particular to risks linked to the dissemination of illegal content online, such as child sexual abuse material, and of content that threatens fundamental rights, such as the rights to human dignity and private life in cases of non-consensual sharing of intimate material or of deepfake porn generated with artificial intelligence, sometimes without the permission of the people portrayed. "These measures may include adapting terms and conditions, interfaces, moderation processes or algorithms"; the platforms "must strengthen their internal processes, resources, testing, documentation and supervision of all their activities related to the identification of systemic risks", reads a note from the Commission.

These are very concrete requirements, considering what has happened in recent years. The first widely known victim of sexual exploitation on Pornhub was Rose Kalemba: a video of her being brutally sexually assaulted at the age of 14 was uploaded to Pornhub, and it remained there for a long time despite her reports to the company. In 2021 The New York Times profiled other survivors of similar abuse, forced to relive it because of platforms of this type and, in particular, because of the way those platforms have handled the problem so far: with little care in managing user reports.

Hence the intervention of the European Commission, which follows principles similar to those it applies to other big tech companies: remove harmful content promptly and treat user reports with the utmost attention. Pornhub has itself acknowledged hosting content involving minors (child sexual abuse material) and even victims of sex trafficking, and in recent years it has carried out mass removals of videos, by the millions. There is also the case of 22 victims who sued MindGeek, Pornhub's parent company, for $80 million: they were deceived, and in some cases drugged, into making porn, and their videos ended up online. Again, Pornhub kept videos of these survivors on its platform for months despite their reports. Last year, two top MindGeek executives resigned following allegations that the site does not promptly or sufficiently remove content involving non-consensual sex and minors.


Strong protection of minors

The other protection front concerns minors' access to porn platforms. The Commission requires the three services to "design their services, including their interfaces, recommender systems and terms and conditions, to address and prevent risks to the well-being of children", and to adopt "mitigation measures to protect the rights of the child and prevent minors from accessing pornographic content online, including with age verification tools". "Risk assessment reports should in particular detail any negative effects on the protection of the physical and mental health of minors."

Greater transparency and accountability

Among the other relevant measures imposed by the Commission: these platforms must undergo independent external verification of their compliance with the law; they must give researchers access to publicly available data, including researchers designated by the digital services coordinators; they must meet additional transparency requirements, including publishing transparency reports on content moderation decisions and risk management every six months, as well as reports on systemic risks and audit results once a year; and they must appoint a compliance function and undergo an independent external audit every year. The Commission will "closely monitor these platforms' compliance with their obligations, in particular as regards the measures to protect minors from harmful content and the dissemination of illegal content."
