EU Investigation Guide: Snapchat & Porn Sites Face Child Safety Probe Under DSA
The European Commission has launched a formal investigation into Snapchat and several major pornographic websites under the Digital Services Act (DSA) over serious concerns about child protection failures. Announced on March 26, 2026, the action targets Snapchat's alleged exposure of minors to grooming, criminal recruitment, and illegal product sales, and examines whether the porn platforms' age verification systems are adequate to keep children away from explicit content.
What is the Digital Services Act (DSA)?
The Digital Services Act is landmark European Union legislation that entered into force in 2022, establishing comprehensive legal frameworks for digital service accountability, content moderation, and platform transparency. The DSA creates tiered obligations based on service size and risk levels, with the strictest requirements applying to Very Large Online Platforms (VLOPs) with over 45 million monthly active users in the EU. This legislation represents the EU's most ambitious effort to regulate online spaces and protect users, particularly vulnerable groups like children, from systemic risks and illegal content.
Snapchat Investigation: Five Critical Areas of Concern
The European Commission's investigation into Snapchat focuses on five specific areas where the platform may have violated DSA requirements:
1. Age Verification and Under-13 Access
Regulators question whether Snapchat's age verification systems are sufficient to prevent children under 13 from accessing the platform. Although Snapchat requires users to be at least 13 years old, evidence suggests it relies primarily on self-declared ages, a method that has proven ineffective. "Snapchat appears to have overlooked DSA requirements for high safety standards, particularly concerning grooming risks and account settings that undermine minors' safety," stated EU Commissioner Henna Virkkunen.
2. Grooming and Criminal Recruitment Risks
The investigation examines whether Snapchat adequately protects minors from grooming attempts and recruitment for criminal activities. Preliminary findings indicate adults may be pretending to be minors on the platform to lure children, with insufficient safeguards to prevent such interactions.
3. Illegal Product Sales and Content
Snapchat faces scrutiny over its content moderation tools' effectiveness in preventing the dissemination of information about illegal goods. The platform has been identified as a channel for promoting the sale of drugs, vapes, and alcohol to minors, with regulators questioning whether adequate measures exist to block such content.
4. Default Privacy and Safety Settings
The Commission is investigating whether Snapchat's default settings provide sufficient privacy, safety, and security protections for younger users. Concerns include the 'Find Friends' system recommending children to adult users and push notifications remaining on by default, potentially exposing minors to inappropriate contacts.
5. Reporting Mechanisms Accessibility
Regulators are examining whether Snapchat provides user-friendly, accessible mechanisms for reporting illegal content and safety concerns, as required under DSA provisions for transparent content moderation systems.
Pornographic Platforms Under Scrutiny
In a parallel action, the European Commission has accused four major pornographic websites – Pornhub, Stripchat, XNXX, and XVideos – of failing to implement effective age verification systems to prevent minors from accessing explicit content. Despite terms of service requiring users to be adults, these platforms appear to lack robust mechanisms to enforce age restrictions, potentially exposing children to inappropriate material.
The porn site investigation follows similar regulatory actions against other tech platforms, including the EU's probe into TikTok's addictive design features, which exposed the company to potential billion-euro fines earlier this year. The pattern reflects the EU's systematic approach to enforcing digital safety standards across different platform categories.
Potential Consequences and Fines
If found in violation of DSA requirements, Snapchat could face fines of up to 6% of its global annual revenue. For context, Snap Inc. reported $4.6 billion in revenue for 2025, meaning potential penalties could reach approximately $276 million. The pornographic platforms face similar financial risks, with additional possible measures including mandated changes to their age verification systems and content moderation practices.
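The fine ceiling cited above is straightforward to verify. The sketch below is a quick arithmetic check, assuming the DSA's 6% cap is applied to the $4.6 billion revenue figure reported in the article (the actual penalty, if any, would be set by the Commission and could be far lower):

```python
# Rough check of the maximum possible DSA fine cited above.
# Assumption: the 6% cap applies to Snap's reported $4.6B annual revenue.
revenue_usd = 4.6e9   # Snap Inc. revenue figure from the article
dsa_fine_cap = 0.06   # DSA maximum: 6% of global annual revenue

max_fine = revenue_usd * dsa_fine_cap
print(f"${max_fine / 1e6:.0f} million")  # → $276 million
```

This matches the approximately $276 million figure quoted in the article.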
This regulatory action builds on previous investigations into vape sales on social media platforms, where Dutch authorities identified ongoing issues with illegal product promotion targeting minors. The coordinated approach between EU and national regulators demonstrates the comprehensive enforcement strategy emerging under the DSA framework.
Industry Response and Compliance Efforts
In a statement responding to the investigation, Snapchat emphasized its commitment to user safety: "We have always proactively, transparently, and in good faith complied with European legislation and will continue to do so throughout the investigation. The platform was designed from the beginning to ensure privacy and security, and additional protections for children have been built into the app."
The company highlighted existing safety features including parental controls, reporting tools, and content moderation systems. However, regulators remain concerned about whether these measures meet the stringent requirements established by the DSA, particularly regarding systemic risk assessments and age verification effectiveness.
Broader Implications for Digital Regulation
This investigation represents a significant test case for the DSA's enforcement mechanisms and the EU's ability to hold global tech platforms accountable for child protection failures. The outcome will likely influence regulatory approaches worldwide, as governments increasingly seek to balance innovation with user safety in digital spaces.
The EU's actions reflect growing international concern about social media's impact on youth mental health, with similar regulatory pressures emerging in the United States, United Kingdom, and other jurisdictions. As digital platforms continue to evolve, the balance between user protection, privacy, and innovation remains a central challenge for policymakers globally.
FAQ: EU Investigation into Snapchat and Porn Sites
What triggered the EU investigation into Snapchat?
The European Commission launched the investigation based on preliminary findings from a smaller probe initiated in May 2025, which identified concerns about Snapchat's child protection measures, particularly regarding grooming risks, illegal product promotion, and inadequate age verification.
Which porn sites are being investigated?
The European Commission is investigating Pornhub, Stripchat, XNXX, and XVideos for allegedly failing to implement effective age verification systems to prevent minors from accessing explicit content on their platforms.
What penalties could Snapchat face?
If found in violation of DSA requirements, Snapchat could face fines of up to 6% of its global annual revenue, which based on 2025 figures could amount to approximately $276 million.
How does the DSA protect children online?
The Digital Services Act establishes specific obligations for online platforms to implement robust age verification systems, protect minors from harmful content, provide transparent content moderation, and conduct regular risk assessments regarding child safety.
When will the investigation conclude?
The European Commission has not announced a specific timeline for concluding the investigation, but similar DSA probes typically take several months to complete, with potential enforcement actions following the findings.
Sources
European Commission Official Announcement