The European Commission has opened formal proceedings to investigate whether Snapchat is ensuring a high level of safety, privacy and security for children online, as required by the Digital Services Act (DSA). The move marks an escalation in regulatory scrutiny of social media companies and their obligations to protect minors from online harm.
The investigation follows a review of the platform's risk assessments from 2023 to 2025 and additional information received last October about age verification and illegal activity. At the centre of regulators' concerns is whether the platform's self-declaration system for age verification is adequate. Snapchat requires users to be at least 13 years old to sign up for an account, but a system that relies on users declaring their own age may not prevent younger children from accessing the platform.
The Commission will focus on five areas: age assurances, grooming and recruitment of minors for criminal activities, inadequate default account settings, dissemination of information on the sale of banned products, and reporting of illegal content. Each concern points to potential systemic weaknesses in how the platform screens users and protects those who gain access.
A key fear is that Snapchat is "not adequately protecting" children from being contacted by users seeking to sexually exploit them or recruit them for criminal activities, for example, by allowing adults to pretend to be minors. Regulators are also scrutinising whether users can buy illegal products, such as drugs, vapes and alcohol, through the platform because content moderation fails to limit videos with information on how and where to obtain them.
Snapchat has implemented a range of safety features in recent years. Teens on Snapchat (ages 13-17) have additional layers of protection with strict settings turned on by default. The platform also reports using human review and machine learning to filter age-inappropriate content from recommendations. Yet regulators question whether these measures go far enough, particularly given the app's long association with a young user base.
The company has responded to the investigation by emphasising its commitment to safety. Snapchat said the safety and wellbeing of its users was a "top priority". "As online risks evolve, we continuously review, strengthen, and invest in these safeguards," a spokesperson said. "We have fully cooperated with the commission to date -- engaging proactively, transparently and working in good faith to meet the DSA's high safety standards -- and we will continue to do so."
The Digital Services Act requires internet companies and online platforms to do more to protect European users from harmful content and illegal goods, or risk fines of up to 6% of global annual turnover. The stakes for Snapchat are high: non-compliance with DSA investigations can result in binding orders to change platform operations or substantial financial penalties.
The probe adds to the pressure that social media companies are facing on both sides of the Atlantic over the welfare of young people. On Wednesday, a California jury awarded millions of dollars in damages to a 20-year-old woman after finding that Meta and YouTube designed their platforms to hook young users without concern for their well-being. Snapchat parent company Snap Inc. and TikTok were also named in the lawsuit but settled for undisclosed sums before the trial.
The investigation reflects a fundamental tension that platforms must navigate: how to operate services that are genuinely useful and engaging for young people while building credible safeguards against exploitation and harm. Self-declared ages have long been recognised as a weak safeguard, yet implementing robust age verification raises privacy concerns and technical complexity that regulators must also weigh.
Snapchat is among the more than 20 very large online platforms that must adhere to the DSA's tougher rules or risk fines of up to 6% of global turnover, or even a ban for serious and repeated violations. There is no deadline for completing the investigation, and Snapchat can offer commitments to address the EU's concerns; the formal process gives the company an opportunity to propose remedial measures, though regulators will determine whether these are sufficient.