The European Commission has formally opened an investigation into Snapchat to determine whether the platform has failed to adequately protect minors, potentially exposing them to serious risks such as sexual exploitation and criminal recruitment.
According to the Commission, there are concerns that weaknesses in the platform’s systems may allow adults to impersonate younger users, making it easier to approach children for illegal purposes. These risks include grooming, exposure to harmful or illicit content, and manipulation into criminal activities.
Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy, stated that platforms operating under EU law are required to uphold strict safety standards, especially when it comes to minors. She emphasized that Snapchat may have fallen short of key obligations under European digital regulations.
The investigation is being conducted under the Digital Services Act (DSA), which imposes stringent requirements on very large online platforms to assess and mitigate risks to their users. The probe follows a review of Snapchat’s internal risk assessments covering the period from 2023 to 2025, as well as additional information the company provided on its age verification systems and on illegal activities on the platform.
This step marks the opening of formal proceedings and could lead to further enforcement measures if violations are confirmed. Potential outcomes include significant fines, mandatory changes to platform policies, or stricter oversight mechanisms to ensure compliance.
Snapchat may also respond by introducing new safeguards, particularly in areas such as age verification, content moderation, and user protection tools, in an effort to align with EU standards and avoid penalties.
The case highlights growing pressure on major tech platforms to strengthen protections for young users, as regulators across Europe intensify scrutiny of digital environments where children spend increasing amounts of time.
