European Union regulators have launched formal inquiries into Apple, Snapchat, and Google’s YouTube, escalating a broad crackdown on major digital platforms over potential failures to protect children from online harms. The European Commission, the EU’s executive arm, is demanding the companies provide detailed information on how they mitigate risks to minors, from exposure to illegal content and addictive designs to insufficient age verification. The move marks a significant enforcement action under the Digital Services Act (DSA), a landmark law designed to hold tech giants accountable for the content on their services.
The probes, officially initiated through formal requests for information, could lead to severe financial penalties if the platforms are found to be in breach of the DSA’s stringent child safety mandates. Companies face potential fines of up to 6% of their global annual turnover, a sum that could reach billions of dollars. These actions are part of a wider EU strategy to create a safer digital environment for young users, which includes ongoing investigations into other major platforms like Meta’s Facebook and Instagram, and signals a new era of intense regulatory scrutiny for the tech industry’s operations within the 27-nation bloc.
The Digital Services Act and Its Mandates
The investigation is grounded in the legal framework of the Digital Services Act, which took full effect in early 2024. The DSA imposes a tiered system of obligations on online intermediaries, with the most stringent rules reserved for “Very Large Online Platforms” (VLOPs)—services with more than 45 million monthly active users in the EU. Apple’s App Store, Google Play, Snapchat, and YouTube all fall into this category, subjecting them to heightened responsibilities.
At the core of the law’s child protection scheme is Article 28, which compels platforms accessible to minors to implement measures ensuring a “high level of privacy, safety, and security.” Beyond this general duty, the DSA requires VLOPs to conduct comprehensive annual risk assessments. These audits must evaluate systemic risks stemming from their service design and algorithmic systems, including potential negative effects on the physical and mental well-being of children. Following these assessments, platforms are legally required to deploy “reasonable, proportionate and effective” measures to mitigate the identified dangers. This could involve redesigning user interfaces to be less addictive, enhancing content moderation, and strengthening age assurance systems.
To guide compliance, the Commission has also issued non-binding guidelines that recommend specific actions, such as making minors’ accounts private by default, deactivating manipulative features that encourage prolonged use, and prohibiting the downloading or screenshotting of content posted by children.
Specific Allegations Against Each Platform
While the inquiries are broad, regulators have outlined specific areas of concern for each company, targeting the unique functions of their services.
Snapchat’s Age Gates and Moderation
The investigation into Snap, the parent company of Snapchat, focuses on two primary issues: age verification and the sale of illegal goods. The Commission is demanding to know how the platform prevents children under the age of 13 from creating accounts, a direct violation of Snapchat’s own terms of service. Regulators are also scrutinizing the effectiveness of features meant to stop the sale of illicit products, such as vaping devices and drugs, to underage users through the platform. The inquiry further touches upon potentially addictive design features, such as “streaks,” which reward users for continuous daily interaction.
YouTube’s Algorithms and Age Systems
For Google’s YouTube, the probe centers on its age-assurance systems and the powerful algorithmic recommender system that drives user engagement. The Commission is acting on reports that harmful content is being disseminated to minors through these automated recommendations. Officials are seeking detailed information on how YouTube assesses and mitigates these risks to prevent children from being led down “rabbit holes” of inappropriate or dangerous material. Google states it has built age-appropriate experiences with robust parental controls developed in consultation with child-development experts.
Apple and Google App Marketplaces
Apple’s App Store and the Google Play store are being investigated for their roles as gatekeepers to the mobile app ecosystem. The Commission’s inquiry examines the measures in place to prevent minors from finding and downloading applications that are illegal or harmful. This includes apps related to gambling, explicit sexual content, or so-called “nudify” applications that generate non-consensually altered images. Regulators are also assessing how effectively the platforms apply their own age-rating systems to ensure developers accurately classify their apps’ content.
The Path of the Investigation
The current requests for information represent the first formal step in the DSA’s enforcement process. This preliminary stage allows the Commission to gather evidence directly from the companies about their internal policies, systems, and risk-mitigation efforts. The platforms are legally obligated to provide the requested data to regulators.
If the Commission finds the companies’ responses unsatisfactory or uncovers evidence of potential breaches, it has the power to open a “formal infringement proceeding.” This would trigger a much deeper, in-depth investigation that could involve further evidence-gathering powers, including on-site inspections and compelled interviews. There is no statutory deadline for the conclusion of such proceedings, as their duration depends on the complexity of the case and the extent of the companies’ cooperation. Ultimately, this process can conclude with a non-compliance decision, the acceptance of binding commitments from the companies to remedy their practices, or the levying of significant fines.
A Widening Regulatory Front
These probes do not exist in isolation but are part of a determined and expanding effort by the EU to regulate the digital sphere. The Commission already has formal proceedings underway against other tech giants, including Meta and X (formerly Twitter), for issues ranging from election disinformation to addictive design. This consistent application of the DSA underscores a fundamental shift in the EU’s approach, moving from self-regulation to active and forceful state oversight.
The regulatory push extends beyond DSA enforcement. EU officials and member states are actively discussing the possibility of establishing a bloc-wide “digital age of majority,” a concept inspired by an Australian law that restricts social media access for users under 16. Alongside these policy debates, the EU is developing its own technologies to aid enforcement, including piloting a privacy-preserving age verification app that could eventually integrate with national digital identity wallets. This multi-pronged strategy highlights a growing political consensus in Europe that the well-being of children online requires robust, legally binding safeguards that are consistently enforced across the digital single market.