In the eighteen months after Australia became the first country to ban social media for under-16s, the policy spread to Europe. By May 2026, five countries there had passed, or were actively legislating, their own variants. France, Greece, Denmark and Cyprus settled on 15 as the cutoff. Turkey landed on 15 too, but with a markedly different enforcement model that has drawn criticism from civil liberties groups.
This article compares the five laws side by side: what each one prohibits, how it gets enforced, what penalties apply, and how they fit into the wider EU framework that the Digital Services Act and the new EU age verification app are stitching together.
For the live country status table across all jurisdictions, see our country tracker.
The five laws at a glance
| Country | Age | Status | Effective | Enforcement | Notable feature |
|---|---|---|---|---|---|
| 🇫🇷 France | 15 | Passed | 2026 | Arcom blacklist + SREN | Senate vote April 2026 |
| 🇬🇷 Greece | 15 | In Progress | January 2027 | Education ministry portal | Announced April 2026 |
| 🇩🇰 Denmark | 15 | In Progress | TBD | EU age-verification app | Host of May 2026 EU summit |
| 🇨🇾 Cyprus | 15 | Passed | 2026 | EU app + Digital Citizen ID | Sanctions up to 6% turnover |
| 🇹🇷 Turkey | 15 | Passed | 2026 | Centralised state ID | Surveillance-state concerns |
All five sit on top of the EU’s Digital Services Act, which already gives the Commission tools to fine platforms up to 6% of global annual turnover for systemic risks to minors — including the addictive-design risks the Commission preliminarily found against TikTok in February 2026.
Where the laws agree
Four common features run through all five:
The age limit is 15, not 13. This is a deliberate break from the GDPR’s digital-consent baseline (13–16, varying by member state) and from the platforms’ own terms of service (typically 13). Regulators have argued that executive function and emotional regulation are still developing through ages 15–16, leaving younger adolescents poorly equipped to resist the reward-loop business model of social media. The 15-year line is now the European norm.
The scope is “social media,” not “the internet.” All five laws target services whose primary function is feed-based social posting — Instagram, TikTok, Snapchat, X. Messaging apps, online games, search engines, and educational platforms are carved out. This is a narrower scope than US-style “Kids Online Safety” approaches, which try to regulate a broader category of “online services likely accessed by minors.”
Enforcement depends on age verification. None of these laws are operational without a way to know who is under 15. France, Greece, Denmark, Cyprus and Ireland have all signed up as pilot countries for the EU age verification app, which the Commission announced as technically ready in April 2026. The app uses zero-knowledge proofs — a user can prove they are over a threshold without revealing their birthdate or any other personal data. Without this kind of infrastructure, the laws are aspirational; with it, the rules become operational.
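To make the privacy claim concrete, here is a deliberately simplified sketch of the attestation idea behind such systems: a trusted issuer checks the birthdate privately and signs only the over/under result, so the platform never sees a date of birth. This is an illustrative toy, not the EU app’s actual protocol — the real system uses zero-knowledge proofs and public-key credentials, whereas this sketch uses a shared HMAC key, and all names here are hypothetical.

```python
import hmac, hashlib, json

# Hypothetical shared demo key. A real issuer would use public-key
# signatures so verifiers never hold signing material.
ISSUER_KEY = b"demo-issuer-secret"

def issue_age_attestation(birth_year: int, current_year: int,
                          threshold: int = 15) -> dict:
    """Issuer checks the birthdate privately, then signs only the
    boolean predicate. The attestation carries no birthdate."""
    claim = {"age_over": threshold,
             "result": current_year - birth_year >= threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verify_attestation(att: dict) -> bool:
    """Platform checks the issuer's signature and the predicate.
    It learns 'over 15 or not' -- nothing else about the user."""
    payload = json.dumps(att["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(att["sig"], expected) and att["claim"]["result"]

att = issue_age_attestation(birth_year=2012, current_year=2026)
print(verify_attestation(att))  # a 14-year-old fails the over-15 check: False
```

The design point this illustrates is data minimisation: the only fact that travels to the platform is a signed yes/no, which is what lets the laws be enforceable without building a new pool of minors’ identity data.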
Penalties tie back to the DSA. Several countries explicitly reference the DSA’s 6%-of-global-turnover ceiling rather than inventing national fines. Cyprus does this most clearly; France’s SREN law layers a national enforcement track on top.
Where the laws diverge
France: Arcom blacklist + SREN platform obligations. The French model uses Arcom, the audiovisual regulator, to maintain a blacklist of services that fail to enforce the under-15 limit. The SREN law also gives the regulator authority to require ISPs to block listed services. Enforcement is centralised through Arcom; platforms must implement age checks compatible with the EU app or face listing.
Greece: education-portal route. Greece announced the under-15 ban in April 2026 with January 2027 as the target effective date. Enforcement runs through the education ministry’s national student portal — the same infrastructure that links school records and Gov.gr digital identity. The model assumes that a centralised state identity layer is the cleanest age-verification path. The trade-off: data is concentrated in a single state system, which is a different privacy posture from the EU app’s zero-knowledge model.
Denmark: pure EU-app integration. Denmark hosted the European Summit on AI and Children on 12 May 2026, where Ursula von der Leyen announced the Digital Fairness Act. The Danish model leans heavily on the EU age verification app as the enforcement layer, with national legislation framing the obligation but not building parallel infrastructure.
Cyprus: dual-stack with national Digital Citizen ID. Cyprus combines the EU age verification app with the country’s existing Digital Citizen identity wallet. Offering two paths to the same age proof, the government argues, reduces the risk that EU-app rollout delays slow national enforcement. Sanctions match the DSA’s 6%-of-turnover ceiling.
Turkey: centralised state identity. Turkey’s model uses the country’s e-Devlet digital identity infrastructure for age verification. This is functionally similar to the Greek approach but raises sharper concerns: Turkey’s track record on press freedom, opposition surveillance, and platform takedowns makes a state-controlled identity layer for social media access a fundamentally different proposition from the EU-app model. See our Turkey article for the civil-liberties dimension.
What the laws don’t yet address
Existing accounts. Most of the laws apply to new sign-ups. What happens to a 14-year-old who already has an Instagram account? France and Cyprus have signalled platforms must terminate underage accounts; Greece is still drafting. None has published a clear deadline.
Cross-border enforcement. A French 14-year-old using a VPN to register as a German 14-year-old (where Germany has no under-15 ban) sits in a regulatory grey zone. The DSA gives platforms a duty of care that arguably extends across borders, but the practical enforcement question — which national regulator pursues which platform — is unresolved.
Parental override. Several drafts include a parental-consent path: a parent can opt a 14-year-old in with verified consent. The details (how verified, how revocable, how recorded) vary across the five laws and are still being worked out.
Educational and family-related platforms. Discord, Telegram, Roblox, Minecraft, and YouTube sit at the edge of the “social media” definition. Each law draws the line slightly differently, and platforms with multiple modes (a feed and a chat and a marketplace) are likely to be reclassified one piece at a time.
How this fits the bigger EU picture
The national bans are one of three tracks the EU is running:
- National under-15 laws (this article): platform-side obligations, enforced by national regulators.
- The Digital Services Act: bloc-wide platform regulation, with the Commission directly fining systemic risks. The preliminary finding against TikTok for addictive design is the first major test.
- The Digital Fairness Act: a proposal due Q4 2026 that will extend the DSA framework to addictive design, dark patterns, in-game currency, influencer marketing to minors, and AI deployment in social media.
These three tracks are not parallel — they layer. A French 14-year-old is covered by SREN (national), by the DSA (EU-level, addictive design), and (when it lands) by the Digital Fairness Act. Penalties stack. The age verification app is the connective tissue that makes any of it enforceable.
What this means for parents
The practical state of play in mid-2026:
- If you are in France, Cyprus, Greece, Denmark or Turkey, social-media services for under-15s are either already legally restricted or will be within twelve months.
- Enforcement bites once the age verification infrastructure is live — public download is expected by summer 2026, with national wallet integration through year-end.
- Existing accounts are a separate question; platforms are expected to terminate underage accounts but timelines are not published.
- The DSA gives the EU Commission a tool to fine platforms billions for addictive design independent of the national bans. That track is already moving, regardless of what national legislatures do.
For practical steps families can take while the rules are still being finalised, see the EU age verification app guide and the layered online protection guide.
The country deep-dives
- 🇫🇷 France — Senate passes under-15 ban
- 🇬🇷 Greece — under-15 ban announced for January 2027
- 🇩🇰 Denmark — under-15 ban + host of the May 2026 EU summit
- 🇨🇾 Cyprus — under-15 ban + EU app + Digital Citizen ID
- 🇹🇷 Turkey — under-15 ban with surveillance-state dimension
- 🇩🇪 Germany — the debate over a national social-media age limit
For the live status table across all countries, see our tracker. For the EU-level frame, see the global overview of 2026 child protection laws.
This is a synthesis article. Each claim about a specific country’s law links to the deep-dive article for that country, where the primary sources (legislative texts, regulator announcements, parliamentary records) are cited. To flag an outdated status or report a new development, email service@agiliton.eu.