“The question is not whether young people should have access to social media. The question is whether social media should have access to young people.”
That sentence — delivered by European Commission President Ursula von der Leyen on May 12, 2026, at the European Summit on Artificial Intelligence and Children in Copenhagen — marked the strongest signal yet that Brussels intends to redesign how platforms treat children. The Commission announced that its forthcoming Digital Fairness Act (DFA) will directly target “addictive and harmful design practices” — endless scrolling, autoplay, push notifications, and the algorithmic systems that quietly keep minors locked into their phones for hours every day.
Here is what families need to know about the new legislation, the active investigations against TikTok, Meta, and X, and what parents can do while the rules are still being drafted.
What Was Announced on May 12, 2026
Von der Leyen used her keynote in Copenhagen to outline three interlocking measures:
- A Digital Fairness Act, with a legislative proposal expected by the end of 2026, that will explicitly regulate addictive design and dark patterns in consumer-facing digital services.
- Stricter limits on AI in social media, including AI tools that generate intimate imagery — a direct response to revelations about X’s Grok generating sexual images of women and children.
- A push for minimum age requirements across the bloc, backed by the EU’s new age verification app launching in pilot countries this summer.
The DFA will sit on top of the existing Digital Services Act (DSA), which is already being used in active enforcement actions against the biggest platforms. The two laws together form Europe’s regulatory answer to a business model built on the attention of minors.
The TikTok Case: From Investigation to Preliminary Finding
To understand why the Commission is escalating, look at the TikTok file.
On February 6, 2026, the European Commission issued a preliminary finding that TikTok had breached the DSA. It was the first time the Commission targeted design rather than illegal content. The specific features cited:
- Infinite scroll that removes natural stopping points
- Autoplay of videos that continues without any user action
- Push notifications engineered to pull users back to the app
- Personalised recommender systems that profile each user — including minors — to maximise time spent
The Commission found that TikTok had not adequately assessed how these features could harm the physical and mental wellbeing of users, especially minors and vulnerable adults. It also found that TikTok had ignored clear signals of compulsive use — for example, the amount of time minors spend on the app at night, and the frequency with which young users reopen the app.
The Commission’s verdict on TikTok’s existing safeguards was blunt: screen-time management tools and parental controls were “easy to dismiss and introduce limited friction.” In other words, the safety features were window dressing.
If the preliminary finding is confirmed, TikTok faces fines of up to 6% of its global annual turnover — a number that could run into billions of euros.
Meta: Failing to Enforce Its Own Age Rule
The Commission’s case against Meta is different but no less serious. Instagram and Facebook both require users to be at least 13 years old. The Commission’s preliminary view, expressed in von der Leyen’s Copenhagen speech, is that Meta has failed to enforce its own minimum age.
A checkbox asking “Are you over 13?” has never stopped a determined ten-year-old. Internal Meta research leaked in 2021 showed the company already knew Instagram was linked to body image issues and depression in teenage girls. Meta did not significantly change the product.
In October 2025, the Commission also found that both TikTok and Meta had systematically blocked researchers from studying how content reaches children on their platforms — a separate violation of DSA transparency requirements. The combined exposure from those proceedings has been estimated at roughly 20 billion dollars in potential fines.
X and the Grok Problem
The third platform under scrutiny is X, where the Commission is examining the use of the Grok AI tool to generate sexual imagery of women and children. This is the clearest example yet of how generative AI is creating entirely new categories of harm — and why the Digital Fairness Act will include explicit limits on AI deployment in consumer social platforms.
What “Addictive Design” Actually Means
The Commission’s language is precise. “Addictive design” is not a general complaint about teenagers spending too much time online. It is a specific list of product decisions that, taken together, are designed to defeat self-control:
Endless feeds. A page that has no end teaches the brain there is always one more reward just below the fold.
Autoplay. Removing the user’s choice to start the next video shifts the default from “stop” to “continue.”
Variable rewards. Likes, comments, and “for you” surprises arrive on a schedule designed to mimic slot machines — unpredictable, but frequent enough to keep the user pulling the lever (a short simulation after this list shows the pattern in action).
Push notifications. A buzz in the pocket every few minutes means the user never has to decide to return; the app decides for them, paging them back whenever they have stayed away too long.
Personalised recommender systems. Algorithms that profile each user’s vulnerabilities — including those of children — and optimise the feed for engagement, not wellbeing.
Scientific research cited by the Commission shows that these patterns “fuel the urge to keep scrolling and shift the brain of users into autopilot mode,” reducing self-control and reinforcing compulsive behaviour. That research, and the Commission’s findings about TikTok, are the legal foundation for treating addictive design as a systemic risk — the threshold required for action under the DSA.
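For readers who want to see the mechanism rather than take it on faith, here is a small, self-contained Python simulation of a variable-ratio reward schedule, the reinforcement pattern behind the slot-machine comparison above. The refresh count and reward probability are illustrative assumptions, not figures from any platform.

```python
# Toy simulation of a variable-ratio reward schedule (illustrative numbers only).
# Each "refresh" is a pull of the lever; a reward (a new like, a surprise video)
# arrives unpredictably, but often enough to keep the loop going.
import random

random.seed(42)  # fixed seed so the demo is reproducible

def simulate_refreshes(n_refreshes: int, reward_probability: float) -> list[int]:
    """Return the gap (in refreshes) that preceded each reward."""
    gaps, since_last = [], 0
    for _ in range(n_refreshes):
        since_last += 1
        if random.random() < reward_probability:
            gaps.append(since_last)  # reward lands after an unpredictable gap
            since_last = 0
    return gaps

gaps = simulate_refreshes(n_refreshes=200, reward_probability=0.25)
print(f"{len(gaps)} rewards; gaps between them: {gaps[:12]} ...")
```

The irregular gaps are the point: because the next reward is never predictable, stopping never feels safe, which is precisely the loss of self-control the research describes.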
The Age Verification App: Pilot This Summer
The Commission’s enforcement strategy depends on knowing who is a child. That is the gap the new EU age verification app is designed to fill.
Announced as technically ready in April 2026, the app uses zero-knowledge proofs — cryptography that lets a user prove they are over a given age without revealing any other personal data. No birthdate, no passport scan, no face stored on a corporate server.
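The Commission has not published the app’s protocol as code, but the underlying idea can be sketched. The Python snippet below shows the simpler signed-attestation pattern that zero-knowledge age proofs generalise: a trusted issuer checks the birth year once, then signs a statement that contains no birthdate at all. Everything here is an illustrative assumption (the names, the message format, the choice of Ed25519), and a real zero-knowledge proof goes further by also making separate presentations of the credential unlinkable.

```python
# Illustrative sketch only: a signed "over N" attestation, NOT the EU app's
# actual protocol. Requires the third-party 'cryptography' package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Issuer side (e.g. a national ID wallet): sees the birth year exactly once.
issuer_key = ed25519.Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()  # published, so any platform can verify

def issue_attestation(birth_year: int, threshold: int, current_year: int) -> bytes | None:
    """Check the real birth year privately; sign a claim that omits it."""
    if current_year - birth_year < threshold:
        return None  # not old enough: no attestation is issued
    claim = f"over_{threshold}".encode()  # carries no birthdate, name, or ID
    return issuer_key.sign(claim)

# Platform side: receives only the signature, never any personal data.
def verify_attestation(signature: bytes, threshold: int) -> bool:
    try:
        issuer_pub.verify(signature, f"over_{threshold}".encode())
        return True
    except InvalidSignature:
        return False

token = issue_attestation(birth_year=2008, threshold=16, current_year=2026)
print(verify_attestation(token, threshold=16))  # True: age proven, nothing else revealed
```

However strong the real cryptography is, the privacy promise is the same one described above: the platform learns a single yes-or-no answer and nothing else.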
Seven member states are integrating the app into their national digital identity wallets this year:
- 🇩🇰 Denmark — host of the May 12 announcement, already pushing an under-15 ban
- 🇫🇷 France — Senate passed an under-15 ban in April 2026
- 🇬🇷 Greece — under-15 ban announced for January 2027
- 🇮🇹 Italy — digital age of consent set at 14
- 🇪🇸 Spain — Organic Law for Protection of Minors in parliamentary phase
- 🇨🇾 Cyprus — under-15 ban announced, EU app + Digital Citizen integration
- 🇮🇪 Ireland — Coimisiún na Meán enforces binding online safety codes
Public download is expected by summer 2026, with national wallet integration following through the end of the year.
How the DFA Will Change the Rules
The Digital Fairness Act is still a draft, but the Commission has been clear about its scope. The proposal — due between October and December 2026 — will target:
- Addictive design in consumer-facing digital products
- Dark patterns that nudge users into unwanted choices
- Personalised pricing based on tracking and profiling
- Influencer marketing to children
- In-game currency mechanics that resemble gambling
- AI deployment in social media services
Unlike the DSA, which focuses on the largest “Very Large Online Platforms,” the DFA is expected to apply more broadly — across consumer markets — and to interact with existing consumer-protection law. For families, the practical upshot is that the legal definition of an “unfair” digital service will widen to include the very design choices that keep children glued to their screens.
What Parents Can Do Right Now
The DFA is still a proposal. The age verification app is still rolling out. The TikTok proceedings will play out over months or years. None of that helps a child who is on the app tonight.
Five concrete steps that work today:
Switch off push notifications for every social media app on your child’s phone. This is the single change with the biggest effect on compulsive checking. Many of the addictive-design features the EU has flagged depend on notifications doing the work of pulling users back.
Use device-level screen time controls (iOS Screen Time, Google Family Link) to set firm daily limits. These tools are not perfect — the Commission found TikTok’s own version “easy to dismiss” — but a system-level limit set by a parent is harder to bypass than an in-app reminder.
Add a portable filtering layer. Device controls cover one device, router controls cover one network. A family VPN with DNS filtering blocks social media categories on the child’s device wherever it connects — home Wi-Fi, mobile data, school networks, holidays. (A brief sketch of how that filtering decision works follows these five steps.)
Have the conversation. The EU’s findings are not secret. Explaining to a thirteen-year-old that the Commission has formally accused TikTok of designing the app to make them lose track of time is a more useful conversation than a generic “spend less time on your phone.”
Watch the sleep signal. The Commission specifically cited nighttime use by minors as an indicator of compulsive behaviour that TikTok ignored. If a child’s phone is active between midnight and 6 a.m., that is one of the clearest red flags.
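To make step three less abstract, the sketch below shows the core decision a category-based DNS filter makes for every hostname a device tries to resolve. The domain lists and category names are placeholder assumptions; real filtering services maintain far larger, continuously updated lists.

```python
# Minimal sketch of category-based DNS filtering: before a hostname is
# resolved, check it (and its parent domains) against blocked categories.
# The entries below are illustrative placeholders, not a real blocklist.
BLOCKLIST: dict[str, set[str]] = {
    "social_media": {"tiktok.com", "instagram.com", "x.com"},
    "gambling": {"example-casino.test"},
}

def should_block(hostname: str, blocked_categories: set[str]) -> bool:
    """Return True if the hostname falls in any category the family has blocked."""
    labels = hostname.lower().rstrip(".").split(".")
    # Walk every parent suffix so "www.tiktok.com" matches the "tiktok.com" entry.
    suffixes = {".".join(labels[i:]) for i in range(len(labels))}
    return any(suffixes & BLOCKLIST.get(cat, set()) for cat in blocked_categories)

print(should_block("www.tiktok.com", {"social_media"}))    # True:  refuse to resolve
print(should_block("en.wikipedia.org", {"social_media"}))  # False: resolve normally
```

Checking parent domains is what lets a single “tiktok.com” entry cover every subdomain, which is why a category filter is harder to sidestep than blocking individual apps.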
The Road Ahead
The Digital Fairness Act will not be law in 2026 — proposals at this stage take eighteen months or more to clear the Parliament and the Council. But the direction of travel is clear. The DSA already gives the Commission the tools to fine platforms billions of euros for addictive design. The DFA will extend those tools to AI, to dark patterns, and to a wider range of digital services. The age verification app will make minimum-age rules enforceable for the first time.
For parents, the takeaway is that Europe is finally treating the design of social media — not just the content on it — as the problem. The era of “we just provide a neutral platform” is ending. The era of designs being judged on whether they harm minors is beginning.
Related reading: how tech companies turned addiction into a business model, the EU age verification app and what parents can do today, and whether TikTok is safe for children.
This article reflects EU regulatory developments as of May 13, 2026. The Digital Fairness Act is a proposal in development; the Digital Services Act is in force and being actively enforced. For a wider view of national laws, see our global overview of 2026 child protection legislation.