European Parliament backs social media age limit of 16 to shield children from harm

by Newsroom

Source: in-cyprus.philenews.com

The European Parliament has called for a minimum age of 16 across the EU for social media access and sweeping bans on addictive design features as part of efforts to protect children online.

MEPs backed the non-legislative report on Wednesday by 483 votes to 92, with 86 abstentions, citing growing alarm over the physical and mental health damage children face from manipulative platform strategies that fuel addiction and impair concentration.

Parliament wants a harmonised EU-wide digital age threshold of 16 for social media, video-sharing platforms and AI companions, though 13- to 16-year-olds could still access these services with parental consent.

MEPs welcomed the Commission’s plans for an EU age verification app and European digital identity wallet, but stressed that age checks must protect children’s privacy. Platforms cannot dodge their duty to make products safe and age-appropriate by design, they added.

To push compliance with the Digital Services Act and related legislation, Parliament suggested holding senior managers personally liable for serious and repeated violations, particularly on child protection and age verification.

MEPs want the Commission to ban the most harmful addictive practices and switch off other addictive features by default for children, including infinite scrolling, autoplay, pull-to-refresh, reward loops and harmful gamification. They also called for action against persuasive technologies such as targeted advertising, influencer marketing and dark patterns under the planned Digital Fairness Act.

Other demands include blocking engagement-based recommendation systems for minors, applying DSA rules to online video platforms, and outlawing loot boxes and randomised gaming features such as in-app currencies and pay-to-progress mechanics.

Parliament urged protection against commercial exploitation, including a ban on platforms offering financial rewards for kidfluencing, where children act as influencers. MEPs also pressed for urgent action on generative AI risks including deepfakes, companionship chatbots and AI-powered apps that create non-consensual manipulated images.

Rapporteur Christel Schaldemose (S&D, Denmark) said during the debate: “I am proud of this parliament, that we can stand together in protecting minors online. Together with strong, consistent enforcement of the Digital Services Act, these measures will dramatically raise the level of protection for children. We are finally drawing a line. We are saying clearly to platforms: your services are not designed for children. And the experiment ends here.”

Research cited in the report shows 97% of young people go online daily and 78% of 13- to 17-year-olds check their devices at least hourly. One in four minors displays problematic or dysfunctional smartphone use that mirrors addiction patterns.

The 2025 Eurobarometer found over 90% of Europeans view action to protect children online as urgent, particularly regarding social media’s impact on mental health (93%), cyberbullying (92%) and restricting age-inappropriate content (92%).

Member states have begun introducing measures such as age limits and verification systems.