EU ministers gathered on Monday to discuss how to better protect young people online – from misinformation and addiction to digital harassment. While not all proposals are likely to make it into law, the message was clear: the EU wants to get tougher on tech when it comes to kids. 

Phones out of the classroom?

One of the suggestions was a full ban on mobile phones in schools. Several EU countries have already gone down that path: France, the Netherlands, Italy, some Spanish regions, and more recently, Luxembourg, have introduced school-wide bans on phones during the day. 

Speaking to Euronews in December, Ben Carter, professor of medical statistics at King’s College London, said: “Nobody has the answer to whether banning them in schools is a good thing or a bad thing.” 

Some governments want to go further. France has proposed a Europe-wide ban on social media for under-15s. “In the absence of a European agreement, France will have to take action,” said Clara Chappaz, France’s minister for artificial intelligence. She added that she would try to “rally a coalition, with Spain, Greece, and now Ireland, to convince the European Commission.” 

French President Emmanuel Macron has also backed tighter rules, calling last year for a ban on smartphones for children under the age of 11 and tougher age verification on social media platforms. 

Spain pushes for digital ID checks

Spain’s Secretary of State for Youth and Children, Rubén Pérez Correa, stressed the need for more robust online age checks. He highlighted a new child protection law under discussion in the Spanish parliament, which would expand the use of the Cartera Digital – a national digital ID wallet app already used to access adult sites – to verify that users are over 18 before they can watch YouTube videos or create an account on social media. 

Spain is also calling for stronger parental control systems to be built into platforms by default. 

The EU’s age-check problem

But so far, there’s no reliable and privacy-friendly system in place to verify users’ ages online. Meta – the parent company of Facebook and Instagram – says responsibility should rest with app stores, calling for checks to be carried out at that level instead. 

Several EU laws already require platforms to check the age of their users. The Digital Services Act (DSA) and the Audiovisual Media Services Directive (AVMSD) include provisions to shield children from harmful content, while the General Data Protection Regulation (GDPR) covers data privacy for minors. The proposed Child Sexual Abuse Material (CSAM) regulation, still under negotiation in the Council of the EU, would also require effective age identification to shield children against predators. 

“For some movement to be done, we need the European Commission and cooperation. Single countries will not be able to put effective age limitation [in place], whereas together it can be done,” Poland’s Education Minister Barbara Nowacka told Euronews. “It is possible, but it is of course a long-term project.” 

EU Commission: child safety is a priority

Youth organisations invited to the Council’s table urged the EU to better enforce existing laws – such as the DSA, the AVMSD and the “Better Internet for Kids” strategy – which already contain tools to protect younger users but are often under-implemented. 

European Commissioner for Intergenerational Fairness, Youth, Culture and Sport, Glenn Micallef, told Euronews that several new initiatives are in the pipeline, including EU-wide guidelines on child protection, an action plan against cyberbullying, and a new study into the mental health impact of social media – due to be published before the end of 2025. 
