As social media profoundly shapes the behavior and interactions of young people worldwide, a growing number of governments are questioning the minimum age for accessing these platforms. Balancing child protection, privacy, and digital freedom, the rules vary considerably from one country to another. Here's an overview of the policies implemented internationally.
Age requirements vary by continent
In the United States, federal legislation known as COPPA (the Children's Online Privacy Protection Act) prohibits companies from collecting personal data from children under 13 without parental consent. As a result, most platforms—TikTok, Instagram, Snapchat—set their minimum age at 13. This limit, however, is often circumvented, largely because identity verification is not systematic.
In Asia, several countries are adopting a stricter approach. In China, minors must undergo mandatory identity verification. Since 2021, authorities have also imposed screen time restrictions, notably through "anti-addiction" systems on video apps. In South Korea, the law requires parental consent for those under 14 to register for an online service.
Europe: between harmonization and diversity
Since 2018, the General Data Protection Regulation (GDPR) has set the age of digital consent at 16 by default, while allowing Member States to lower it to no less than 13. The result is a patchwork of thresholds:
- Germany, Ireland, Netherlands: 16 years.
- Italy, Spain: 14 years.
- France: 15 years. French law requires parental consent for users under 15, and a recent legislative proposal would ban access to social media entirely below that age.
- United Kingdom: 13 years, in line with the de facto standard applied by most platforms. The country has also implemented the Age Appropriate Design Code, which requires platforms to adapt their services for minors.
This diversity within the European Union reflects the difficulties of harmonization, despite the existence of a common framework.
Australia moves toward mandatory age verification for under-16s
In Australia, the official minimum age for using social media platforms remains 13, in accordance with the terms and conditions of TikTok, Meta, and Snapchat. In 2023, however, the Australian government launched a public consultation on mandatory age verification for social media access, with the aim of raising the minimum age to 16. The proposal is part of a broader reform of online protections for minors and draws on studies documenting the harmful effects of early exposure to social media.
A global debate on mental health and platform responsibility
Numerous scientific studies have established links between intensive social media use among teenagers and an increase in anxiety, depression, and low self-esteem. These findings are generating growing concern worldwide. In response, several governments are seeking to strengthen legislation, notably by requiring age verification or increasing the transparency of algorithms. Meanwhile, the platforms are developing tools such as parental controls, screen time limits, and "teen modes," but are struggling to guarantee their effectiveness in the face of potential circumvention.
In summary, for 13- to 16-year-olds, access thresholds for social media vary considerably from country to country. A global trend is emerging: a strengthening of mechanisms to protect minors. France, by considering a total ban for those under 15, is following an international trend that places young people's mental health and digital safety at the heart of public debate. It remains to be seen whether future legislation will manage to reconcile effectiveness, respect for digital rights, and technical feasibility.
