The New Law
From 10 December 2025, Australia’s new law will prohibit individuals under the age of 16 from holding accounts on specified social media platforms. Enforced under the Online Safety Act, the legislation is designed to safeguard children’s health and wellbeing by reducing their exposure to online risks such as inappropriate content, cyberbullying, and algorithm-driven features that can harm mental health.
Platforms covered by the law include Facebook, Instagram, TikTok, Snapchat, X, Reddit, Kick, Threads, and YouTube, with significant penalties for those that fail to comply.
How the Rules Will Work
Age-restricted platforms must show that they have taken ‘reasonable steps’ to prevent under-16s from holding accounts, typically through some form of age assurance, such as age verification or age estimation technology. Meta, which operates several of the most popular age-restricted platforms, has not revealed exactly how it will identify under-16 users, so as not to give minors ways to bypass the rules.
The law targets platforms rather than children, so legal responsibility falls on the companies, not on parents or minors. Under-16s will not be able to hold accounts on age-restricted platforms, including accounts they already have, although they can still view content while logged out.
Some services are exempt from the minimum age requirement, including:
- Messaging, email, voice, or video calling apps;
- Online games;
- Services providing product or service information;
- Professional networking and development platforms; or
- Education and health services.
These exemptions aim to protect children from social media harms while maintaining access to essential communication, learning and health services.
Helping Your Children Stay Safe
Even with the law in place, parents and guardians remain essential in keeping children safe online. Practical steps include:
- Talking openly with children about digital safety and online behaviour;
- Using parental controls or monitoring tools where appropriate; and
- Encouraging offline activities to reduce time spent on social media.
The law provides a safety net, but parental guidance is key to developing responsible social media habits.
Penalties for Non-Compliance
Platforms that breach the minimum age rules face significant penalties:
- Individual providers: up to 30,000 penalty units (approx. A$9.9 million)
- Corporate bodies: up to 150,000 penalty units (approx. A$49.5 million)
These penalties are deliberately large, reflecting the seriousness of the harms the social media minimum-age requirement is intended to guard against.
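For context, the dollar figures above follow from multiplying the number of penalty units by the value of a Commonwealth penalty unit, assumed here to be A$330 (the rate in force when the law was passed):

\[
30{,}000 \times A\$330 = A\$9.9\ \text{million}
\qquad
150{,}000 \times A\$330 = A\$49.5\ \text{million}
\]

If the penalty unit value is indexed upwards in future, the maximum dollar amounts will rise accordingly.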
Potential Outcomes
If enforced effectively, the law could reduce children’s exposure to harmful content and encourage safer online habits. It may also set an international precedent for proactive digital safety regulation. However, unintended consequences are possible, such as minors moving to less-regulated platforms or privacy concerns arising from verification methods.
Final Thoughts
Australia’s social media minimum-age ban is a big step. Whether it works will depend on how well the rules are enforced, how responsibly platforms handle compliance and how involved parents stay in guiding their kids online. For everyone, from parents to policymakers to legal professionals, it is a chance to rethink how we keep children safe while they navigate the digital world.

