Australia on Wednesday will begin implementing the world's first nationwide ban on social media accounts for children under 16, preventing them from using or maintaining profiles on major platforms such as Facebook, Instagram, X, TikTok, YouTube and Snapchat.
Describing the move as "a social media delay," the Australian government says it aims to shield minors from risks linked to online platforms, including cyberbullying, harmful content, grooming and predatory behavior.
"The aim of the new law is to protect young people from design features that encourage them to spend too much time on screens and show them content that can be harmful to their health and wellbeing," Australia's eSafety commissioner said on its website.
The ban officially begins on Dec. 10.
Hundreds of thousands of Australian teens are already active on social media, with reportedly about 440,000 aged 13-15 on Snapchat, roughly 350,000 on Instagram, 325,000 on YouTube and more than 200,000 on TikTok.
The measures fall under the Online Safety Amendment (Social Media Minimum Age) Act 2024, which requires age-restricted social media platforms to "take reasonable steps to prevent Australians under the age of 16 from creating or keeping an account," the government says.
Australia's online safety regulator, eSafety, lists Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, Twitch, X and YouTube as restricted platforms.
Messaging services such as WhatsApp and Messenger are not covered by the ban. Other platforms and tools, including Discord, GitHub, Google Classroom, LEGO Play, Pinterest, Steam, Steam Chat and YouTube Kids, are also excluded.
Despite criticism from parents and digital rights groups about the harms of online gaming platforms such as Roblox, those services remain outside the scope of the ban. The regulator notes that online games and "services that have the sole or primary purpose of messaging, email, voice calling or video calling" are exempt.
The government says restricted platforms will be required to identify and remove accounts held by under-16s and block loopholes that allow minors to circumvent the rules.
Although no specific method is mandated, the government has encouraged platforms to use "age assurance technologies" based on trial findings released in August.
Some steps under consideration include: requesting official ID verification, with alternative options for users unwilling to share sensitive documents; age-inference tools that analyze user behavior and data patterns; facial-assessment or voice-recognition verification that requires a photo, video or audio sample; and measures to prevent under-16s from accessing platforms via VPNs.
Under the new law, penalties target platforms rather than children or their parents. Non-compliant platforms face fines of up to 49.5 million Australian dollars ($33 million).
eSafety says it "will monitor compliance and enforce the law" using its regulatory powers under the Online Safety Act.
"Age-restricted platforms are also expected to give users who are under 16 information about how they can download their account information in a simple and seamless way prior to account deactivation or removal, or request access to their information within a reasonable period after account deactivation," eSafety said.
"The information should be provided in a format that is easily accessible. Platforms should consider formats that could allow end-users to transfer their information and content to other services, or to upload the information on the same platform if they sign up again after turning 16," it added.
Meta announced that it has begun removing accounts of Australian users under 16 from Instagram, Facebook and Threads. According to the BBC, users aged 13-15 were informed their accounts would start being deactivated from Dec. 4.
The company estimates that 150,000 Facebook accounts and 350,000 Instagram accounts will be affected. Because Threads is accessed via Instagram, those accounts will also be impacted.
Meta has suggested that the government require app stores to verify age at download and obtain parental approval for under-16s, rather than placing the burden solely on platforms.
Snapchat, meanwhile, said it will rely on inference signals for user ages and apply age-assurance tools to verify them.
YouTube has said that it will comply with the under-16s social media ban and keep underage users locked out of accounts, but its parent company, Google, warned that the new laws "won't keep teens safer online."