About 550,000 social media accounts have been blocked by Meta as part of Australia's new law intended to protect children from harmful online experiences. This significant action coincided with the enforcement of a ban that restricts Australians under 16 from accessing platforms like Instagram and Facebook.
Implemented in December, the law aims to shield minors from exposure to damaging content and algorithms. Australia's move has drawn worldwide attention from groups advocating for child safety online.
While companies such as Meta have expressed support for measures to safeguard young people, they advocate alternative strategies. In a recent blog post, Meta emphasized the need to work with the Australian government on better solutions rather than enforcing blanket bans.
During the first week of compliance, Meta reported blocking 330,639 accounts on Instagram, 173,497 on Facebook, and 39,916 on Threads. The company argues for app-store-level age verification to facilitate compliance and has proposed exemptions based on parental approval.
Australia’s initiative, one of the strictest globally, has earned high approval among parents but has also faced criticism. Some experts note that children could easily bypass the restrictions, whether through technological workarounds or by switching to less secure online alternatives.
Mental health advocates and children’s rights groups have raised concerns that such a ban may hinder young people’s social connections, particularly for those from diverse communities. As governments around the world consider similar restrictions, Australia’s refusal to allow parental-approval exemptions sets a new and rigorous precedent.


















