Measures to block under-16s from accessing popular social media platforms are proving to be controversial and, so far, imperfect.
On the face of it, Australia’s under-16 social media ban has proved popular with voters. Some 77 per cent of Australians supported the ban in a YouGov poll in November 2024, while another poll by media and marketing specialist B&T taken two months later registered 68 per cent parental support. Teenagers understandably don’t favour it, but in a survey by consumer insights consultancy Nature, two-thirds acknowledged the toxic nature of certain social media platforms.
At its worst, the internet can present as a veritable cesspit of cyberbullying, rape threats, grooming by pedophiles, sex-related extortion, deepfake image-based abuse, misinformation and, more recently, unhealthy attachments to AI chatbots. For its part, the Federal Government has chosen to focus on selected sites that are particularly attractive to millions of young Australians: Facebook, Instagram, X, YouTube, TikTok and Snapchat.
In practice, the implementation of the ban has been a convoluted process. More than 50 companies took part in a trial that involved either verifying users’ age or providing an assurance of a user’s age. The verification options included government ID checks, such as passports and licences, mobile carrier data, and biometric methods. These ranged from card-based checks to standalone biometrics that could be used without storing data, including face recognition, voice analysis and hand-movement analysis, all of which are used to estimate a user’s age.
The Age Assurance Technology Trial’s preliminary report was released in June and its general conclusion was encouraging enough. “Age assurance can be done in Australia and can be private, robust and effective,” it said. But a significant issue emerged with the subsequent resignation of a member of the trial’s overseeing panel. Tim Levy, managing director of Qoria, one of about a dozen Australian companies taking part, questioned the trial’s independence.
Some of the technologies tested had a reported 85 per cent error rate in estimating the age of users, requiring “18-month tolerances” and possibly multiple age-assurance steps. Levy says: “These results were predicted and echo sentiments expressed by Qoria and the parental control, school safety and social media platforms last year.”

According to him, UK-based Age Check Certification Scheme (ACCS), which oversaw the trial, has created an age assurance industry. “They’ve set the standards for age assurance and then they’ve gone and tested it,” he says. “They wanted to prove that it works, because that’s their business.”
Levy is not confident age verification alone can be effective. He believes the government’s focus on a handful of popular platforms will protect children from only one corner of the internet, and that mobile device management systems that can filter content across the internet would achieve a better outcome.
“The power of these tools that businesses and big schools, mostly in the US, have, and private schools in Australia have, is astounding,” Levy continues. “They can provide a safe, age-appropriate experience across the entirety of the internet, just not on the six social media platforms.”
Some, such as the Age Verification Providers Association, a global trade body for providers of age assurance technologies, believe there are simpler methods than biometrics for accurate assessments. “If anyone tries to claim they have devised an algorithm that can accurately assess your age within a day, a week or even a month of your real age based solely on a few selfies, they are not being truthful,” the association said in a media release in June.
Simpler methods could include utilising GovID or one of the ID verification technologies being developed by banks, telcos and other service providers. Another suggestion is for users to obtain an encrypted QR code from their state’s office of Births, Deaths and Marriages to be provided as proof of age to social media platforms.
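The encrypted QR-code suggestion boils down to a familiar pattern: a trusted registry signs a minimal "over 16" claim, and a platform verifies the signature without ever seeing a date of birth. The sketch below is purely illustrative, not any real registry's scheme; the function names and the shared demo key are assumptions, and a production system would use asymmetric keys rather than a shared secret.

```python
import base64
import hashlib
import hmac
import json

# Illustrative only: a real scheme would use the registry's public/private
# key pair, not a secret shared with platforms.
REGISTRY_KEY = b"registry-demo-key"

def issue_proof(user_id: str, over_16: bool) -> str:
    """Registry side: sign a minimal yes/no claim and encode it for a QR code."""
    claim = json.dumps({"sub": user_id, "over_16": over_16}).encode()
    sig = hmac.new(REGISTRY_KEY, claim, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(claim).decode()
            + "." + base64.urlsafe_b64encode(sig).decode())

def verify_proof(token: str) -> bool:
    """Platform side: check the signature, then read only the boolean claim."""
    claim_b64, _, sig_b64 = token.partition(".")
    claim = base64.urlsafe_b64decode(claim_b64)
    sig = base64.urlsafe_b64decode(sig_b64)
    expected = hmac.new(REGISTRY_KEY, claim, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    return bool(json.loads(claim)["over_16"])

token = issue_proof("user-123", True)
print(verify_proof(token))  # True
```

The point of the design is that the platform learns a single bit — over 16 or not — while the date of birth stays with the registry.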
The Federal Government’s COVIDSafe app was a prime case of sticking too long to an advanced but flawed technology solution, as low-tech QR verification ultimately proved more practical for tracking infections during the pandemic.
Ric Richardson, the Australian inventor who patented anti-piracy software for product activations, recommends taking government checks out of the equation altogether. Instead, he says, parents can add their children’s phone numbers to a blacklist. His SafeGen system would require social media companies to check this register when performing verifications. “Parents know exactly how old their kids are, and they know exactly what devices their kids use,” he says.
About a dozen Australian firms, including Sydney-based Australian Payments Plus, which showcased its established identity verification tool ConnectID, were among the 53 in the Age Assurance Technology Trial. “When a business requests age verification from their customer, ConnectID allows that user to choose a trusted entity,” says Andrew Black, the company’s managing director. “For example, their bank can confirm whether the user is over the required age, returning a simple ‘yes’ or ‘no’ response without needing to share date of birth with either the business or ConnectID.”
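The yes/no flow Black describes can be sketched as follows: the trusted entity holds the date of birth and answers only a boolean age question, so neither the business nor the intermediary ever sees it. The class and record names are illustrative assumptions, not ConnectID's real API.

```python
from datetime import date

# Hedged sketch of a "trusted entity" (e.g. a bank) answering an age check.
# Names and data are illustrative, not ConnectID's actual interface.

class TrustedEntity:
    def __init__(self, records: dict[str, date]) -> None:
        # customer id -> date of birth; this data never leaves the bank
        self._records = records

    def is_over(self, customer_id: str, min_age: int, today: date) -> bool:
        """Return only yes/no, never the date of birth itself."""
        dob = self._records[customer_id]
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        return age >= min_age

bank = TrustedEntity({"cust-1": date(2011, 5, 20), "cust-2": date(2000, 1, 1)})
today = date(2025, 10, 10)
print(bank.is_over("cust-1", 16, today))  # False (14 years old)
print(bank.is_over("cust-2", 16, today))  # True (25 years old)
```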

Other Australian companies in the trial included FrankieOne, Deep Media, One Click Group, R2 Labs, RightCrowd, ShareRing and TomorrowX. FrankieOne co-founder and chief technology officer Aaron Chipper says his company has aggregated more than 350 verification tools. “We can bring those different technologies together for the social media companies to be able to pick and choose what’s most appropriate for the customer,” he says.
Technology aside, there is also a risk that labelling the initiative as a ban will trigger FOMO (fear of missing out) in young teens, who may seek to circumvent age verification via alternative strategies, including using a VPN or registering accounts in other countries. Australia’s eSafety Commissioner Julie Inman Grant is aware of this. At her National Press Club address in June, she reframed the ban as more of a delaying tactic, offering a timeout from the internet’s most toxic platforms during which teens can be taught to identify and avoid dangers, including misinformation.
“Calling it a ban misunderstands its core purpose and the opportunity it presents,” she said. “We are not building a great Australian firewall … it may be more accurate to frame this as a social media delay, giving children a reprieve from the persuasive pull of platforms engineered to keep them digitally entranced – and entrenched.”
The education options for under-16 teens, however, have not received the same analysis and debate as the technology proposed for age assurance. They include a curriculum packaged by the Australian Curriculum, Assessment and Reporting Authority (ACARA) that teaches young teens about the dangers of online platforms, while the eSafety Commission has also compiled a website dedicated to online safety.
Others want to see measures beyond banning under-16s from social media platforms. Australian Martin Dougiamas is the founder of Moodle, a free open-source learning management platform he says is used by two-thirds of the world’s universities; organisations download and run it on their own servers. He says educating children to navigate toxic forms of social media is like “teaching them to run safely down the middle of a highway”.
“We’re not actually attacking the real problem, which is that the platforms themselves are causing the problems,” he says. “The fact is they’re owned and controlled by profit-focused companies who are using advertising models. I would rather see the government put energy into Australia building its own infrastructure. Anybody in Australia can make new platforms on the internet. It’s something we just need to decide we can do, and that we’re not a second-class citizen in the world that relies on American companies to provide our infrastructure.”
Western Australian-developed DiGii Social, New Zealand’s MyMahi and the Germany-based federated network Mastodon offer social media alternatives where standards can be set and enforced locally.
The ban, however, is far from the only shot in Inman Grant’s locker. She has worked with tech companies on an online code to limit children’s access to pornography, violent content, suicide and self-harm themes and sites that encourage disordered eating. Her office is driving a measure that will prompt an age assurance check for anyone logging into a search engine account, and the commission has also formed a youth council to liaise with teachers, parents, carers and young people. Yet still, it’s the proposed social media ban that continues to capture the most headlines.
Originally published in The List, Innovators 2025, The Australian, October 10, 2025
