Australia social media ban sets up fight with big tech over children’s safety


2025-12-08 22:28:38

Lily Jamali, North America technology correspondent, San Francisco, and Tiffany Turnbull, Sydney

A group of five tech CEOs is sworn in at a US Senate Judiciary Committee hearing (Getty Images)

When Stephen Schiller became head of Facebook in Australia in early 2010, he was a true believer in the power of the internet and social media for the greater good.

It would herald a new era of global connectivity and the democratization of learning, he thought, and allow users to build their own public squares without traditional gatekeepers.

“There was a phase of extreme optimism when I first joined and I think a lot of the world shared that,” he told the BBC.

But by the time he left the company in 2017, the seeds of doubt had been planted about its work, and they have flowered ever since.

He believes that “there are a lot of good things about these platforms, but there are a lot of bad things.”

This is no longer an uncommon view as scrutiny of the largest social media companies grows around the world. Much of it has focused on teenagers, who have emerged as a lucrative market for incredibly wealthy global corporations – to the detriment of their mental health and well-being, according to critics.

Various governments, from Utah to the European Union, have experimented with limiting children’s use of social media. But the most radical move yet is set to unfold in Australia: a ban on under-16s, which has left tech companies scrambling.

Many of the affected social media companies have spent a year loudly protesting the new law, which requires them to take “reasonable steps” to prevent underage users from having accounts on their platforms.

They have claimed that this ban actually risks making children less safe, argued that it infringes on their rights, and repeatedly pointed to questions about the technology that would be used to enforce the policy.

“Australia is engaged in blanket censorship that will leave its young people less informed, less connected, and less equipped to navigate areas they are expected to understand as adults,” said Paul Tasci of NetChoice, a trade group that represents several major technology companies.

The concern within the industry is that Australia’s ban – the first of its kind – may inspire other countries.

“This could become a proof of concept that gains traction around the world,” says Nate Fast, a professor at the University of Southern California’s Marshall School of Business.

Whistleblowers, Lawsuits and Questions

Former Meta engineer Arturo Bejar speaks at a rally from a podium with a blue banner reading "Protecting children online"; the US Capitol is visible in the background (Getty Images)

In recent years, numerous whistleblowers and lawsuits have claimed that social media companies prioritize profits over user safety.

In January, a landmark trial will begin in the US to hear claims that several companies – including Meta, TikTok, Snapchat and YouTube – designed their apps to be addictive and deliberately covered up the harm their platforms were causing. All deny the allegations, but Meta founder Mark Zuckerberg and Snap chief executive Evan Spiegel have both been ordered to testify in person.

The case brings together hundreds of claims from parents and school districts, and is among the first to go to trial out of a torrent of similar lawsuits alleging that social media contributes to poor mental health and child exploitation.

In another ongoing case, prosecutors allege that Zuckerberg personally thwarted efforts to improve the well-being of teens on the company’s platforms, including vetoing a proposal to get rid of Instagram’s face-altering beauty filters, which experts say fuel body dysmorphia and eating disorders.

Former Meta employees Sarah Wynn-Williams, Frances Haugen and Arturo Bejar have testified before the US Congress about a range of irregularities they say they observed during their time at the company.

Meta maintains that it has worked hard to create tools that keep teens safe online.

Watch: What do teens think about social media bans in Australia?

But the broader industry has recently been taken to task for misinformation, disinformation, hate speech and violent content.

Footage of Charlie Kirk’s assassination went viral on various platforms, even reaching people who weren’t looking for it. Elon Musk has sued US states over laws requiring social media companies, including X, to determine and disclose how they combat online hate speech. And Meta came under fire earlier this year after announcing it was getting rid of the fact-checkers who monitor its platforms for misinformation.

A rare bipartisan front has emerged among US lawmakers eager to hold tech CEOs to account.

During a hearing last year, a senator urged Zuckerberg to apologize to bereaved families who had come to watch in person. Among those in attendance was Tammy Rodriguez, whose 11-year-old daughter Selena took her own life after being sexually exploited on Instagram and Snapchat.

“That’s why we’re investing so much and will continue to make industry-wide efforts to make sure no one has to go through the things your families went through,” Zuckerberg said.

Public scrutiny and private pressure

However, many experts, lawmakers, parents and even children feel that social media companies are shying away from real action and accountability on these issues.

While Australia’s social media ban was being considered and then drafted, companies had little to say publicly.

“Hiding from public discourse… only breeds more suspicion and more distrust,” says Schiller.

However, many were quietly seeking to bend the government’s ear. Spiegel sat down personally with Australian Communications Minister Annika Wells. Wells also claimed that YouTube sent world-famous children’s entertainers The Wiggles to lobby on its behalf.

In carefully worded public statements, many companies have tried to push responsibility elsewhere. Both Meta and Snap said the operators of the major app stores — namely Apple and Google — should take over age verification duties.

Many argued that the government was overstepping its bounds, saying parents know best and should decide what makes sense for their teens when it comes to social media use.

“While we are committed to meeting our legal obligations, we have consistently raised concerns about this law… There is a better way: legislation that enables parents to approve app downloads and age verification allows families – not the government – to decide which apps teens can access,” said a statement from Meta provided to the BBC.

Watch: Annika Wells says big tech companies won’t intimidate Australia over its social media ban

When asked why her government was unsympathetic to that reasoning — why anything short of a ban was unacceptable — Wells said tech companies had plenty of time to improve their practices.

“They’ve had 15 or 20 years in this business to do it of their own volition, and… it’s not enough.”

She says leaders in other countries feel the same way and have been knocking on her door for help, citing the European Union, Fiji, Greece and even Malta as examples.

Denmark and Norway have already started working on similar laws, and Singapore and Brazil are watching closely as well.

“We’re thrilled to be the first, we’re proud to be the first, and we stand ready to help any other jurisdiction that seeks to do these things,” Wells said.

Too little, too late?

As the Australian ban looms, increasing pressure has prompted companies to offer versions of their products marketed as safer for younger users, said Pinar Yildirim, a marketing professor at the University of Pennsylvania’s Wharton School.

Australia, after all, is a major market for social platforms. At parliamentary hearings in October, Snapchat said it believed it had about 440,000 users in the country aged between 13 and 15. TikTok said it has about 200,000 accounts held by under-16s, and Meta said it has about 450,000 such accounts across Facebook and Instagram.

Experts say the companies are also keen to ensure they do not lose ground in larger markets around the world.

In July, YouTube announced the rollout of artificial intelligence technology that estimates a user’s age in an effort to better identify and protect people under 18 from harmful content.

Snapchat has special accounts for teens that it says turn on safety and privacy settings by default for users aged 13 to 17.

Last year, Meta unveiled Instagram Teen Accounts that put users under 18 into more restrictive privacy and content settings, which Meta says are designed to limit unwanted contacts and exposure to explicit content. This development was accompanied by a massive marketing campaign in the United States.

“If they create a more protective environment for these users, the thinking is that might reduce some of the damage,” Yildirim said.

However, critics are not satisfied. Bejar, the Meta whistleblower, led a study published in September that found nearly two-thirds of the new safety tools on Instagram’s Teen Accounts were ineffective.

“The main issue here is that Meta and other social media companies are not substantively addressing the harm that we know teenagers are exposed to,” Bejar told the BBC.

A little girl with a ponytail in a pink sweater looks at a phone (Getty Images)

Companies have been forced to go on the defensive, trying to communicate that they are making a good faith effort to comply with the impending Australian ban despite not agreeing with it.

But analysts say the companies hope that obstacles – including legal challenges, children finding technical workarounds, and any unintended consequences of the ban – will strengthen the case against similar moves in other countries.

Professor Fast points out that companies “have a fair amount of influence over how smoothly things run”.

“[They] have an incentive to walk a very fine line in terms of compliance, but make sure they don’t comply so well that other countries say, ‘Great, this works. Let’s do the same,’” Mr Schiller agrees.

And the fines – a maximum of A$49.5 million ($33 million, £24.5 million) for serious violations – may be viewed as a cost of doing business, according to Carnegie Mellon University marketing professor Ari Lightman. “[They’re] a drop in the bucket,” he says, especially for larger players eager to secure the next generation of potential users.

Despite concerns about policy implementation, Schiller says he feels this is social media’s “seatbelt moment.”

“Some might argue that bad regulation is worse than no regulation, and sometimes that’s true, but I think in this case, even imperfect regulation is better than nothing, or better than what we had before,” he says.

“Maybe it works, maybe it doesn’t, but at least we’re trying something.”

Watch: Explaining the social media ban in Australia… in 60 seconds
