The legal battle between Australia’s online safety regulator and Elon Musk’s X is shaping up to be the first real test of the power governments can wield over tech platforms, not just within Australia but globally.
There have been skirmishes between Musk and eSafety Commissioner Julie Inman Grant before. But tensions rose following the stabbing of Bishop Mar Mari Emmanuel on April 15 while he was giving a livestreamed service at the Assyrian Church of Christ the Good Shepherd in the Sydney suburb of Wakeley.
The next day, X was ordered to delete 65 tweets containing video of the knife attack. When the company instead chose to hide the posts only from Australian users, the commissioner launched an urgent court case seeking an injunction. The federal court then ordered X to hide the posts from users around the world, pending a hearing on May 10.
Whether or not the order is upheld, the case represents a further test of the regulator’s ability to force multinational technology companies to comply with Australian law.
“This is really the first real test of the powers of the eSafety commissioner,” says Australian Lawyers Alliance spokesperson Greg Barns. “The powers are quite substantial in the sense that she can order the removal of material, with the ability to impose quite significant daily fines.
“However, I think the limitation is also going to be tested because the difficulty is that making orders that are intended to have a global effect requires the cooperation of other countries.”
‘X poses a challenge’
The idea of an eSafety commissioner first emerged in 2013, when the conservative Coalition government came to power, as a way to tackle the online bullying of children.
Shortly after its launch in 2015, eSafety’s powers were expanded to include image-based abuse and, with the passage of the Online Safety Act in 2021, adult harassment.
At the time, critics warned that giving eSafety additional powers to regulate content under Australia’s classification code (something that was written in the days of VCRs and is still under review) could have wide ramifications for freedom of expression and what people can see online.
But for the most part, eSafety’s use of power hasn’t generated much controversy.
Of the 33,000 reports the office received about potentially illegal content in the last financial year, the eSafety commissioner passed on about half as informal requests for platforms to remove URLs. According to the commissioner’s office, 99% of those requests related to child sexual abuse material. Only three formal warnings related to violent content were issued that year.
The regulator has also used its more graduated powers to remove URLs from search results, but has never used its power to remove an app from app stores.
Most of the time, platforms comply or eSafety does not pursue the matter. At least, that was the case until Musk took over X in late 2022.
Digital Rights Watch president Lizzie O’Shea says eSafety has relied heavily on cooperation with technology platforms, but the recent dispute with X shows that dynamic has changed, as far as regulators are concerned.
“The eSafety commissioner has a difficult and important job, but her powers are also limited and must be balanced against other concerns, whether practical or related to human rights.”
eSafety and X are now involved in at least three legal cases in Australia over notices issued to the company. The case over the stabbing video notice is expected to be the first to be heard, later this month.
‘A legitimate debate’
While the eSafety commissioner’s office argues that the use of its powers in this case is intended to “ensure that Australians are protected where possible from extremely violent and other Class 1 material”, the commissioner’s choice of the stabbing video as a hill to die on has raised some eyebrows.
Emmanuel himself has argued that the video should remain online. ABC’s Media Watch and News Corp columnist Andrew Bolt found a rare moment of unity in arguing that the video is not as bad as many others still available online. Opposition leader Peter Dutton has argued that seeking a global ban goes too far.
Australian federal police, in a sworn statement to the federal court, said the video could be used to recruit people for terrorist groups or carry out terrorist acts.
Alastair MacGibbon, who was Julie Inman Grant’s predecessor in the role and is now chief strategy officer at CyberCX, says he understands both sides of the argument, but the fact that police are describing it as a terrorist act changes the impact of the video.
“It has become a legitimate debate about whether a potentially inflammatory video should be circulated, because it could lead others to commit acts of violence against certain parts of our community,” he says.
“That’s what community is about. In reality, sometimes it involves restricting broader rights to avoid [violence] against others… They deserve to be free of violence, as much as I deserve the freedom to watch a video.”
Barns says the office has to be very judicious when issuing removal orders, and this particular video likely caught the attention of eSafety because of the extensive media coverage. He points to the WikiLeaks “collateral murder” video as something the world needed to see, along with the footage from Gaza that has galvanized public opinion.
“It is a difficult exercise in terms of the exercise of freedom of expression and the right to know, on the one hand, and, on the other, gratuitous acts of violence that cannot serve any useful purpose and, in fact, can become a negative force in society.”
Julia Powles, associate professor of law and technology at the University of Western Australia, says “the regulator is too focused on removing singular content.”
“To regain public trust, it is necessary to take a systemic, victim-informed approach to the structural factors of online hate and abuse.”
Technical solutions, social problems
On Monday, Communications Minister Michelle Rowland announced the government was seeking feedback on whether the office’s enforcement powers and sanctions are fit for purpose, as part of the review of the Online Safety Act.
Two days later, it announced $6.5 million in additional funding for the eSafety commissioner to trial age assurance technology for social media and adult websites.
Both O’Shea and Powles say they are concerned about eSafety’s focus on private communications, particularly when it comes to proposed standards for detecting child sexual abuse material and terrorist material in end-to-end encrypted communications.
“The government focuses a lot on technical solutions to these problems, when often they are social in nature and require political leadership,” says O’Shea.
Barns says the government should wait for the outcome of the X case before making changes to the powers of the eSafety commissioner.
“It potentially provides an opportunity for courts to explore the scope of the legislation and the way in which the powers can and cannot be used,” he says. “And it might be wise to wait for a decision that can explore those areas before rushing to pass more laws.”