It’s my pleasure to rise and speak on what is a very important issue for all Australians, and that is the online safety of Australians. We all want to know that, when we go online, we are not going to be subjected to some of the behaviour that we’ve just heard in Senator Payman’s speech.
I want to start by making a couple of important corrections to the representations that Senator Payman has made. Firstly, I think it’s very important to understand that there is the criminal law in this country. Many of the terrible things that Senator Payman outlined—threats to kill, other threats to menace, deepfake porn and the like—already have very strong existing provisions in both our civil and criminal law.
Of course, starting with the criminal law, it is an offence to threaten to kill, to threaten to harm or to menace, to harass or to intimidate someone using a carriage service, and there are very significant consequences for that. It was never the intention of the parliament to enact the online safety framework and cut across the critical role that the criminal law plays in protecting people from those sorts of heinous acts, regardless of whether they are online, on the street or otherwise in the community.
I also want to briefly congratulate Senator Payman on her pregnancy; it’s very exciting that she’s about to give birth to a baby girl. I also want to reference what Senator Payman has gone through personally. She talks about what she endured after she crossed the floor, and I make the very brief point that I think a lot of what she endured was isolation and condemnation by fellow members of the Labor Party after she exercised her conscience, crossed the floor and was forced to leave the party, or was expelled. I want to put that on record.
I also want to make the very important point that the adult cyberabuse scheme in the Online Safety Act was designed to target serious, targeted harm, not lawful disagreement. The statutory test requires that the material be intended to cause serious harm: harm that is serious, not trivial, subjective or merely offensive. That high threshold was not accidental. It was a conscious safeguard to prevent regulatory overreach, so I want to flag it also in relation to the private senator’s bill. I want to flag that lowering the definition of adult cyberabuse risks converting a harm-based safety regime into, potentially, what I would say is a speech-policing mechanism. In relation to the first particular example that Senator Payman gave, concerning some of those heinous threats that were made, I am interested to understand the detail. There is no doubt that, where there is any sort of suggestion of criminal conduct or doxxing, the eSafety Commissioner does have very important powers to act, but it sounds to me, Senator Payman, that those are principally criminal matters. As I say, there are very important provisions in our criminal law to deal with that sort of conduct.
I do want to raise my concerns in the context of the full Federal Court decision which was handed down just a couple of weeks ago, where the full Federal Court ruled in favour of the children’s rights activist Celine Baumgarten, finding that the eSafety Commissioner had improperly issued a take-down notice to X seeking to remove a post where Ms Baumgarten had raised concerns about a queer club at a Melbourne primary school. This was a damning finding against the eSafety Commissioner, because what we know is that, in fact, the eSafety Commissioner has been using these so-called ‘informal notices’, writing to the online platforms, saying that these posts are not within the terms and conditions of the platforms’ operations and requesting take-downs of those posts. However, what the court has found is that those informal notices did actually constitute take-down notices and that they were, in fact, in breach of the Online Safety Act because they did not meet the threshold for adult cyberabuse. So what the eSafety Commissioner has been doing in hundreds of cases every year under the Adult Cyber Abuse Scheme is issuing notices that don’t comply with the law.
I do want to put on record that the eSafety Commissioner has now changed the way she goes about issuing those informal notices—after Celine Baumgarten took that matter to the Administrative Review Tribunal—making it very clear that there is no obligation for the platform to take any action and that it’s a voluntary scheme only. That’s in stark contrast to the sorts of notices that were previously being issued, which, of course, the full Federal Court found were unlawful. Those notices included a reference to section 7 of the Online Safety Act, suggesting very clearly that the conduct complained of was a breach of the Online Safety Act, specifically the cyberabuse scheme.
One of the recommendations in the Online Safety Act review is recommendation 14. That says:
For the avoidance of doubt, the legislation should make it clear that informal requests for takedown are legal and legitimate as they lead to quicker results for individuals who are often in severe distress.
Well, clearly, we’ve now had this full Federal Court decision, which makes it clear that you can’t turn something that’s unlawful into something that’s legal just by stating so. Recommendation 14 is, I would submit, a way of trying to circumvent the current law, which the parliament, as I say, has very deliberately crafted to ensure that the eSafety Commissioner has powers to act in limited cases.
What I’m concerned about in relation to the Celine Baumgarten case is that she has, in my view, raised very legitimate concerns about extreme gender activism at a primary school. She wrote in her post and took significant issue with the fact that the school, she felt, was indoctrinating young children aged between eight and 12 about radical gender ideology. She said:
Children should NOT be learning about sexualities at such a young, impressionable age. This is foul. Leave the kids ALONE.
She certainly did identify a particular teacher. This teacher had actually published some information about the queer club in a school newsletter, and this, of course, ended up online. My concern about lowering the threshold is that there is a real risk that, if we turn the Adult Cyber Abuse Scheme into one which prevents someone from being deeply offended, we are then getting into a whole new world of stifling free speech, and we know that this is a fundamental right for all Australians. I have to say that we are proud of the intention to combat some of the risks that we are seeing now online. As I say, the online safety of all Australians is incredibly important, but in recent times we have seen a number of decisions by the eSafety Commissioner that I would suggest are really a bridge too far. They are really stepping into the area of stifling free speech, and that is a fundamental right of every single Australian.
I want to refer to the evidence that the eSafety Commissioner gave in estimates when I was questioning her about this case. This, of course, was before the full Federal Court made this decision. In a statement that I released on 25 February I said that the eSafety Commissioner should clarify her evidence, because, when I asked about Ms Baumgarten’s post, at one point the eSafety Commissioner did actually say that she thought this was adult cyberabuse and then she corrected her evidence. So it was quite confusing, and then the general counsel for the eSafety Commissioner made it clear:
We didn’t see it as adult cyberabuse. That’s our assessment: it wasn’t adult cyberabuse.
We have to get the balance right. That is critically important. The Federal Court scrutinised the limits of the eSafety Commissioner’s powers under the act, and I think that what this case has illustrated is that we’ve got to be very careful to ensure there are objective legal standards. There’s a lot of constitutional sensitivity surrounding the regulation of online speech, given there is an implied freedom of political communication in our Constitution, and if that threshold were lowered, there are real concerns that, in a case such as that of Celine Baumgarten, concerns about freedom of speech would intensify.
There was another case, involving a person by the name of Chris Elston, known as Billboard Chris. That particular case also demonstrated overreach: when Mr Elston took the eSafety Commissioner on, the commissioner was again found to have exceeded her powers. This was back in 2023-24, when the eSafety Commissioner issued a removal notice that referred to an Australian transgender activist. The commissioner formed the view that the material constituted adult cyberabuse under the act, and the post was geoblocked in Australia following the notice. This was challenged in the Federal Court and, again, the decision was made that this material did not meet the statutory threshold of being intended to cause serious harm. That is the really key issue here. The act provides that there must be an intention to cause serious harm.
If we were to adopt the approach of Senator Payman in relation to conduct which causes deep offence, then we are straying into a very, very different world. Someone can say something with no intention to cause harm, but, if we change the act quite dramatically to capture that conduct and adopt many of the other recommendations in the online safety review, I think we are then facing a whole new world in terms of the government’s obligation to protect freedom of speech. I certainly think there is merit in looking at issues where perhaps there are some gaps in the law. Senator Payman identified that there was no room for the eSafety Commissioner to act because it involved an organisation, not a person. I’m not suggesting that there is not merit in perhaps having a look at some aspects of the Online Safety Act, but I am concerned that we are entering a whole new world where the eSafety Commissioner is using the Online Safety Act in a way that the parliament did not intend. That, as I said, is a bridge too far.
We’ve got to remember that serious offence is not necessarily harm. Democracies can be noisy. Conversations can be robust; they can often be uncomfortable. Political speech can be confronting, offensive, passionate, even harsh. But offence is not the same as cognisable harm. So, if subjective distress becomes the benchmark, almost any controversial view could be suppressed. This could have a chilling effect on public debate. We cannot lower the threshold to cross this bridge. It is a bridge too far.