“I never falsely suggested anything,” Simeon Boikov tells me.
Under the alter ego “Aussie Cossack”, he posted untrue speculation that a 20-year-old Jewish university student was the attacker who had stabbed and killed five women and one man at a Westfield shopping centre in Sydney.
He said on X: “Unconfirmed reports identify the Bondi attacker as Benjamin Cohen. Cohen? Really? And to think so many commentators tried to initially blame Muslims.”
The actual attacker, shot dead by police, was later identified as Joel Cauchi, 40. The authorities say his actions were most probably related to his mental health.
Within hours of his post, the false claims Mr Boikov amplified had reached hundreds of thousands of people on X and Telegram, and had even been repeated by a national news outlet.
I tracked him down because I want to understand how his posts triggered an online frenzy that reached the mainstream media – with serious consequences for Mr Cohen, who’s described his distress at being accused of an attack he had nothing to do with.
Mr Boikov is speaking to me from the Russian consulate in Sydney, where he fled more than a year ago after a warrant was issued for his arrest over an alleged assault. The pro-Kremlin social media personality was granted Russian citizenship by Russian President Vladimir Putin last year – and has requested political asylum in Russia.
He wasn’t the first user to mention the name Benjamin Cohen. It appeared to originate on a small account sharing almost exclusively anti-Israel content.
This is one of the ways disinformation now spreads.
Digital disinformation expert Marc Owen-Jones says: “It’s less obvious and suspicious than if an influential and known partisan account was to initially tweet it.
“Then more established accounts can use this ‘seeded’ narrative as if it’s a legitimate vox pop, and claim they are just ‘reporting’ what’s being said online.”
Other, larger accounts had also suggested the attack was somehow connected to Israel or Gaza – before Aussie Cossack’s posts on X.
But his were the first featuring Mr Cohen’s name to go viral.
That’s likely because he had purchased a blue tick, meaning his content was prioritised ahead of posts from other users and appeared higher up in people’s feeds – including the feeds of users who did not follow him.
The initial post racked up more than 400,000 views, according to X’s own data – before police identified the attacker as Cauchi, not Benjamin Cohen. Aussie Cossack followed up with another post on X placing video footage of the actual attacker, Cauchi, side by side with a picture of Mr Cohen.
On Telegram, he also posted a screengrab of Mr Cohen’s LinkedIn page, revealing where he worked and studied.
But speaking to me, Mr Boikov stresses the scepticism in his tweet – he says he was the “first large platform to warn this is unconfirmed”.
He suggests he pointed out the unconfirmed nature of the claim to “the hundreds of thousands of people who saw my posts”.
However, comments from many users responding to his posts suggest they read it the opposite way, and assumed Mr Cohen was the attacker.
I challenged Mr Boikov on how his posts had amplified false claims to hundreds of thousands of people, causing serious harm to the student at the centre of them. This came as families were – and still are – grieving for loved ones killed in the attack.
“Sorry, love, you’re doing that right now,” he said. “What you’re doing now is you are talking about the speculation of a false claim, and you’re writing a story about it.”
Mr Boikov’s is one of hundreds of very active accounts on X with blue ticks now regularly sharing content in this way – whether or not it’s true.
Under X’s new guidelines – since Elon Musk bought the social media company – users can receive a “share of the revenue” generated by ads from their posts, if they purchase a blue tick.
Aussie Cossack’s posts were picked up and re-circulated by dozens of other accounts, including several with a track record of sharing false claims. Several regularly share content critical of Israel or content relating to the war in Gaza.
These false accusations soon bled on to other social media platforms.
“Benjamin Cohen” was the search term TikTok suggested on several videos of the attack when I was looking through content related to the stabbing on Saturday night.
Scrolling through these clips, I found the comments were littered with his name before the police had confirmed the real identity of the attacker.
“The attacker’s name is Benjamin Cohen IDF Soldier,” one user wrote. Their account had no posts, and no profile picture. I sent a message. No response.
“Shame he’s a Jew right? Why don’t the media outlets label him?” another account wrote on a video showing people running through the mall. As soon as I messaged this one asking about its comments, it blocked me.
Repeated by news channel
It’s hard to confirm definitively where these accounts are based. They have the hallmarks of inauthentic profiles, without any identifying features and sharing divisive comments repeatedly.
X, Telegram and TikTok have not yet replied to the BBC’s requests for comment.
Worryingly, the speculation was picked up by Australian media outlet 7News, which named Benjamin Cohen as the “40-year-old lone wolf attacker”. Screengrabs of their report further fuelled the wildfire online.
7News later retracted the report and apologised, attributing it to “human error”.
But by this point, antisemitic threats were being directed at Benjamin Cohen, who has described the incident as “highly distressing and disappointing to myself and my family”. He has expressed shock not only that he was falsely accused repeatedly on social media, but that even a major news network had identified him.
While the social media frenzy was unfolding, his dad Mark Cohen defended his son on X. He called on New South Wales Police to reveal the name of the attacker “before this nonsense claiming it was my son causes more harm”.
In parallel, false claims were circulating that the attacker was Muslim. These were shared by prominent journalists and political accounts on X with hundreds of thousands of followers from the UK to the US.
British journalist and presenter Julia Hartley-Brewer suggested the stabbings were “another terror attack by another Islamist terrorist”, while TV presenter Rachel Riley said it was part of a “Global Intifada”. They both later retracted their posts.
Hartley-Brewer posted that she had been “incorrect” and that the Sydney massacre “was not an Islamist terror attack”, while Riley said she was “sorry” if her message had been “misunderstood”.
Dozens of accounts on TikTok also spread false claims that the attacker was Muslim. I messaged several of them – but they haven’t responded.
New South Wales Police have suggested the real attacker, Cauchi, deliberately targeted women – who make up five of the six victims.
Several online forums dedicated to incels – a subculture who define themselves as unable to get a sexual partner, despite desiring one – have praised Cauchi as one of their own for the attack.
But so far there’s no concrete evidence of Cauchi being directly involved with these online movements. When asked why Cauchi might have targeted women, his father said that his son had “wanted a girlfriend” and that he had “no social skills and was frustrated out of his brain”.
Increasingly, attacks in the real world are being followed by this kind of social media frenzy – where misinformation is hugely amplified.
For the families and friends of those who have been killed, and the innocent bystanders falsely accused, this toxic rumour-mill is causing serious harm.