A warning for parents: Here’s why I would never let my (future) kid use TikTok
I went viral(ish) on TikTok. I posted the video, went to bed, and woke to this disturbing reality.
I went viral (ish).
Well, at least a video I posted on TikTok of my mother did.
As someone who doesn’t use TikTok, I was shocked when a video of mum and me goofing around got tens of thousands of views within two days.
What started as a bit of fun with my 70-year-old mother attempting and failing at asking the equivalent of ChatGPT a question (there was a lot of nothing, a lot of ums and a lot of pauses), quickly turned into a bunch of incels attacking me and telling me – yes, you read it right – to go kill myself.
Huh?
Yeah, I’m just as confused as you are.
As a 33-year-old woman who considers herself relatively headstrong (on a good day) I was and still am shocked.
Not only did I receive dozens of comments from people I do not know and have never spoken to, mansplaining how something works that I can already be heard explaining in the video, but some also began direct messaging me with weird, aggressive messages.
After God knows how many messages and comments, I had had enough, so I did the worst thing possible: I engaged.
Yikes.
Big mistake.
It blew up. Young men and boys (or incels, in this case) banded together to gang up on me.
That’s when the comments of a picture of a man holding his hand as a gun to his head with the message “kill your b---- a-- up” started.
What? I’m so confused – I was sharing a funny video of my mum being utterly and unintentionally hilarious?
From sharing a funny video to sticking up for myself online, it escalated fast.
These incels really didn’t like that.
I reported and blocked one account that started privately messaging me, and then, hello, the same strange young man messaged me from a different account: “lol (laugh out loud) why’d you block me?”
Who the hell even are you?
The report went nowhere, as TikTok didn’t think his conduct constituted harassment under its guidelines.
Maybe I’m just too old for this, but if this is considered “normal” behaviour it is utterly terrifying.
I’m a grown woman, how do young children cope with this baiting and downright disgusting behaviour?
I know the internet is a dark place, trust me. I grew up in the days of MSN, ChatRoulette and the dawn of the dark web.
But this? I know plenty of kids with access to TikTok.
If I’m exposed to this harassment, what are these kids experiencing?
If I, a grown woman, am overwhelmed by this, how are children supposed to cope with the realities of social media?
EXPERT OPINION: Trolls show desire to reassert control
Dr Veya Seekis BPsych (Hons), PhD | Lecturer School of Applied Psychology
When Ashleigh uploaded a playful TikTok of her mum experimenting with AI, she expected a few laughs, not a lesson in online misogyny.
Within 48 hours, the video was viewed in the tens of thousands.
But it wasn’t long before the comments turned condescending, mostly men lecturing her on how AI really works.
As a woman in media, she’s no stranger to unsolicited correction.
But this time, when she pushed back, the tone changed.
The comments turned personal.
Then violent.
Some told her to kill herself.
One sent DMs laced with intimidation.
She reported him and blocked him. He found her through another account.
This isn’t just trolling, it’s gendered harassment. And it’s disturbingly familiar.
In today’s climate, some men appear increasingly emboldened to put women ‘back in their place’, especially in spaces seen as male territory – tech, politics and even humour.
When women assert competence or claim visibility, they’re often met not just with disagreement, but with shaming, mockery and threats.
Social psychology offers insight into what’s happening here.
Take hostile sexism, which punishes women who challenge traditional roles, or benevolent sexism, which rewards women who conform.
Both can fuel reactions like the ones Ashleigh received: men who first ‘corrected’ her as if she were naive and then vilified her when she refused to stay silent.
Add to that our unconscious need to defend the social order, and it becomes easier to see how digital pile-ons aren’t just random outbursts, but expressions of a deeper desire to reassert control.
As a researcher of objectification, I’m used to examining how women are reduced to their bodies in digital spaces.
But objectification isn’t only visual, it’s psychological.
It’s about reducing women to a function, a stereotype, or a role: to be watched, judged, corrected, or dismissed.
In this case, Ashleigh wasn’t sexualised, she was silenced.
Her real offence, it seemed, wasn’t being wrong, it was not stepping aside to let a man explain it better.
What we often miss in conversations about online abuse is that humiliation is itself a form of objectification.
It says: you are not a full participant here; you are someone we get to manage, correct, or punish.
When women are gaslit, told they’re ‘too sensitive’, or ‘overreacting’, it’s a strategy to destabilise their voice.
To make them second-guess their competence.
To put them back in their ‘place’.
This is the emotional labour many women are forced to perform online; navigating spaces where their intelligence is suspect, and their resistance is pathologised. And when they do push back, they’re often met not with debate, but with abuse designed to shame them into silence.
That’s not discourse. That’s power preservation.
What makes all this worse is that the platforms meant to protect users often do the opposite.
After reporting one of the men who followed her across accounts to continue harassing her, Ashleigh received a response from TikTok saying the behaviour did not meet their criteria for harassment.
How bad does it have to get before it does?
Platforms like TikTok, and many others, continue to rely on narrow, reactive definitions of abuse.
They look for isolated slurs or explicit threats, missing the broader context of patterned, persistent, and gendered targeting.
They treat harassment like a technical glitch rather than what it is: a structural problem rooted in social power.
In doing so, they uphold the very hierarchies they claim to moderate.
We need platforms to do better, not just with detection, but with understanding and compassion.
That means training moderation teams in the dynamics of gendered abuse.
That means giving weight to patterns of behaviour, not just individual messages.
That means believing women when they say they feel unsafe.
And it means actively creating digital spaces where women aren’t just tolerated, but protected, heard, and respected. That includes designing tools that empower users to set boundaries and refusing to treat targeted harassment as a PR issue instead of a systemic one.
Doing nothing is not neutral, it’s a choice that enables harm.
Because this isn’t just about one video, one woman, or one platform.
It’s about the way we still punish women for being smart, speaking up, and not backing down.
It’s about how we dress up control as ‘correction’, and hostility as ‘debate’.
And it’s about who gets to feel entitled to online space, and who gets told, in thousands of ways, that they don’t belong.
If we want a digital world that reflects equity, not hierarchy, we need to stop calling this behaviour ‘harmless’.
We need to name it for what it is: gendered harassment designed to silence.
And we need to start listening – to women, to researchers, and to the patterns that are too often ignored.
Originally published as A warning for parents: Here’s why I would never let my (future) kid use TikTok