Ignorance

Everyone else is responding, while those who should act refuse to!

January 2025

WARNING: The following article discusses suicide and online child exploitation. Reader discretion is advised.

This week, I read a couple of research papers out of the US which appear to contradict a great deal of evidence collected by myself and my peers in this field. The papers allege there is no correlation at all between social media (SM) use and adolescent mental health.

These reports draw their conclusions from research collected between 2006 and 2014, with MySpace and Facebook the main networks analysed. Of the 220-plus reports commissioned over the past 19 years and referred to by many commentators in this space, over 35% do not even mention TikTok or Snapchat, the primary apps used by teens today.

The evidence we need to hear and truly evaluate is what is being collected right now, and over the past ten years at most. Data collected before 2015 does not truly reflect the massive rise in teen and pre-teen use of SM. More importantly, it does not address the methods being used by Big Tech to attract and engage adolescent users.

I believe it is extremely important to question any research coming out of the US over that country’s next term of government. It is very apparent Big Tech have aligned themselves with the incoming leadership, which has already initiated significant changes in the USA’s response to online safety. In fact, I have seen more papers touting how SM does not impact adolescent mental health released out of the US in the last month than I have in the past seven years!

Whether it is my footy coaching or my cyber safety education, I am massive on stats! Gathering data is a great tool for assessing the need for change and for identifying trends, risks or room for improvement. As someone who has been at the coal face of juvenile SM use for 14 years, I have had the privilege of speaking with well over half a million Australian children aged 10 to 18 years. Over the past 7 years, I have surveyed more than 50,000 Australian children.

In no way should my research be the only data we rely on to assess or implement change in this country or globally. Nor should cyber safety advocates be the only voices listened to when it comes to understanding the risks or benefits of the online world. Though I do offer them my understanding, I feel some people in my field are too aggressive in their approach to getting kids off the internet.

It is always important for me to remain as balanced as I can. Whilst I do not want to be an old, grey-haired guy who comes out to schools to tell kids off, I still tell my juvenile audiences about the risks the internet does present to them.

But we must also not overlook the massive number of children in Australia who are using SM effectively and responsibly. They are successfully walking the tightrope that is juvenile SM use and engaging in so many positive and fun ways that I never had as a kid. In my education, I discuss with them how delicate that rope is.

Over the past two years in particular, my education has changed somewhat. Though I still discuss the risks and positives of their online world, I have started to share with them exactly what I worry about. I outline that my fear and that of their parents is not so much ‘what if’ they stuff up online, but ‘when’!

This is a very difficult world for me. Over the past sixteen years, I have seen the worst the online world has to offer. I have dealt with the offenders and victims of some of this nation’s worst online crimes, and I have seen children suffer the most brutal acts at the hands of adults. I have worked directly with the families of five children who took their own lives as a result of online harm. Each week, I worry about which of ‘my kids’ might be next to make that terrible decision.

I don’t make such comments to seek sympathy, but to offer perspective. The life of a cyber safety advocate is tough and so very often it is overwhelming. The past year for me in particular has been extremely difficult. I have fought against threats from Big Tech and have battled with relentless feelings of wanting to simply give up because of how high this mountain is becoming. But I have to keep going because of what I am seeing and hearing directly from the kids I speak to.

I should be agreeing 100% with the ban introduced by our Government. I should be demanding kids under the age of 18 be removed from SM, and my education should reflect unwavering support of that. But please understand, it is not as simple as that!

Governments are failing our children by placing flimsy fences around environments which have been allowed to attract and entrap them with impunity. The onus to protect children on social media has been left to parents, educators, advocates and even children themselves, whilst Big Tech have been empowered to build their environments without the same regard for safety.

Kicking kids off SM does nothing to address the unethical design of SM networks and fails to ensure Big Tech act on a safety-by-design principle for their environments. They are being left to ply their trade without regulation or responsibility, ready to grab that 16-year-old Aussie kid as soon as they are of age!

As a kid in the ’80s, my equivalent of the online world was my local park in Albany. Mum would say don’t talk to strangers, and protective parents would keep an eye out as we played. Today, that analogy still holds, except now Big Tech sits just around the corner in their smelly old van, wearing their trench coat, with a pocket full of lollies, waiting for that unsuspecting kid to stray!

The reason I am so critical of Big Tech is what I am experiencing most in my schools. Out of everything I deal with, it is the lack of response from SM networks to reported content which is damaging our kids the most. Being ignored and betrayed by Meta, Snap Inc and ByteDance when they desperately need to be heard is negatively impacting juvenile mental health globally. Being ignored in this way is damaging our children, and at higher levels than ever before, it is killing them!

Ignorance

My opening statement to all high school kids I speak to in Australia is this: “You will not realise how poorly you are protected online until something goes wrong. I will try to help you, as will your teachers and parents, but the internet does not give a damn about you. I do!”

That statement has reflected my lived reality since my first day at the Technology Crime Investigation Unit in September 2009, and it has not changed. The kids need to hear it. I would love it if none of them were on SM, but between 68% and 86% of 12 to 16-year-olds are.

A great many of these kids do not encounter any major issues at all, or can successfully navigate risk and harm as it presents itself. Many also have parental support, with restrictions in place to minimise risk and systems in place to build a culture of appropriate use. But even with the best intentions and guidance, our kids are being targeted and then thrown on the trash heap when something goes wrong.

We, as a society, have allowed this evolution of indifference to occur. We have permitted Big Tech to devolve their responsibility to protect users, and over the past 10 years it is our kids who have suffered the most as a result. We blame parents, educators, legislators and others for their failures to act, while the main offenders have been allowed to build without rules or responsibility.

VICTIMS ARE IGNORED

Between March 2022 and September 2024, 324 young Australians personally reported an offence of sextortion to me. 268 of these young people, aged 13 to 21, sent a sexual image or video via Instagram after being coerced by a scammer. The average age was 16 years, with 262 being male, 4 female and 2 transgender.

Of the 268 scam accounts on Instagram:
    • 7 (2.6%) were removed by Instagram within 12 hours, after being reported 1 to 5 times.
    • 31 (11.5%) were removed by Instagram within 24 hours, after being reported between 6 and 10 times.
    • 45 (16.8%) were removed by Instagram within 36 hours, after being reported between 11 and 20 times.
    • 64 (23.8%) were removed by Instagram within 48 hours, after being reported 21 or more times.
    • 72 (26.8%), when reported, returned a message advising the account did “not go against our Community Guidelines”, and remained active on the network.
    • 49 (18.2%) were reported numerous times without response and remained active on the network.

Of the 72 accounts which returned a message advising the account did “not go against our Community Guidelines”:
    • 16 (22.2%) were removed within 24 hours, after being reported a further 20+ times.
    • 18 (25%) were removed within 48 hours, after being reported a further 20+ times.
    • 27 (37.5%) were reported several more times, returned the same ‘not against guidelines’ message and remained active on the network.
    • 11 (15.2%) were reported numerous times with no further response from Instagram and remained active on the network.

When discussing the impact SM has on juvenile mental health, it is vitally important to take this lack of response from the networks into consideration. Sadly, this aspect of the debate is rarely discussed.

A teen who may, for some time, have been using SM without incident to engage with peers and enjoy harmless content can, within moments, have their life turned upside down. After a regretful decision, or when trying to back away from an error of judgement, any teen or pre-teen will attempt to self-manage that error first. Telling a parent, peer or cyber safety educator is usually not their first thought, and in many instances it is the last thing they want to do.

This complete lack of response from SM in the first instance is a massive contributor to the poor mental health being experienced by Australian children. In many instances, this abandonment has resulted in children taking their own lives. The case of Ryan Last in the US illustrates this precisely. Within four hours of Ryan being sextorted, he was dead! He was abandoned by SM.

Ryan Last

If a lifeguard at a beach ignored a drowning child, would we accept that? If a teacher ignored a child being beaten by other students in the playground, would they keep their job? If the owner of an adult porn shop allowed a 10-year-old to enter the store and flick through the magazines, would we accept that? In real-world society, we would deride such ignorance of child safety. Yet we accept Big Tech not being held to an equal expectation.

I do not overlook the argument that many kids should not be on SM. I am also never one to shy away from telling parents to take the lead role in the protection of their children online. I have been working in schools for fourteen years to drive better education, and I have been pushing governments here and overseas to respond legislatively to the dangers of the online world.

What has remained constant over the past sixteen years, though, is the ignorance of the SM networks towards the harm within their environments and the protection of their users. This is set to worsen as Meta and X begin to scale back moderation and safety mechanisms, and as TikTok continues to face pressure to concede to US-based ownership.

Everyone else seems to be stepping up to the plate to protect kids online except those who are creating the playgrounds our kids are in. They are enticing them in with open arms and false promises, yet when our kids need their help, those same SM networks are turning their backs.

They have blood on their hands.