
Remember Jerry! It's not a lie if you believe it!

February 2024

I have been a gamer for as long as I can remember! From playing Asteroids and Galaga on my Atari in the '80s, to Unreal Tournament and Doom over LAN connections in the '90s, right up to Call of Duty and Gran Turismo today, I have always loved gaming!

As a 50-year-old, much to my wife's annoyance 😁, I still have my racing simulator sitting in front of the big screen TV in our Theatre Room at home, so I can smash out a few laps on F1 2023 in between games of PGA 2K23! Though my games are a bit tamer these days, they are great fun and an awesome stress reliever.

This article is difficult for me to write! Finding the balance in understanding the great fun and positivity our kids experience in the gaming world, compared to the horrible stuff I see every day, is extremely challenging. It is a constant battle for me to get my education right.

One of the reasons my talks at schools are so popular is because I can relate to the gaming culture of juniors much more than most other educators in my field. However, the risks and harms I have witnessed across gaming over the past 10 years cannot and must not be underestimated or taken lightly.

Speak to any young gamer, and Discord is the platform at the forefront of the conversation. An online chat environment designed by Jason Citron and released in 2015, Discord has steadily grown into the fastest-rising online network used by juveniles in Australia.

Even though the network (app and website) is rated 13+, almost 40% of Australian children aged 10 to 12 are using it! 66% of those kids are boys, and over 50% of them chat on the network every day. These numbers come directly from the 135,000 Year 5 and 6 students I have surveyed over the past 4 years.

Compared to most of the other networks I discuss, Discord is not the worst design. If used with the right level of caution, filtering, moderation and parental guidance, people of all ages can have a positive and engaging chat experience about anything from online gaming to painting!

However, like all online environments, Discord can be dangerous and can lead users to areas of high risk and harm. Over the past few years I have been involved in direct investigations where some horrific crimes have occurred through the use of Discord.

Quite often I am confronted with questions such as "Why are parents allowing their kids on Discord?" and "Why are these kids using it?" To a degree, these questions are warranted. I discuss Discord at all my parent presentations, with almost 40% of parents unaware their kids are in fact using it.

I get that many parents need to get up to speed and become more aware and involved in the online world of their kids. It is always the first piece of advice I offer them. However, perhaps the questions which should be asked with just as much frustration are, "Why are Discord allowing these children on the network?" and "Why are they not working harder to protect them?"

The fight to protect kids online cannot be one-sided! Parents, educators and users themselves should not be left to carry that burden alone. Citron needs to take more responsibility for the safety of the environment he has created.

Citron recently fronted the US Senate hearing into online child safety, where he was asked to attend voluntarily in order to provide information about what Discord is doing to protect children on the network.

The request was submitted because his network had been responsible for a considerable number of online child exploitation cases in recent times. Discord had also been identified in a number of child kidnapping and child prostitution investigations in the US and overseas (including Australia). Mr Citron rejected the invitation to attend the hearing.

When the committee re-sent the request, Mr Citron was informed that if he refused to attend voluntarily, it might elect to issue a subpoena for him to attend. Mr Citron again refused, so the subpoena was issued and delivered! After Citron refused to accept the subpoena, the Senate committee was forced to send two US Marshals to Discord's head office in California to ensure his attendance and "escort him if required".

In his opening statement to the Senate, Mr Citron, without a hint of embarrassment, stated: "I am pleased to be here today, to discuss the important topic of the online safety of minors."

This attitude from Citron illustrates precisely the matters I have been alluding to for a number of years. On the outside, Big Tech promotes a false concern for the safety of users, when in reality the opposite is true. Citron was pretty much dragged to this hearing kicking and screaming, yet once there, he exuded a persona of true concern.

All five global networks attending the Senate hearing (Meta, X, Snap, TikTok and Discord) boast that they are doing everything they can to make their networks safer. They claim to be spending millions on "tools" and "systems" to "improve user experience". Sadly, in my opinion these systems are merely token in nature, and they are failing dismally.

In 2023, Discord made revenue of $600 million, up from $135 million in 2020. In 2021, the company reached a valuation of $15 billion, so I would expect that figure to be much higher today.

The Discord Network

Discord averages 5 million active users visiting the network each day, whether to chat in gaming servers, participate in video calls and streaming, or share digital content.

However, Discord has only 800 people "working" on the network. The vast majority of those employees work on design and "other areas", with only 120 people working on moderation and safety. This is my most crucial point! 120 people monitoring 5 million users per day? That's over 208,000 users per hour!
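For readers who want to check the arithmetic, a quick back-of-envelope sketch using the figures quoted above (5 million daily users, 120 safety staff):

```python
# Back-of-envelope check of the moderation ratios quoted above.
daily_active_users = 5_000_000  # average daily visitors, as quoted
safety_staff = 120              # moderation and safety staff, as quoted

# Spread evenly across a 24-hour day
users_per_hour = daily_active_users / 24

# How many daily users each safety worker would need to cover
users_per_safety_worker = daily_active_users / safety_staff

print(f"Users per hour: {users_per_hour:,.0f}")                    # ~208,333
print(f"Users per safety worker: {users_per_safety_worker:,.0f}")  # ~41,667
```

Roughly 41,700 users per safety worker per day, which is the scale of the problem being described.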

Each of those users busily posting content, visiting multiple servers, chatting with randoms, streaming video and watching others doing the same. How is this acceptable? Why have we as a society allowed this to happen? Why have legislators failed to act?

Meta chief Mark Zuckerberg is very good at deflecting such questions, stating that he is spending a great deal of money on AI and tools to pick up on harm. Citron is no doubt also relying on software-based mechanisms to pick up on illegal or offending behaviour. But are they working?

I was contacted last year by the older brother of a 13-year-old girl in Perth who was being groomed by a man in Melbourne. They started chatting in a gaming server (Roblox), then shifted the conversation to a private server. The man claimed to be 18 when in fact he was 31. The girl had failed to log out of the server, so when her brother jumped on moments later and read the chat thread, he was shocked by what he was reading.

The boy, sixteen years of age and a student at one of my schools, then contacted me for advice. Obviously we got their parents on the phone and the police were called. The Melbourne man was arrested and charged the following day! I raise this matter for one reason and one reason only.

This 31-year-old man was very careful about what he was saying and posting. His language was precise but guarded. His audio chats were softly spoken, and the manner in which he presented himself was careful, almost rehearsed. He knew exactly what he was doing and was very careful about how he was doing it.

Even the best AI would not have been able to pick up on his grooming behaviour, but a human would have, very early, and especially once the private chat commenced. Her 16-year-old brother knew straight away that his sister was in immediate danger. Why did Discord not suspect the same?

The point of this article is simple. We will never be able to fully eradicate online risk and harm. However, Big Tech has failed monumentally in its methods of identifying and addressing such events. User reports and pleas for help have always been, and continue to be, ignored globally, and millions of users have suffered as a result of that inaction.

Tens of thousands of people on this planet are now dead because Big Tech did not care and failed to act on the most basic requests for action. Requests that, if addressed ethically, could have saved lives.

How many more kids are going to go through hell before they finally start to listen and act?