Social Media, To Ban or Not to Ban – Part 1

Part 1: The Government’s Opening Salvo

Prelude

I started writing about this topic about four weeks ago. It began life as a subsection in my editorial “Privacy, GenAI, You and Social Media”, where I had the opportunity to ask Zsofi Paterson, the CEO of Tinybeans, some questions about GenAI and social media.

As I chased the topic down the rabbit hole, it grew too vast and deviated from the focus of that editorial. Even after I broke it out into its own piece, the feedback from other DRN editors was that I needed to break it up further.

The TL;DR

This article is the first part of an editorial discussing a proposed social media ban by the Australian Government, which aims to enforce a minimum age of 16 for social media use by the end of 2024. The legislation focuses on protecting children from online harm while supporting parents and carers in managing their children’s online interactions.

Australia is no stranger to offering online protection measures, starting with early internet filtering in schools in the late 1990s and the failed National Filter Scheme in 2007. Despite these efforts, dangers such as sextortion and cyberbullying continue to pose serious risks to children. The Australian Centre to Counter Child Exploitation (ACCCE) reported 40,232 cases of child sexual exploitation in the 2022-23 financial year.

We also cover recent developments such as Instagram’s introduction of “Teen Accounts” for users aged 13 to 17, designed to limit unwanted contact and reduce exposure to harmful content.

In conclusion, this editorial highlights the growing concerns about child safety online, with social media being a significant risk vector. However, dangers also exist in online gaming and on other platforms. The next part of the editorial will explore actions taken by other technology companies to address these challenges.

Introduction

In recent weeks, there has been substantial chatter around the Albanese Labor Government’s intention to introduce legislation enforcing a minimum age for social media and other relevant digital platforms (sixteen years of age being Albanese’s stated preference). A Commonwealth-led initiative focusing on the protection of Australian children from online harm, the legislation will also support parents and carers in protecting their charges.

It is hardly surprising communities have erupted with discussions and divisions regarding this potential legislation. The digital world may be a great levelling medium, providing children all over the world equal access to information and services, but with paradigm shifts come new inequalities.

As a parent myself, balancing and guiding children’s interactions with others in any medium is a work in progress at the best of times. Throw in age gaps of up to ten years, online games and social media interactions, and the capacity to create an absolute legislative nightmare becomes very real.

A Brief (And No Doubt Incomplete) History Of Australian Online Protection

In the late 1990s I was a fresh-faced support technician when the internet first became available in Victorian schools, on dial-up no less. Back then I was doing a lot of Novell (anyone remember them?) BorderManager implementations to filter internet traffic for schools.

A while later the Department of Education introduced a keyword filter to block unsavoury content from school-age children. It didn’t take long for teachers to tell me that students had found ways around it.

Various forms of internet filtering have since been implemented in Victorian Government schools, and other types of schools have created their own internet filtering policies. But to be honest, it is a game of cat and mouse with students. The filtering acts more as a deterrent to the casual “offender” than as Fort Knox-grade security.

In 2007 the Howard Liberal Government created the National Filter Scheme “NetAlert” programme. This initiative made fifteen free software products available for parents and carers to block content on their home computers. Famously, the 2007 NetAlert filter was “hacked” by a 16-year-old in about thirty minutes. Then-Communications Minister Helen Coonan had this to say at the time: “Unfortunately, no single measure can protect children from online harm and … traditional parenting skills have never been more important.”

The National Filter Scheme did not last long and was shut down by the then Labor Communications Minister Stephen Conroy in 2008. Statistics collected when the programme was shut down showed an adoption rate of barely over 10% of the target market. The replacement solution was to be ISP-level internet filtering.

Many industry leaders at the time advocated for better cyber-safety education. Symantec Australia’s Managing Director, Craig Scroggie, noted that “consistent, relevant and memorable” information was needed to protect our children.

These measures were an early attempt to introduce tools to assist parents and children with safe web browsing. At this time browsing was the dominant activity on the internet, along with Internet Relay Chat (IRC) and AOL rooms.

The Apple App Store was only introduced in July 2008 with iOS 2.0, triggering the current era of apps. Notably, the NetAlert filter predates the launch of wildly popular social media platforms such as Instagram (2010) and Snapchat (2011), the latter still particularly popular with teenagers to this day.

Internet censorship in Australia was further broadened in June 2015, enabling court-ordered censorship of websites deemed to primarily facilitate copyright infringement.

The Traps of Social Media

The minimum age for creating a social media account generally seems to be 13 years old. This appears to be the case for Facebook, Snapchat, Instagram, TikTok, X and Discord. However, some basic research shows this varies by country (where legislated), by app and sometimes by the clauses in the app’s Terms of Service.

To say social media can be toxic would be a fair comment. But more broadly, any forum of interaction can be – just have a peek at some subreddits! Yet much of social media does not even have that layer of moderators to run interference.

This issue is obviously on the minds of many Australian parents and of the current Australian government of the day. With a Federal election due within the next twelve months, watch this space.

Brendan Hawke, one of our DRN editors, is at the coal face daily as an educator. One app of significant concern from his experience is Snapchat. He describes it as “the perfect tool for sextortion because kids think they can post stuff and have it disappear. Then it tells you when someone has screenshotted and saved your posts. The problem is horrifically widespread.”

The bigger problem is that kid-safe apps only work if the whole social group adopts a common platform. But the first kid to choose an app is often the one who dictates the platform that the social group connects to en masse.

In Brendan’s experience, the ring leaders are almost always the ones with little or no parental oversight. The safeguarding of our children requires foresight and coordination – between parents and the school community at large. At the same time, a school has limited influence outside of the formal learning environment, which leaves the tricky conversations to be conducted in isolation between parent groups.

Sextortion is only one aspect of abuse that is happening online. Cyberbullying is also prevalent and can fly under the radar when children are too scared to disclose it to trusted adults.

What are the statistics in Australia?

What Brendan was talking about is sexual extortion, also known as sextortion.

The Australian Centre to Counter Child Exploitation (ACCCE) defines it as “a form of blackmail where someone tricks you into sending your sexual images then threatens to share them unless demands are met.”

According to ACCCE statistics, in the 2022-23 financial year, the ACCCE Child Protection Triage Unit received 40,232 reports of child sexual exploitation.

In the 2023 calendar year, they received an average of 300 reports a month. In the first six months of 2024, the numbers have declined to an average of 93 per month. It is early days and there is plenty of work to do to get the numbers to zero.

Looking at statistics provided by the eSafety Commissioner, in May 2023 the office tallied 1,700 sextortion reports in the first quarter of 2023 alone, with 90% coming from males (mostly aged 18 to 24 years).

In the same period of January to March 2023, the eSafety Commissioner received almost 3,000 reports of child sexual abuse material (CSAM). In September 2023 it was reported that one in eight children have been coerced into producing CSAM remotely. “Even in the safety and sanctity of your home, online predators can infiltrate your child’s world through smart devices.”

Analysis also reveals that 25 per cent of all analysed material had been produced in areas of a family home: 16 per cent in a bedroom, 5 per cent in a living room and 4 per cent in a bathroom. An overwhelming 88 per cent of the material featured girls, and 86 per cent featured children of pre-pubescent age.

Clearly social media is not the only avenue where bad actors find their victims, and one of the most at-risk demographics – males aged 18 to 24 – falls outside the age group proposed by the Albanese government.

Who Is Doing What? Instagram

Hot off the press as I work through this editorial: on 17 September 2024 Instagram announced the introduction of Instagram Teen Accounts for all of its users aged 13 to 17. It is a new controlled environment designed to prevent unwanted contact and inappropriate material, as well as offering tools for parents to manage screen time.

Accounts which fall within this age range will transition automatically to the new environment. A quick look at the main items:

Private accounts: With default private accounts, teens need to accept new followers and people who don’t follow them can’t see their content or interact with them. This applies to all teens under 16 (including those already on Instagram and those signing up) and teens under 18 when they sign up for the app.

Messaging restrictions: Teens will be placed in the strictest messaging settings, so they can only be messaged by people they follow or are already connected to on the app.

Sensitive content restrictions: Teens will automatically be placed into the most restrictive setting of Instagram’s sensitive content control, which limits the type of sensitive content (such as content that shows people fighting or promotes cosmetic procedures) teens see in places like Explore and Reels.

Limited interactions: Teens can only be tagged or mentioned by people they follow. Instagram will also automatically turn on the most restrictive version of its anti-bullying feature, Hidden Words, so that offensive words and phrases will be filtered out of teens’ comments and DM requests.

Time limit reminders: Teens will get notifications telling them to leave the app after 60 minutes each day.

Sleep mode enabled: Sleep mode will be turned on between 10 PM and 7 AM, which will mute notifications overnight and send auto-replies to DMs.

Instagram calls it a reimagined experience for teens, and on face value there is some good focus on key issues. Within 60 days, Teen Accounts will be enforced in Australia, the United Kingdom and the United States – three of the five countries in the HMD survey. The EU will follow by the end of 2024, with the rest of the world to follow from January 2025.

Meta has stated that Teen Accounts will be brought to other Meta platforms in 2025.

Interim Conclusion

In this first part we explored a brief and incomplete history of internet filtering in Australia.

We have had a brief but sobering look at the sextortion and CSAM statistics affecting Australia’s children. Child exploitation exists, and social media is just one widely used vector. Yes, some of it comes through social media, but as has long been the case, it happens in online games as well.

Lastly, we covered the recent move by Meta to introduce Instagram Teen Accounts.

In the next part, we will explore what other parties in the technology space are doing about this issue.

