TRANSCRIPT
Can we trust tech companies to keep vulnerable Australians safe?
Well, social media company Meta would like you to think they’re up to the task.
They’ve begun an attempt to reshape the narrative around how their platforms, Facebook and Instagram, are affecting children, with Meta’s Regional Policy Director Mia Garlick announcing that teenage users will now face new restrictions on their Instagram accounts.
“Today, we’re excited to share that we’re introducing teen accounts on Instagram that will automatically place teens into built-in protections and reassure parents that teens are having safe experiences. From today, we’ll start the process of teenagers automatically being placed into these new age-appropriate protective settings which limits who can contact them, the content they can see and also provides new ways for them to safely explore their interests and passions. And teens under 16 years of age will need a parent’s permission to change any of the settings to be less strict.”
The announcement follows growing pressure from the Albanese government to address concerns held by parents and academics about harms children are being exposed to on social media.
Meta was also found to be the third most distrusted brand in Australia, according to a new Roy Morgan survey of 2,000 people.
Dr Karley Beckman, a Senior Lecturer in Digital Technologies for Learning at the University of Wollongong, says Meta is trying to regain community trust and avoid the government’s proposed ban on young teenagers accessing their platforms.
“I think it’s definitely a bit of a PR move by Meta. It looks to me like they’re trying to get ahead of the laws and they’re really focused on trying to regain trust with parents. A lot of the conversation and the media around teens on social media has really centred upon the adults in their lives and mainly parents being really concerned for their children’s wellbeing and the effects of social media.”
Lisa Given, Professor of Information Sciences at R-M-I-T University, says Meta is hoping to ease the minds of parents but the move may actually put more responsibility on them to monitor their children.
“So, what they’re basically saying is that this teen account will give parents more control, more visibility. So, for example, parents will now be able to see who their children are messaging with. The challenge is that that is actually putting a lot of onus on parents to actually monitor and police the content instead of say the technology company being responsible for policing it on the other side.”
These new accounts will also set teenage users’ profiles to private by default, enable an automatic sleep mode that mutes notifications between 10pm and 7am, and include a notification to leave the app after 60 minutes of use each day.
Carly Dober, a practising psychologist and Director at the Australian Association of Psychologists, says the changes are welcome, but the announcement doesn’t address many of the issues on the Instagram platform.
“So, I think it’s a start. I don’t think it goes anywhere near far enough. It is talking about a block out period, which is great because teens need sleep. Without sleep, your mental health isn’t going to be fantastic. So that’s really good. Filtering out words for bullying and discrimination and other sensitive words. That’s interesting. I would like to know more about what that actually looks like and if it also covers things like eating disorder content, misinformation, disinformation, because there’s been no talk about that, which is still an outstanding concern, not just for young people, but for all people on these apps.”
The Albanese government’s proposed social media ban promises to cut off access for kids by using new technology that can accurately determine their age.
The government hasn’t settled on the exact age at which kids would be granted access, but it’s likely to be somewhere between 14 and 16 years old.
Academics have criticised the plan as coming much too late and failing to pressure platforms to address the harmful content which can impact Australians of all ages.
Professor Given of R-M-I-T says it’s one of a number of quick solutions to a complex problem.
“Many people are looking, I think for a quick fix. None of us want kids to be harmed online but does that mean we kind of throw baby out with bath water and say they can’t be on any social media platform? That’s not really practical. It’s not really technically possible, but more importantly, there’s a lot of good things on social media that we’d then be excluding those kids from. It can be the same with this new teen Instagram as well, because if kids are especially using social media in order to educate themselves and be part of communities that are fulfilling a need in their lives, and that means that now they’ve got that kind of parental big brother oversight, they may actually choose to walk away and not participate, that could actually end up being more harmful.”
Dr Joanne Orlando, a researcher in Digital Literacy and Digital Wellbeing at Western Sydney University, says she’s not confident that either Albanese’s ban or these new teen accounts will really stop tech-savvy kids who can use V-P-N technology to get around local policies.
“I think the platforms keep bringing out these rehashed parental features, and one of the absolute glaring gaps with this feature is that it’s only being trialled in I think four countries, which means a VPN is easily used once again by young people. They just sign up with an account using access via another country that doesn’t have any laws. So, it’s easily bypassed by kids.”
So, what form of regulation can get to the heart of the issue?
Psychologist Carly Dober says self-regulation by profit-motivated companies is never reliable, and the government needs to challenge tech companies to make significant changes to content moderation on their platforms, not just for kids but for all users.
“I really don’t think that the self-regulation model is helpful or effective. Again, we have seen today that these tech giants, they make changes when they feel like their hands are forced. And even then, personally, professionally, I don’t think it goes far enough. And I argue and I believe strongly that the government must take a stronger approach. The research tells us that when you’re 18, the world doesn’t magically change. Because again, like you said, you can turn 18 and you have full freedom with these apps, and then what happens? You maybe fall into an echo chamber, you become an incel when you haven’t gotten the tools to learn how to navigate the tech space in a healthy and safe way. Again, without any disinformation or misinformation changes, how does it protect you?”
Experts are also pointing to better education for teenagers, so they are equipped with the skills to navigate social media safely and to distinguish between safe and potentially harmful content.
Dr Orlando says this needs to avoid tired moralising that makes social media out to be some big, bad cultural menace, as teenagers are likely to tune that out.
“The way we’re approaching it right now is a bit like how we used to approach sex education. It’s this taboo thing, don’t do it. All the downside of it, and of course young people were still doing it, but just not telling anyone. But now it’s really changed, and we discuss consent and it’s really unpacked so that young people have a really good understanding so they can make the decisions with information and evidence and really make a good decision. So I think that’s what we need to approach with social media, that real kind of holistic, unpacking it all, not with judgment, but helping young people to really understand the space so they can make good decisions around it.”