Lurking underneath the teen social media ban is an unsettling game of Would You Rather

In the months since Prime Minister Anthony Albanese mentioned to two commercial radio hosts that Australians under 16 probably shouldn’t be on social media, teenagers, parents and pundits alike have been forced to guess how a seemingly inevitable ban would actually work.

The resulting bill, which broke the land speed record when it passed into law on Thursday night, left most of our burning questions unanswered, but a last-minute amendment contained a surprising clue.

The ink is still drying on section 63DB of the bill — an addition specifying that social media companies can’t insist on government-issued ID as the only means of age assurance.

That eleventh-hour change was designed to safeguard our privacy but may well have triggered the countdown to a somewhat unsettling game of Would You Rather for about 20 million Australian social media users: hand over your ID or your facial data if you want to use the platforms.

Previously, providing ID had seemed the most likely method, given it’s still the only ironclad way to verify a person’s age.

Now, platforms will have to give us at least one other option, and biometric data is the next most likely candidate — specifically facial age estimation technology, which guesses your age based on your appearance.


The privacy commissioner’s recent findings against Bunnings over its use of facial recognition technology in stores are a reminder that our faces are no joke — biometrics are considered one of the most sensitive types of personal data.

While on the face of it, so to speak, blocking social media companies from insisting on ID might seem like a privacy win, some policy experts worry it’s a leap out of one frying pan and into another.

“Perhaps more likely is a situation where platforms opt for privacy-invasive technologies … including the use of biometrics, as they have few other viable options,” said Lizzie O’Shea, chair of Digital Rights Watch.

Why social media companies might want to scan your face

Biometrics aren’t the only non-ID option on the table, but it’s not hard to imagine that, given a choice, many people will feel less nervous about a face scan, similar to the one that unlocks their phone, than uploading their licence.

On top of that, there are already signs that social media companies might favour facial scanning — TikTok, Tinder and Meta have all incorporated some version of the technology already, as part of their voluntary efforts in this department.

Neither option, biometrics or government-issued ID, is ideal from a privacy perspective, but leaving privacy to one side for a moment, there’s still the question of accuracy.

According to an issues paper from the eSafety commissioner, a US study earlier this year examined six algorithms designed to guess a person’s age based on their face alone, and found they work better on some population groups than others.

“For example, some algorithms had higher error rates for people wearing glasses, and error rates were almost always higher for female faces than for male faces,” it found.

“Results from another Australian study also found that accuracy varied based on ethnicity — with higher accuracy for faces categorised as Caucasian, and lower for those in the African category,” according to eSafety’s paper.

Two words that could unravel the ban

The strange part is that, for all the sleep collectively lost over this ban, the law doesn’t actually seem too concerned about how it happens — even if the result is far less dependable than ironclad ID verification.

After all, the law only states that platforms must take “reasonable steps” to stop Australians under 16 from holding an account.

One of the most pressing questions yet to be answered is what “reasonable” really means, and the woman with the answers is eSafety Commissioner Julie Inman Grant.

As the online safety regulator, she’ll have the unenviable task of doing what the government has declined to do in the black letter of the law: telling platforms specifically what’s expected of them.


The big reveal isn’t expected for months, but when she does issue her advice, Julie Inman Grant will be drawing the battle lines for any subsequent court cases she’ll have to fight if a social media company ever wants to challenge one of the $50 million fines threatened in the legislation.

Whatever she tells them, it will only go so far in the eyes of a judge — “reasonable steps” when it comes to online safety is a notoriously hazy turn of phrase once you test it out in Federal Court.

In fact, it’s the same phrase that played a starring role in eSafety’s failed Federal Court bid to force Elon Musk’s platform X to remove videos of the Wakeley church attack earlier this year.

In that case, legal scholars have pointed out, “reasonable steps” were the two crucial words that Musk and his lawyers were able to waltz through.

“The articulation for what ‘reasonable steps’ means is at the heart of this scheme and we might see some platforms try to drive a stake through it in the courts,” warns Alice Dawkins, Executive Director at policy research outfit Reset Tech.

Social media companies may end up in Australian courts if they try to fight the laws. (ABC NEWS: Evan Young)

The government has its reasons for keeping things vague — tech has a habit of evolving faster than laws can be written.

According to the bill’s explanatory memorandum, the wording is designed to “allow for platforms and the age assurance industry to evolve over time and continue to comply with the obligation, without the need for legislative amendments”.

Not that those reasons will help the regulator’s lawyers on the day they square up against a tech giant in the Federal Court.

And of course, it may not come to that.

Social media platforms might do a marvellous job of enforcing Australia’s new age ban, rendering fines unnecessary.

Stranger things have happened.
