Apple and Google mandated by Australia to curb terrorism and child sexual abuse on their platforms
‘There is, of course, no Australian internet, so these standards will require changes by companies no matter where they are headquartered,’ Inman Grant said.
Global tech giants will be forced to tackle child sexual abuse material and pro-terror content on their platforms under two new standards to be enforced by Australia’s eSafety commissioner.
From Dec. 22, tech companies Apple, Google, Microsoft, and Meta will be required to take “meaningful steps” to stop this harmful content from being stored, distributed, or generated.
In what eSafety Commissioner Julie Inman Grant described as a world-first, “nudify apps” and AI tools used to create pornography are also explicitly targeted under the standards.
Inman Grant said the standards were a significant step forward in the battle to protect children online and could have global implications for tech firms.
“These standards will be enforceable and require industry to take meaningful steps to prevent their platforms and services from being used to solicit, generate, store, and distribute the most reprehensible and harmful online material imaginable, including child sexual abuse.”
The new standards are known as the Designated Internet Services (DIS) and Relevant Electronic Services (RES) standards.
They cover file and photo storage services including Apple iCloud, Google Drive, and Microsoft OneDrive, along with chat and messaging services like Meta’s WhatsApp.
Inman Grant said the companies that own and operate these services must take responsibility and deter their “misuse.”
“There is, of course, no Australian internet, so these standards will require changes by companies no matter where they are headquartered.
“We know cloud-based file and photo storage services are often used by those who store and share child sexual abuse material, and we know many popular messaging services are used by these predators to distribute this material, too.”
How Did This Power Come About?
The eSafety Commissioner’s power to create and enforce these standards comes from the Online Safety Act 2021, which gives the commissioner the tools to set rules for online safety and hold tech companies accountable for harmful content.
Because no motion to disallow the standards passed the Parliament, they became legally binding in late December.
Privacy Concerns Raised
Apple expressed privacy concerns about the standards in January 2022, while acknowledging the problem of abhorrent child sexual abuse material and pro-terror content proliferating online.
It raised concerns that eSafety could require providers to build backdoors into end-to-end encrypted services to monitor data they were currently unable to access.
“We believe there are alternative ways to achieve the goal of combatting abhorrent content that do not require undermining the privacy and security of all Australians, and we urge eSafety to allow providers flexibility to pursue those means,” the company said.
Extension Granted on Online Pornography Codes
Meanwhile, the eSafety Commissioner also revealed the tech industry had been granted an extension on draft industry codes aimed at protecting children from online pornography and other high-impact content.
“While much of the public focus has been on the new social media age restrictions legislation, this upcoming regulation is just one interconnecting element of a holistic approach to keeping children safe online which includes continued digital literacy for children and empowerment of parents,” Inman Grant added.
“Our codes and standards, our transparency powers and the age assurance trial currently underway can all support social media age restrictions to provide an umbrella of protection for children and young people.”
The eSafety commissioner is working closely with the government to implement the under-16s social media ban, which will come into effect within 12 months.