“These and other interventions enable WhatsApp to make 1.3 million reports of child sexual exploitation and abuse each year,” she added.
The standards will also cover child sexual abuse material and terrorist propaganda created using open-source software and generative AI. A growing number of Australian students, for example, are creating so-called “deepfake porn” of their classmates and sharing it in classrooms.
“We’re seeing synthetic child sexual abuse material being reported through our hotlines, and that’s particularly concerning to our colleagues in law enforcement, because they spend a lot of time doing victim identification so that they can actually save children who are being abused,” she said.
“I think the regulatory scrutiny has to be at the design phase. If we’re not building in and testing the efficacy and robustness of these guardrails at the design phase, once they’re out in the wild, and they’re replicating, then we’re just playing probably an endless and somewhat hopeless game of whack-a-mole.”
Inman Grant’s office has commenced public consultation on the draft standards, a process that will run for 31 days. She said the final versions of the standards will be tabled in federal parliament and come into effect six months after they’re registered.
“The standards also require these companies to have sufficient trust and safety resourcing and personnel. You can’t do content moderation if you’re not investing in those personnel, policies, processes and technologies,” she said.
“And you can’t have your cake and eat it too. And what I mean by that is, if you’re not scanning for child sexual abuse, but then you provide no way for the public to report to you when they come across it on your services, then you are effectively turning a blind eye to live crime scenes happening on your platform.”
The introduction of the standards comes after social media giant X – formerly known as Twitter – refused to pay a $610,500 fine from the eSafety Commissioner for allegedly failing to adequately tackle child exploitation material on its platform.
X has filed an application for a judicial review in the Federal Court.
“eSafety continues to consider its options in relation to X Corp’s non-compliance with the reporting notice but cannot comment on legal proceedings,” a spokesman for the commissioner said.