“What this all shows is that it’s clear that stronger regulation is needed to ensure that platforms build safeguards in their systems without exception,” Farthing said.
“The current system has left us playing ‘whack a mole’, issuing a takedown notice on a page or a post, but it leaves the system completely untouched, along with millions of other similarly risky bits of content.”
“Australia’s Online Safety Act leaves platforms able to decide what steps they want to take, and offers suggestions, but it’s not clear enough and we need meaningful change.”
An overarching duty of care is needed, according to Farthing, so that the platforms themselves are responsible for keeping each of their systems and elements safe. Reset’s recommended changes would also be platform-neutral, meaning they would apply to whatever platform succeeds TikTok, should it be banned.
“They cannot be allowed to pick and choose which safeguards they use, or which systems they protect, as this inevitably leads to patchy protection,” she said.
“We need stronger accountability and enforcement mechanisms including enhanced civil penalties and the ability to ‘turn off’ services which demonstrate persistent failures.
“We need systemic, future-proofed regulation, otherwise we are not going to be able to safeguard the quality of life that Australians currently have.”
Communications Minister Michelle Rowland said the government expects online platforms to take reasonable steps to ensure Australians can use their services safely, and to proactively minimise unlawful and harmful material and activity on their services.
“No Australian should be subjected to seriously harmful content online, and the Albanese government is committed to ensuring social media platforms play their part in keeping all Australians safe when using their services,” she said.
“In addition to the review, in November I commenced public consultation on amendments to the Basic Online Safety Expectations Determination, to address emerging harms and strengthen the overall operation of the Determination.
“I will settle the proposed amendments as soon as practicable.”
Teal independent MP Zoe Daniel told this masthead that pro-eating disorder content is rife across all platforms.
Last September, Daniel hosted a Social Media and Body Image roundtable, in which sector experts, people with lived experience of eating disorders and parliamentarians resolved to form a new working group.
“One option being considered is strengthening the Online Safety Act,” she said. “I am also looking at the options for increasing the platforms’ responsibility for their systems and the algorithms that deliver harmful content. I will present the recommendations of the working group to the government mid-year.
“This work is vitally important. Anorexia has the highest death rate of any mental illness. I promised I would fight for families experiencing this cruel and relentless illness. Making social media safer is a big part of it.”
X was contacted for comment.
A Meta spokesman said the company is proactively working with Daniel and organisations including the Butterfly Foundation on the issue.
“We want to provide teens with a safe and supportive experience online,” the spokesman said.
“That’s why we’ve developed more than 30 tools to support teens and their families, including tools that allow parents to decide when, and for how long, their teens use Instagram. We’ve invested in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us.
“These are complex issues but we will continue working with parents, experts and regulators to develop new tools, features and policies that meet the needs of teens and their families.”
A TikTok spokeswoman said: “We take the mental well-being and safety of our community extremely seriously, and do not allow content that depicts, promotes, normalises or glorifies eating disorders.
“The highlighted ads go against our policy and have been removed. We are also investigating how they were approved for use. There is no finish line when it comes to the safety of our community and we will continue to invest heavily in our people and systems.”
Facebook whistleblower Frances Haugen, who is visiting Australia, said the negative effects of social media on young girls are typically greater than on boys, given girls spend far more time using it.
Haugen, who worked in Facebook’s civic misinformation team, leaked internal documents showing that the company knew Instagram was toxic to teenage girls, while in public it consistently downplayed the app’s negative effects.
“A lot of our culture puts so much emphasis on appearances for women,” she said. “And when you have such a visual medium, especially one where you get such immediate concrete feedback, it’s all about ‘did you get comments on it, did you get likes on it?’”