Elon Musk’s X marks the spot where free speech comes at a cost

Silicon Valley tech companies are now some of the biggest in the world and operate on the scale of billions of users. But they also operate under a patchwork of government oversight and incoherent regulation across multiple jurisdictions, which has done little to impact their monopolistic positions and outsized economic power, let alone protect users.

The prime minister has admonished social media companies to take more responsibility for the content on their platforms, and has vowed further regulation. But if the government is serious about tackling these problems, it would do better to move away from contentious and ultimately futile debates over content and instead focus on the anticompetitive behaviour that has allowed big tech companies to amass so much market power. Platforms such as Twitter, Meta and Google are no longer just communications platforms; they have become de facto service providers as well as advertising and data-licensing companies, and they should be regulated as such.

We also have to acknowledge that content moderation still matters. But when it comes to regulating it, there is a crisis of legitimacy all around. Whoever makes and enforces the rules on social media – whether big tech or government – will lack legitimacy in the eyes of some, if not most, sectors of society. Many people believe governments regulate content primarily in response to partisan pressures and interests. Nor is anyone happy that a handful of tech CEOs can set the rules and norms for so much of the world’s communication and expression.

But it is much easier to diagnose this wicked problem than to figure out what to do about it. Besides regulating transparency and anticompetitive behaviour, as well as the underlying business models and economic logic of digital platforms, ordinary users and citizens could be brought into the equation on the more contentious questions of content moderation and deplatforming.

Deliberative mechanisms such as “platform councils” – forums made up of everyday digital users and tech experts – can help achieve a more legitimate consensus on the uses and governance of digital platforms. They would allow responsibility and risk around content moderation and user access to be shared among the technology companies developing and running digital platforms, the governments tasked with regulating them, and the people using them. Similar processes, such as citizens’ assemblies, citizens’ panels or consensus conferences, can be convened to inform government regulation and legislation of not only social media companies but also artificial intelligence and other emerging technologies that promise to pose even more complex challenges to democracy.

Ordinary citizens must be given the opportunity to contribute to regulatory decisions. Where it has been piloted, digital deliberative democracy has proved both legitimate and popular: most participants wanted tech companies to use this deliberative format to make decisions in future.

Lydia Khalil is a Research Fellow on Transnational Challenges at the Lowy Institute.
