Both states enacted their laws before Musk bought what was then Twitter, sacked most of its moderators, reinstated many of the users banned by the previous management and opened the platform to posts, some involving hate speech, antisemitism and disinformation, that have cost X, as it is now known, half its advertising revenue, many of its users and as much as $US30 billion ($45.9 billion) in value. There is now a strong conservative presence on X.
The legislation also predates the launch of Donald Trump’s Truth Social, in 2022.
If the states’ legislation were upheld by the Supreme Court, the social media companies would effectively have to host a lot of content they currently screen out.
The Texas law prohibits them from blocking, banning, de-platforming, de-boosting, restricting or denying equal access or visibility to any user, with limited exceptions for posts involving violent threats or child exploitation. It stops them from exercising editorial judgments on the views expressed.
Florida’s law prohibits the platforms from removing the accounts of political candidates or suppressing posts by or about them, and stops them from taking any action to censor, de-platform or “shadow ban” a “journalistic enterprise” based on its content.
Both laws would allow users to sue the companies for alleged censorship.
Section 230 of the Communications Decency Act (a piece of legislation that is contentious, for differing reasons, within both of the major political parties), however, shields the companies from liability for the content posted by their users and effectively leaves any regulation of content to the platforms.
That section, which states that no provider or user of “an interactive computer service” should be treated as the publisher or speaker of any information provided by someone else, has been described as the “26 words that created the internet.”
At face value it might appear to strengthen the view that the platforms, having opened themselves to most speakers and speech, are common carriers for their content. If they are not deemed to be publishers, then they are akin to regulated businesses like telephone companies or other utilities.
The focus in the court, however, appears to be far more on the first amendment and whether governments ought to be able to dictate whether, or how, the platforms edit or otherwise police the content they host.
Unsurprisingly, the platforms, via their trade group, NetChoice, argued that they aren’t common carriers and that they exercise editorial judgment, like newspapers, when they decide what content to suppress or amplify, and that their decisions are therefore protected by the first amendment.
They have said that upholding the states’ laws would compel them to disseminate a flood of objectionable views, including Russian or terrorist propaganda, neo-Nazi commentary, or posts encouraging children to engage in risky or unhealthy behaviours. It could, they have argued, undermine democracy and foment violence.
There were some obvious differences among the justices on the issue of whether the platforms are protected by the first amendment.
Chief Justice John Roberts asked whether, “since we’re talking about the first amendment”, the court’s concern should be with the state regulating what has been called the modern town square. He said the first amendment puts a “thumb on the scale” when that question is asked, because it prohibits the government, not private entities, from censoring speech.
Justice Samuel Alito asked whether “content moderation” was anything other than a euphemism for censorship, saying that the term struck him as Orwellian. Justice Clarence Thomas asked whether the companies were seeking constitutional protection for censoring others’ speech and said he knew of no protected speech interest in censoring others’ speech.
Justice Brett Kavanaugh, however, disagreed that the actions of private companies could be regarded as censorship.
“When I think of Orwellian, I think of the state, not the private sector, not private individuals,” he said.
The stakes in the courtroom are potentially wider than the issue of whether the platforms have the right and the discretion to moderate their content as they see fit, because companies like Meta (Facebook’s parent) or Alphabet (the parent of Google and YouTube) do a lot more than operate social media sites.
Does the first amendment, for instance, enable Meta to “censor” conversations on WhatsApp or Google to “moderate” Gmail messages? Does Uber have the right to screen passengers and decide whether or not to accept them? Similar questions could be posed of other companies in the sharing economy.
It is, of course, possible for the court to confine any decision, whether for or against the two states’ laws, to social media.
Whatever it decides (there are suggestions it could refer some questions back to the lower courts), it will shape, or reshape, social media platforms and the nature of the new town squares the justices and lawyers keep referring to, potentially in quite radical ways and, if the platforms lose, perhaps in ways that are unmanageable and (as Musk has discovered) threaten the economics of their platforms.