Elon Musk’s X can’t get around California’s content moderation law, judge rules

Illustration by Kristen Radtke / The Verge; Getty Images

A federal judge has denied X’s (formerly Twitter) attempt to temporarily halt a California law that pushes social media platforms to disclose their strategies for moderating harmful content.

Passed last year, AB 587 requires large social media companies to share descriptions of how they moderate content that contains hate speech or racism, extremism or radicalization, disinformation, harassment, and foreign political interference. In a complaint filed in September, X argued that the law violates the First Amendment right to free speech.

The company formerly known as Twitter failed to make its case. US District Judge William Shubb denied X’s request for a preliminary injunction of the law. “While the reporting requirement does appear to place a substantial compliance burden on social media companies, it does not appear that the requirement is unjustified or unduly burdensome within the context of First Amendment law,” Shubb writes in his decision issued yesterday.

X didn’t immediately respond to a request for comment from The Verge beyond an automated email that says, “Busy now, please check back later.” The company’s complaint against AB 587 said that it’s “difficult to reliably define” what constitutes hate speech, misinformation, and political interference. It also alleged that AB 587 would force social media platforms to “‘eliminate’ certain constitutionally-protected content.”

Shubb, on the other hand, found that the information AB 587 requires companies to report to the Attorney General twice a year is pretty straightforward. “The reports required by AB 587 are purely factual. The reporting requirement merely requires social media companies to identify their existing content moderation policies, if any, related to the specified categories,” his decision says. “The required disclosures are also uncontroversial. The mere fact that the reports may be ‘tied in some way to a controversial issue’ does not make the reports themselves controversial.”

X has thinned out its ranks since Elon Musk took over last year, with job cuts heavily affecting its trust and safety team. And X’s moderation policies are under scrutiny in Europe now, too. The European Union opened a formal investigation into X this month over whether it has violated the bloc’s Digital Services Act (DSA). “The dissemination of illegal content in the context of Hamas’ terrorist attacks against Israel” is a key concern of the investigation, according to the European Commission. It’s the first time the commission has launched formal infringement proceedings under the DSA. The rules aim to curb illegal activity and disinformation online and went into effect this year.
