Social media giants must face child safety lawsuits, judge rules

Meta, ByteDance, Alphabet, and Snap must face a lawsuit alleging their social platforms have adverse mental health effects on children, a federal court ruled on Tuesday. US District Judge Yvonne Gonzalez Rogers rejected the social media giants’ motion to dismiss the dozens of lawsuits accusing the companies of running platforms “addictive” to kids.

School districts across the US have filed suit against Meta, ByteDance, Alphabet, and Snap, alleging the companies cause physical and emotional harm to children. Meanwhile, 42 states sued Meta last month over claims Facebook and Instagram “profoundly altered the psychological and social realities of a generation of young Americans.” This order addresses the individual suits and “over 140 actions” taken against the companies.

Tuesday’s ruling states that the First Amendment and Section 230, which says online platforms shouldn’t be treated as the publishers of third-party content, don’t shield Facebook, Instagram, YouTube, TikTok, and Snapchat from all liability in this case. Judge Gonzalez Rogers notes many of the claims laid out by the plaintiffs don’t “constitute free speech or expression,” as they have to do with alleged “defects” on the platforms themselves. That includes having insufficient parental controls, no “robust” age verification systems, and a difficult account deletion process.

“Addressing these defects would not require that defendants change how or what speech they disseminate,” Judge Gonzalez Rogers writes. “For example, parental notifications could plausibly empower parents to limit their children’s access to the platform or discuss platform use with them.”

However, Judge Gonzalez Rogers still threw out some of the other “defects” identified by the plaintiffs because they’re protected under Section 230, such as offering a beginning and end to a feed, recommending children’s accounts to adults, the use of “addictive” algorithms, and not putting limits on the amount of time spent on the platforms.

“Today’s decision is a significant victory for the families that have been harmed by the dangers of social media,” the lead lawyers representing the plaintiffs, Lexi Hazam, Previn Warren, and Chris Seeger, say in a joint statement. “The Court’s ruling repudiates Big Tech’s overbroad and incorrect claim that Section 230 or the First Amendment should grant them blanket immunity for the harm they cause to their users.”

Google spokesperson José Castañeda says the allegations in these complaints are “simply not true,” adding the company has “built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls.” Snap declined to comment, while Meta and ByteDance didn’t immediately respond to The Verge’s request for comment.

Numerous lawsuits have argued that online platforms include “defective” features that hurt users, but these claims — including a high-profile suit over harassment on Grindr — have often been thrown out in court. As more studies show evidence of the potential harm social platforms may be causing children, lawmakers have pushed to pass new laws specifically targeting child safety, including requirements for age verification. This ruling doesn’t determine that social platforms are causing harm or hold them legally liable for it, but it could still pave the way for a slew of safety claims even without new laws — and make the legal defense against them harder.

Update November 14th, 4:26PM ET: Added a statement from Google.

Update November 14th, 5:15PM ET: Added that Snap declined to comment.
