Meta will hide suicide and eating disorder content from teens as government pressure mounts


Illustration by Nick Barclay / The Verge

Meta is restricting teens from viewing content that deals with topics like suicide, self-harm, and eating disorders, the company announced today. The content, which Meta says may not be “age appropriate” for young people, will not be visible even if it’s shared by someone a teen follows.

If a teen searches for this type of content on Facebook and Instagram, they’ll instead be directed toward “expert resources for help” like the National Alliance on Mental Illness, according to Meta. Teens may not even know when content in these categories has been shared or that it is being hidden from them. The change is rolling out to users under 18 over the coming months.

In addition to hiding content in sensitive categories, teen accounts will default to restrictive filtering settings that adjust the kind of content they see on Facebook and Instagram. The change affects recommended posts in Search and Explore that could be “sensitive” or “low quality,” and Meta will automatically set all teen accounts to the most stringent settings, though users can change them.

The sweeping updates come as Meta and other tech companies are under heightened government scrutiny over how they handle children on their platforms. In the US, Meta CEO Mark Zuckerberg — along with a roster of other tech executives — will testify before the Senate on child safety on January 31st. The hearing follows a wave of legislation across the country that attempts to restrict kids from accessing adult content.

Even beyond porn, lawmakers have signaled they are willing to age-gate large swaths of the internet in the name of protecting kids from certain (legal) content, such as material dealing with suicide or eating disorders. There have been reports for years about teens’ feeds being flooded with harmful content. But blocking everything except what platforms deem trustworthy and acceptable could also cut young people off from educational or support resources.

Meanwhile, in the EU, the Digital Services Act holds Big Tech companies including Meta and TikTok accountable for content shared on their platforms, with rules around algorithmic transparency and ad targeting. And the UK’s Online Safety Act, which became law in October, requires online platforms to comply with child safety rules or risk fines.

The Online Safety Act lays out a goal of making the UK “the safest place in the world to be online” — but the law has had vocal critics who say it could violate user privacy. The encrypted messaging app Signal, for example, has said it would leave the UK rather than collect more data that could jeopardize user privacy.


