Meta is expanding child safety measures as government and press reports mount

Illustration: Nick Barclay / The Verge

In a blog post published today, Meta says it’s expanding and updating the child safety features on its platforms, even as reports pile up about how those same platforms recommend content that sexualizes children.

Over the course of several months, The Wall Street Journal has detailed how Instagram and Facebook serve up inappropriate and sexual child-related content to users. In June, a report detailed how Instagram connects a network of accounts buying and selling child sexual abuse material (CSAM), guiding them to each other via its recommendations algorithm. A follow-up investigation published today shows how the problem extends to Facebook Groups, where there’s an ecosystem of pedophile accounts and groups, some with as many as 800,000 members.

In both cases, Meta’s recommendation system enabled abusive accounts to find each other through features like Facebook’s “Groups You Should Join” or hashtag autofill on Instagram. Meta said today that it will place limits on how “suspicious” adult accounts can interact with each other: on Instagram, they won’t be able to follow one another, they won’t be recommended, and comments from these profiles won’t be visible to other “suspicious” accounts.

Meta also said it has expanded its list of terms, phrases, and emojis related to child safety and has begun using machine learning to detect connections between different search terms.

The reports and the resulting child safety changes come as US and EU regulators press Meta over how it keeps kids safe on its platforms. Meta CEO Mark Zuckerberg, along with a slate of other Big Tech executives, will testify before the Senate in January 2024 on the issue of online child exploitation. In November, EU regulators gave Meta a deadline (which expires today) to provide information about how it protects minors; they sent Meta a new request today, specifically noting “the circulation of self-generated child sexual abuse material (SG-CSAM) on Instagram” and the platform’s recommendation system.

In late November, the dating app companies Bumble and Match suspended advertising on Instagram following The Journal’s reporting. The companies’ ads were appearing next to explicit content and Reels videos that sexualized children.

