Reports find Facebook and Instagram are platforming sexually exploitative parents

Investigations into “child influencer” accounts on Facebook and Instagram have found that Meta is knowingly allowing parents who sexually exploit their children for financial gain on the platform — and in some cases, using Meta’s paid subscription tools to do so. 

According to separate reports published by The New York Times and The Wall Street Journal on Thursday, Facebook and Instagram have become a potentially lucrative endeavor for parents who run social media accounts for children — mostly girls — who aren’t old enough to meet the platforms’ minimum age requirement of 13. Several of the “parent-managed minor accounts” investigated sold materials to their large audiences of adult men, including photos of their children in revealing attire, exclusive chat sessions, and their children’s used leotards and cheer outfits.

Meta staffers found that some parents were knowingly catering content of their children to pedophiles

According to The Wall Street Journal, while these parent-run accounts don’t feature illegal content or nudity, staff at Meta discovered that some parents were knowingly producing material of their children that pedophiles would find sexually gratifying. This included parents having sexually charged conversations about their own children and making them interact with sexual messages sent by subscribers. Meta staff were also allegedly aware that the company’s algorithms promoted subscriptions for accounts featuring child models to suspected pedophiles and that some parents offered additional content of their children on other platforms.

Meta did not immediately respond to a request for comment from The Verge.

Because of the way Meta’s social media algorithms work, even accounts that aren’t intentionally insidious — like those for child models, athletes, and performers — stand to benefit from gaining large audiences of adult men. The Times reports that 5,000 accounts it examined had 32 million connections to male followers. Accounts with high follower counts have their visibility boosted by Instagram, which can lead to discounts and financial incentives from brands. Some companies pay child influencers $3,000 for a single post, and six-figure incomes can be made through monthly subscriptions, according to the Times.

Recommendations made by Meta staff to tackle the issue — such as requiring accounts that sold child-focused subscriptions to register themselves for monitoring or banning subscriptions to such accounts entirely — were apparently not pursued by the company. Instead, Meta focused on building an automated system for preventing likely pedophiles from subscribing to parent-run accounts, though this proved to be unreliable and easily evaded by creating a new account, says the Journal.

Meta’s own moderation tools restricted parents who blocked too many accounts from blocking suspected predators

While it was building this system, Meta expanded its subscription program and “gifts” tipping feature, arguing that such programs were well monitored. The Journal also found that this gifting tool has been misused and that efforts made by some parents to manage who was interacting with their children were thwarted by Meta’s own moderation tools — with accounts that blocked too many followers in a day having their ability to block accounts restricted.

For comparison, TikTok told the Journal that it bans the sale of underage modeling content on TikTok marketplace and through its creator monetization services.

The Times also highlighted the company’s inadequate moderation attempts, noting that Meta responded to just one of the 50 reports the publication made regarding questionable content featuring children over a period of eight months. One internal study conducted by Meta in 2020 and revealed in court documents found that 500,000 child Instagram accounts had “inappropriate” interactions every day.


Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.
