Google, Meta, Discord, and more team up to fight child abuse online

Illustration by Amelia Holowaty Krales / The Verge

A new program called Lantern aims to fight online child sexual exploitation and abuse (OCSEA) with cross-platform signal sharing between online companies like Meta and Discord. The Tech Coalition, a group of tech businesses with a cooperative aim to fight online child sexual exploitation, wrote in today’s announcement that the program is an attempt to keep predators from avoiding detection by moving potential victims to other platforms.

Lantern serves as a central database for companies to contribute data and check their own platforms against. When companies see signals, like known OCSEA policy-violating email addresses or usernames, child sexual abuse material (CSAM) hashes, or CSAM keywords, they can flag them in their own systems. The announcement notes that while the signals don’t strictly prove abuse, they help companies investigate and possibly take action like closing an account or reporting the activity to authorities.

Image: The Tech Coalition
A visualization showing how Lantern works.

Meta wrote in a blog post announcing its participation in the program that, during Lantern’s pilot phase, it used information shared by one of the program’s partners, Mega, to remove “over 10,000 violating Facebook Profiles, Pages and Instagram accounts” and report them to the National Center for Missing and Exploited Children.

The coalition’s announcement also quotes John Redgrave, Discord’s trust and safety head, who says, “Discord has also acted on data points shared with us through the program, which has assisted in many internal investigations.”

The companies participating in Lantern so far include Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Members of the coalition have been developing Lantern for the last two years, and the group says that besides creating technical solutions, it had to put the program through “eligibility vetting” and ensure it jibes with legal and regulatory requirements and is “ethically compliant.”

One of the big challenges for programs like this is ensuring they're effective without creating new problems. In a 2021 incident, a father was investigated by police after Google flagged him for CSAM over pictures of his kid’s groin infection. Several groups warned that similar issues could arise with Apple’s now-canceled automated iCloud photo library CSAM-scanning feature.

The Tech Coalition wrote that it commissioned a human rights impact assessment by Business for Social Responsibility (BSR), a larger coalition of companies aimed at global safety and sustainability issues. BSR will offer ongoing guidance as the program changes over time.

The coalition will oversee Lantern and says it’s responsible for making clear guidelines and rules for data sharing. As part of the program, companies must complete mandatory training and routine check-ins, and the group will review its policies and practices regularly.

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.


