The escalating concerns surrounding child safety on social media platforms have prompted tech giant Meta, led by Mark Zuckerberg, to actively participate in the Lantern program. Launched by the Tech Coalition, a global alliance of tech companies combating child sexual abuse and exploitation online, Lantern is recognized as the first cross-platform signal-sharing program dedicated to child safety.
As a founding member of Lantern, Meta underscores the importance of robust policy guardrails and of swift redress when those policies are violated. The move also reflects the growing responsibility of tech giants to address child safety amid the rapid rise of social media platforms.
Meta has long used hash-matching technologies such as Microsoft’s PhotoDNA and its own PDQ to detect and remove child sexual abuse material online. Even so, Meta acknowledges that additional tools are needed to stop predators from moving across multiple apps and websites to target children.
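Tools like PhotoDNA and PDQ work by computing a perceptual hash of an image and comparing it against hashes of known abusive material, treating two images as a likely match when their hashes differ by only a few bits. The sketch below illustrates that core idea only; it is not the real PDQ algorithm, and the hashes and threshold are made up for demonstration.

```python
# Illustrative sketch of perceptual-hash matching, the core idea behind
# tools like PhotoDNA and PDQ. Real perceptual hashes are derived from
# image features; the hex strings and threshold here are hypothetical.

def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two equal-length hex-encoded hashes."""
    a = int(hash_a, 16)
    b = int(hash_b, 16)
    return bin(a ^ b).count("1")

def is_match(hash_a: str, hash_b: str, threshold: int = 31) -> bool:
    """Treat hashes within `threshold` differing bits as a likely match.
    PDQ hashes are 256 bits; the threshold is tuned to balance missed
    matches against false positives."""
    return hamming_distance(hash_a, hash_b) <= threshold

# Two hypothetical 256-bit hashes (64 hex chars) differing in one bit:
known_hash = "f" * 64
candidate = "f" * 63 + "e"
print(is_match(known_hash, candidate))  # True: only 1 bit differs
```

Because near-duplicate images (resized, recompressed, lightly edited) produce nearby hashes, this bit-distance comparison catches altered copies that an exact file checksum would miss.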
“At Meta, we want young people to have safe, positive experiences online, and we’ve spent a decade developing tools and policies designed to protect them,” stated the company in an official release.
The Tech Coalition, which operates Lantern, works collaboratively to combat online child sexual exploitation and abuse, aiming to inspire, guide, and support its member companies as they protect children from online threats.
Online child sexual exploitation and abuse (CSEA) poses a pervasive threat across platforms and services, making cross-platform collaboration essential. Lantern addresses this need by identifying signals tied to accounts and profiles that violate CSEA policies, such as inappropriate sexualized contact with a child, online grooming, and financial sextortion of young people.
Signals shared through Lantern include email addresses, hashes of child sexual abuse material, and keywords used to groom children or to buy and sell such material. While these signals are not definitive proof of abuse, they provide valuable leads that help investigations trace perpetrators in real time.
Before the inception of Lantern, the absence of a mechanism for joint action against bad actors allowed many to evade detection. The program enhances prevention and detection capabilities, expedites threat identification, and builds situational awareness across platforms.
Companies participating in the Lantern program can securely share signals about activity that violates their child safety policies. This collaborative approach lets them upload and access signals, review similar activity on their own platforms, and take action according to their own enforcement procedures. The program is managed responsibly, with safety and privacy by design, respect for human rights, transparency, and stakeholder engagement.
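The upload-lookup-enforce loop described above can be sketched as a simple shared store: one platform contributes a signal, and another checks activity on its own service against it before applying its own rules. All names and fields below are hypothetical; the article does not describe Lantern's actual API or data model.

```python
# Minimal sketch of the cross-platform signal-sharing loop. Every name
# and field here is hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class Signal:
    kind: str    # e.g. "hash", "email", "keyword"
    value: str   # the shared indicator itself
    source: str  # platform that uploaded the signal

@dataclass
class SignalStore:
    signals: list = field(default_factory=list)

    def upload(self, signal: Signal) -> None:
        """A participating company contributes a signal."""
        self.signals.append(signal)

    def lookup(self, kind: str, value: str) -> list:
        """Another company checks activity on its own platform against
        shared signals; any hits feed its own enforcement procedures,
        since a signal is a lead to review, not proof of abuse."""
        return [s for s in self.signals
                if s.kind == kind and s.value == value]

store = SignalStore()
store.upload(Signal(kind="hash", value="deadbeef", source="platform_a"))
hits = store.lookup("hash", "deadbeef")
print(len(hits))  # 1
```

The key design point mirrors the article: the shared store only surfaces leads, and each company reviews matches and enforces under its own policies.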
Meta’s engagement with the Lantern program reflects a concerted industry effort to address the complex challenges associated with child safety in the digital realm, emphasizing the shared responsibility of tech companies to create a secure online environment for young users.