
Introducing "Lantern": Pioneering Cross-Platform Signal Sharing Program for Child Safety

The Tech Coalition has embarked on an essential mission with the introduction of "Lantern," a groundbreaking program designed to promote cross-platform collaboration in the fight against online child sexual exploitation and abuse (OCSEA). This pervasive issue continues to pose a significant threat that transcends individual online platforms and services. Two of the most pressing concerns are "online grooming," in which predators initiate inappropriate sexualized contact with children, and "financial sextortion," in which offenders coerce payments from young people by threatening to expose intimate images.

Predators frequently employ sophisticated tactics, initiating contact with young people on public forums by assuming false identities, pretending to be peers or friendly acquaintances. Subsequently, they lead their victims to private chat rooms and different online platforms to solicit and share child sexual abuse material (CSAM) or coerce payments by threatening to reveal intimate images.


Given the widespread and cross-platform nature of OCSEA, addressing this issue effectively requires a collective effort among companies. "Lantern" serves as an innovative solution to this challenge, facilitating the secure and responsible sharing of signals between technology companies. These signals can encompass various types of information, such as email addresses, usernames, CSAM hashes, or keywords used in grooming and CSAM transactions.
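
The Tech Coalition has not published Lantern's data model, so the sketch below is purely illustrative. It shows, in Python, one way a shared "signal" of the kinds described above (email addresses, usernames, CSAM hashes, keywords used in grooming and CSAM transactions) could be represented as a record. The class and field names, and the idea of hashing raw identifiers before they are shared, are assumptions made for illustration, not a description of how Lantern actually works.

import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class SignalType(Enum):
    # The signal categories named in this article; the enum itself is hypothetical.
    EMAIL_ADDRESS = "email_address"
    USERNAME = "username"
    CSAM_HASH = "csam_hash"        # e.g. a hash of known CSAM
    KEYWORD = "keyword"            # terms observed in grooming or CSAM transactions


@dataclass
class Signal:
    signal_type: SignalType
    value: str                     # hashed or otherwise minimized identifier
    source_platform: str           # the company that observed the policy violation
    policy_violated: str           # the uploader's child-safety policy that was breached
    observed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def hash_identifier(raw: str) -> str:
    """Hash a raw identifier (e.g. an email address) before it is shared."""
    return hashlib.sha256(raw.strip().lower().encode("utf-8")).hexdigest()


# Example of constructing a signal record from a hypothetical identifier.
example = Signal(
    signal_type=SignalType.EMAIL_ADDRESS,
    value=hash_identifier("offender@example.com"),
    source_platform="uploading-company",
    policy_violated="child-safety policy",
)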


It's essential to note that these signals do not serve as definitive proof of abuse but rather provide valuable clues for further investigation. They can be the missing piece of the puzzle that enables companies to uncover real-time threats to a child's safety.

Until the introduction of "Lantern," there was no consistent procedure for companies to collaborate against predatory actors who managed to evade detection across various online services. This initiative aims to fill this gap and shed light on cross-platform attempts at online child sexual exploitation and abuse, ultimately contributing to a safer online environment for children.


"Lantern" promises to increase prevention and detection capabilities, expedite threat identification, enhance awareness of new predatory tactics, and facilitate the reporting of criminal offenses to the appropriate authorities.


Participating companies upload signals to the "Lantern" program, indicating activity that violates their child safety policies on their respective platforms. Other participating companies can then access and use these signals, applying them to their platforms' policies and terms of service, and taking appropriate enforcement actions. This collaboration allows for a more comprehensive approach to identifying and addressing threats.
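
As a rough sketch of the upload-and-consume flow just described, and assuming the hypothetical Signal record from the earlier example, a company receiving shared signals might match them against its own data and route any hits to human reviewers, since a signal is an investigative lead rather than proof of abuse and enforcement is governed by each platform's own policies and terms of service. The function names and data structures here are invented for illustration.

from typing import Iterable, List

# Assumes the hypothetical Signal and SignalType definitions sketched earlier.

def match_signals(shared: Iterable[Signal],
                  own_csam_hashes: set,
                  own_hashed_emails: set) -> List[Signal]:
    """Return the shared signals that also appear in this platform's own data."""
    matches = []
    for signal in shared:
        if signal.signal_type is SignalType.CSAM_HASH and signal.value in own_csam_hashes:
            matches.append(signal)
        elif signal.signal_type is SignalType.EMAIL_ADDRESS and signal.value in own_hashed_emails:
            matches.append(signal)
    return matches


def enqueue_for_review(matches: List[Signal], review_queue: list) -> None:
    """Send each match to trust-and-safety reviewers; enforcement follows only
    after investigation under the receiving platform's own policies."""
    for signal in matches:
        review_queue.append({
            "signal": signal,
            "status": "pending_human_review",
        })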



During the pilot phase of the program, significant progress was made. For example, MEGA shared URLs with Meta, leading to the removal of over 10,000 violating profiles, pages, and accounts on Meta's platforms. This demonstrates the program's tangible impact on child safety.


The development of "Lantern" has been a thoughtful and meticulous process. Over the last two years, the Tech Coalition has worked closely with several of its member companies to design a program that is not only effective in addressing OCSEA but also legally, regulatory, and ethically compliant. The initiative is marked by a commitment to responsible management through safety and privacy by design, respect for human rights, stakeholder engagement, and transparency.


The Tech Coalition acknowledges that cross-platform signal sharing raises valid concerns and requires diligent oversight. As a result, they are committed to safety and privacy by design, clear guidelines for data sharing, ongoing policy and practice review, mandatory trainings and check-ins, and respect for human rights.


In addition, the program engaged experts and organizations in the fields of child safety, digital rights, advocacy for marginalized communities, government, and law enforcement to gather feedback and insights. Stakeholder engagement remains an ongoing process.

The Tech Coalition plans to include "Lantern" in its annual transparency report and will provide participating companies with recommendations on how to incorporate their participation in the program into their own transparency reporting.


"Lantern" launches with an initial group of companies, including Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Their experiences and insights will be instrumental in evaluating and strengthening this necessary initiative to collaborate against the grave threat of online child sexual exploitation and abuse. The Tech Coalition remains committed to working with companies capable of fulfilling the agreements and requirements of joining "Lantern" and welcomes additional participants as they become familiar with the program's potential benefits and applications in enforcing their respective policies.


In addressing urgent threats with safeguards against unintended consequences, "Lantern" serves as a model for a thoughtful, nuanced approach to online child safety. The program's ongoing development prioritizes privacy and human rights while fulfilling its mission to protect children.


Words from Lantern Participating Companies:


Discord: "Child-harm content is appalling, unacceptable, and has no place on Discord or in society. We work relentlessly to keep this content off our service and take immediate action when we become aware of it. Our participation in the Lantern Program has enabled Discord to have a much wider and more nuanced approach to combating harmful behavior on our platform. Crucially, we’ve been able to scale the actioning of offending accounts and the sharing of bad actor data points relating to the highest-harm abuses of our platform with other participating companies. Discord has also acted on data points shared with us through the program, which has assisted in many internal investigations. The Tech Coalition plays a pivotal role in sharing analysis and actionable threat information with its members to mitigate risks and enhance platform resiliency." - John Redgrave, Vice President of Trust & Safety at Discord


Google: “We commend the Tech Coalition’s leadership in bringing these companies together to further our collective fight against child sexual abuse and exploitation online. This ongoing work and the industry-wide collaboration are incredibly important steps in helping to keep children safe online, and the partnership speaks to Google’s longstanding commitment to preventing CSAE on our platforms. We look forward to exploring how we can best contribute to the program moving forward.” - Laurie Richardson, VP of Trust and Safety at Google


MEGA: "MEGA values the Lantern initiative as we have zero tolerance for sharing of objectionable material such as CSAM. Sharing information across platforms helps us to be more effective in our process to remove objectionable content and to close accounts of anyone sharing illegal content." - Stephen Hall, Chief Compliance Officer, MEGA


Meta: “Predators don’t limit their attempts to harm children to individual platforms, and the technology industry needs to work together to stop predators and protect children on the many apps and websites they use. We’ve spent over a decade fighting to keep young people safe online, and we’re glad to partner with the Tech Coalition and our peers in the Lantern program on this important work.” - Antigone Davis, Global Head of Safety, Meta


Snap: "Nothing is more important than the safety and well-being of our Snapchat community. The exploitation of anyone, especially young people and minors, is illegal, unacceptable, and explicitly prohibited by our Community Guidelines. Preventing, disrupting, detecting, and reporting child sexual exploitation and abuse (CSEA) is a priority for us - which is why we have been working with the Tech Coalition and other companies on innovative industry-wide approaches, like Lantern. Lantern will strengthen our existing capabilities and technologies to identify and combat CSEA and predatory actors. To advance our collective mission of keeping young people safe across platforms and services, we look forward to continuing this collaboration with the Tech Coalition." - Jacqueline Beauchere, Global Head of Platform Safety, Snap

Words from Child Safety Experts:


ECPAT International: “Tackling child sexual exploitation in the digital realm demands not only the willingness but also the legal and technical capacity for platforms to unite in action. Lantern is a crucial step forward. It ignites the path to collaboration—fostering a safe, secure, and privacy-conscious environment where the exchange of crucial signals could pivot a child's fate from exploitation to protection.” - Amy Crocker, Head of Child Protection and Technology, ECPAT International


Lucy Faithfull: “Child sexual abuse happens on so many different apps, websites, and platforms. Right now, people who want to harm children can move freely between them. We congratulate the Tech Coalition on launching project Lantern, an innovative way to try to block this. We encourage tech companies to follow the lead taken by Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch and participate in Lantern to work together to detect harm, prevent abuse, and protect children. The more partners that get involved, the further the reach and wider the impact. It is only by working together that we can place a comprehensive online shield around children.” - Deborah Denis, Chief Executive of The Lucy Faithfull Foundation


The National Center for Missing & Exploited Children (NCMEC): “The National Center for Missing & Exploited Children (NCMEC) applauds the Tech Coalition for the creation of Lantern. We hope that by sharing signals across platforms, the online sexual exploitation of children can be disrupted and more thorough and actionable CyberTipline reports can be made.” - Gavin Portnoy, VP of Communications & Brand, NCMEC
