Safer, Built by Thorn

Software Development

El Segundo, California · 1,614 followers

Proactive child sexual abuse material (CSAM) detection built by experts in child safety technology

About us

Safer was built by Thorn to fill the need for a solution that could adequately tackle child sexual abuse material (CSAM) and online child sexual exploitation (CSE). With Safer, any platform with an upload button can access industry-leading tools for proactive CSAM and CSE detection. Safer detects verified CSAM using hash matching, detects novel image and video CSAM using machine learning (ML) classification models, and predicts potential text-based harms that include or could lead to child exploitation, such as sextortion and discussions of CSAM. Platforms don't have to tackle this issue alone. We can take meaningful action together. With a relentless focus on CSAM and CSE detection strategies via state-of-the-art AI/ML models, proprietary research, and cutting-edge detection solutions, Safer enables digital platforms to create safer user experiences.
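
A minimal sketch of that two-stage flow appears below, assuming a hypothetical integration: check an upload against a list of hashes of verified CSAM first, and only run an ML classifier on content with no match. This is not Safer's actual API; the names (KNOWN_HASHES, classify_novel_content, review_upload) are invented for illustration, and a production system would use perceptual hashing and vetted hash lists rather than a plain SHA-256 digest.

    # Hypothetical sketch, not Safer's API: hash-match known content first,
    # then fall back to an ML classifier for novel images and videos.
    import hashlib

    # Placeholder for a vetted set of hashes of verified CSAM. Real deployments
    # use perceptual hashes so near-duplicates still match; SHA-256 is used here
    # only to keep the example self-contained.
    KNOWN_HASHES: set[str] = set()

    def classify_novel_content(file_bytes: bytes) -> float:
        """Stand-in for an ML model that scores previously unseen content."""
        return 0.0  # a real classifier would return a probability

    def review_upload(file_bytes: bytes, review_threshold: float = 0.9) -> str:
        digest = hashlib.sha256(file_bytes).hexdigest()
        if digest in KNOWN_HASHES:
            return "match: verified CSAM, escalate and report"
        if classify_novel_content(file_bytes) >= review_threshold:
            return "flagged: possible novel CSAM, send to human review"
        return "clear: no action"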

Website
https://bit.ly/3BNWroZ
Industry
Software Development
Company size
51-200 employees
Headquarters
El Segundo, California
Founded
2019
Specialties
Child safety, Platform safety, Online safety, Content identification, Image identification, Video identification, and Content moderation

Updates

  • Looking ahead: 2025 online child safety predictions from the research team at Thorn

    Thorn’s research team is constantly analyzing emerging trends in online child safety, and they’ve put together an insightful list of predictions below. These 2025 predictions are based on the trends we’ve been seeing in our own research, unpacked with a deep understanding of the online child safety ecosystem and how these trends could impact Trust & Safety. Here are five key developments we expect to shape the landscape in 2025:

    1. Regulatory pressure: Companies will navigate an increasingly complex web of global requirements.
    2. Reporting systems: Platform safety mechanisms will face unprecedented public scrutiny.
    3. Trust & Safety evolution: The field will continue transforming into a distinct professional discipline with increased influence.
    4. AI threats: New threats will emerge, but so will innovative safety solutions.
    5. Caregiver challenges: Families will face new decisions about digital access and safety.

    These predictions underscore both the challenges ahead and the critical importance of proactive safety measures in our digital world. Head to the Safer blog to read our full analysis and learn how these trends could impact your organization and the broader tech ecosystem: https://lnkd.in/e8pyKvd5

  • 🔗 The intersection of sexual abuse, monetization and technology has been highlighted in recent headlines. See how criminals are misusing technology to profit from sexual abuse. ⤵️ Plus, top lines with links and summaries for the latest related child safety headlines:

    ✔️ Child safety (news from Roblox, Telegram, and the Australia ban)
    ✔️ Generative AI (case study, cleaning training data and safety)
    ✔️ Scaling trust and safety (Bluesky)

    The Digital Defender newsletter gathers the trust & safety news you should know into bite-size servings. Each month, we gather top headlines and give you a quick summary of the most consequential stories impacting online child safety. Thanks for reading and for being a champion for child safety! #trustandsafety #childsafety

    The Intersection of sexual abuse, monetization and technology

  • Safer, Built by Thorn reposted this

    Partnership on AI · 19,900 followers

    NEW 🚨 We're proud to share five new case studies exploring how leading organizations are mitigating synthetic media risks through PAI's Synthetic Media Framework: https://buff.ly/49QxWXG These studies delve into an underexplored area of synthetic media governance known as direct disclosure: the methods or labels used to convey how content has been modified or created with AI. Read about:

    🔹 How Meta updated its approach to direct disclosure based on user feedback
    🔹 How Microsoft gave users detailed context about media on LinkedIn
    🔹 How Truepic used disclosures to help authenticate cultural heritage imagery in conflict zones
    🔹 How child safety nonprofit Thorn's analysis evaluated mitigating the risk of generative AI models creating Child Sexual Abuse Material
    🔹 How scholars from the Stanford Institute for Human-Centered Artificial Intelligence (HAI) analyzed direct disclosure's limited impact on AI-generated Child Sexual Abuse Material

    Alongside our existing case studies, the library provides critical insights into the evolving landscape of AI content transparency and responsible technology deployment. Learn more about these case studies and their policy implications here: https://buff.ly/49QxWXG #AIEthics #SyntheticMedia #ResponsibleTechnology #MediaTransparency

  • Safer, Built by Thorn reposted this

    Thorn · 32,179 followers

    Thorn and the WeProtect Global Alliance have partnered on a groundbreaking study to identify the top technologies that will guide the fight against child sexual abuse and exploitation in the digital age. We call it the Evolving Technologies Horizon Scan project. This unique survey brings together the long-standing expertise and informed perspectives of roughly 300 multidisciplinary stakeholders from around the world to analyze critical technology trends whose evolution stands to significantly impact online child safety in the next 5-10 years. From predictive AI and generative AI to end-to-end encryption (E2EE), we’ve sought to highlight substantial risks and explore potential solutions to better safeguard children’s privacy. Together, we’ve developed a full report that comprehensively looks at the most pressing technology trends and their potential intended and unintended effects on child safety online. 👉 Dive into our findings: https://lnkd.in/enNGedaH

  • 🗣 Upcoming Webinar: “Hope, Vision & Impact: Leaders Shaping the Future of Child Safety” Join us next Wednesday (12/11) for a compelling discussion about safeguarding online communities. We’ll be joined by Julie Cordua, CEO of Thorn; Sara Clemens, Interim Board Chair of Thorn; Julie Inman Grant, Australia’s eSafety Commissioner; and Brigette De Lay, Director of the Prevent Child Sexual Abuse Programme at Oak Foundation. These remarkable leaders will discuss innovative strategies for overcoming online safety challenges and their vision for safer digital environments, drawing upon their expertise in tech, government, and nonprofits. Learn about:

    📣 Cross-sector collaboration to accelerate change
    🌐 Global insights on online safety
    💡 Strategic philanthropy’s role in long-term solutions
    ✨ What’s ahead for child safety in 2025

    There’s still time to register! Sign up here: https://lnkd.in/gCmVFjxm

  • When platforms experience rapid growth, content moderation becomes even more critical. Just ask Bluesky Social, which welcomed 7M+ new users in recent weeks. By implementing Safer, Built by Thorn from its earliest days, Bluesky demonstrated why safety by design isn't just good practice: it's essential for sustainable growth. In our latest blog post, Thorn VP John Starr and Bluesky Head of Trust & Safety Aaron R. break down:

    • Why rapid user migration demands robust safety measures
    • How AI-powered solutions like Safer protect growing communities
    • The importance of proactive safety infrastructure

    Read more about Bluesky’s growth and how it uses solutions like Safer to set standards for platform safety: https://lnkd.in/gcVf6E9b

  • 🔗 AI-generated child sexual abuse content is on the rise. So are indictments. Get a snapshot of AIG-CSAM risks and law enforcement's response. ⤵️ Plus, top lines with links and summaries for the latest related child safety headlines:

    ✔️ Social media (news from Instagram, Roblox, TikTok, Snapchat)
    ✔️ Generative AI (Thorn’s Dr. Rebecca Portnoff featured in TechCrunch’s Women in AI series)
    ✔️ Legislation and litigation (Online Services Act resource, the constitutionality of childproofing the internet)

    The Digital Defender newsletter gathers the trust & safety news you should know into bite-size servings. Each month, we gather top headlines and give you a quick summary of the most consequential stories impacting online child safety. Thanks for reading and for being a champion for child safety! #trustandsafety #childsafety

    AI-generated child sexual abuse content is on the rise. So are indictments related to this content.
