OpenMined

Technology, Information and Internet

New York, NY 6,640 followers

Building the public network for non-public information

About us

OpenMined is a non-profit foundation creating open-source technology infrastructure that helps researchers and app builders get answers from data without needing a copy or direct access. Our community of technologists is building Syft: the public network for non-public information.

Learn more → www.openmined.org

Supported by the OpenMined Foundation, the OpenMined Community is an online ecosystem of over 17,000 technologists, researchers, and industry professionals keen to unlock 1000x more data in every scientific field and industry.

Join us on Slack → slack.openmined.org
LinkedIn → linkedin.com/company/openmined
X/Twitter → x.com/openminedorg
YouTube → youtube.com/@OpenMinedOrg

Industry
Technology, Information and Internet
Company size
11-50 employees
Headquarters
New York, NY
Type
Nonprofit
Founded
2017

Updates

  • 🎙 Upcoming Privacy Tech Talk: What Should We Trust in Trustworthy Machine Learning?

    Join Andrew Trask on February 24th at 1:30 PM ET for an exciting presentation by Aaron Roth, Henry Salvatori Professor at UPenn.

    #MachineLearning is impressive, but imperfect: it makes errors. When we use ML predictions to take action, especially in high-stakes settings, we need to be cognizant of this fact and take our probabilistic uncertainty into account. There are many ways of quantifying uncertainty, but what are they good for? Tune in to learn more.

    🎥 The event will be live-streamed on YouTube.
    🔗 Links to register and submit Q&A are in the comments ↓

    #PrivacyTechTalkSeries

  • AI researchers and dataset owners: the National AI Research Resource (NAIRR) Pilot is seeking datasets to help expand AI education and skill development across the U.S. Your dataset could play a key role in shaping the future of AI learning!

    We're especially looking for datasets that:
    ✅ Support AI education in critical fields like health, climate, and manufacturing
    ✅ Help build a more diverse AI research community
    ✅ Come with clear documentation and support resources

    📅 Deadline: February 7, 2025

    We strongly encourage submissions from EPSCoR jurisdictions and Minority Serving Institutions to help create a more inclusive AI research ecosystem.

    Learn more & submit your dataset via the link in the comments ↓

  • We are proud to be among the 50 projects selected by the Paris Peace Forum from 770 applications across 111 countries for the #AIActionSummit.

    🗓️ We'll be in Paris on February 10-11 to showcase ethical and inclusive AI solutions, with a focus on our important work with the Christchurch Call Foundation.

    👉 Read more about it on our blog: bit.ly/4hl59h2

    View organization page for Paris Peace Forum


    🌍 770 applications, 111 countries represented, 50 projects selected!

    The AI Action Summit, taking place on February 10-11 in Paris, will spotlight innovative solutions from the Call for AI Projects launched by the Paris Peace Forum.

    These groundbreaking projects address key challenges:
    👁️ Support for blind and visually impaired people
    🛡️ Tackling digital violence
    🎗️ Cervical cancer screening
    🗣️ Fighting hate speech
    ⚖️ Strengthening labor rights
    And much more…

    Discover them now 👉 bit.ly/4hx5OMP

    🙏 Thank you to all the organizations who shared their projects. Together, let's build a future where AI serves everyone.

    #AIActionSummit

  • 🎥 Webinar: Curious about Privacy-Preserving Machine Learning (PPML)?

    Join our FREE webinars to discover how to build AI systems that protect data privacy while delivering powerful insights.

    Two sessions:
    • December 4th (this Wednesday!)
    • December 18th

    What you'll learn:
    • Core concepts of #PrivacyPreservingMachineLearning
    • The basics of #FederatedLearning, #DifferentialPrivacy, and #HomomorphicEncryption
    • Real-world applications and use cases using new tools

    Perfect for data scientists, ML engineers, and privacy enthusiasts looking to stay ahead in the evolving AI landscape.

    Learn More & Register now → https://bit.ly/3ZkxLj4

  • 🎙 Upcoming Privacy Tech Talk: Reconciling Computer Science and Legal Approaches to Privacy

    Join Andrew Trask on November 25th at 1 PM ET for an insightful talk with Kobbi Nissim, Professor of Computer Science at Georgetown University and a pioneer of #DifferentialPrivacy. Kobbi will discuss the challenges of reconciling legal and technical approaches to privacy and explore how we can bridge these gaps to create systems that meet both legal standards and technical realities.

    🎥 The event will be live-streamed on YouTube.
    Register now to receive a link: https://bit.ly/3Z5gTgn
    Submit your questions for the Q&A: https://bit.ly/4fG8C9D

    #PrivacyTechTalkSeries

  • OpenMined reposted this

    View profile for Moisés Vargas

    AI/ML Driven Developer: Human-in-the-loop Crafting Software with Ruby, Python, JavaScript, UNIX/AWS

    Day 1: Federated Learning + Privacy Technologies: A First Dive 🔐🤖 #30DaysOfFLCode

    I aim to explore Federated Learning (FL) and how it enables collaborative machine learning while preserving privacy. My goal? Understand its core concepts, tackle real-world use cases, and apply cutting-edge tools like SyftBox.

    What is Federated Learning, and why does it matter?
    Federated Learning is a paradigm shift in how we train machine learning models. Instead of centralizing data (and risking privacy breaches), FL allows models to be trained locally on user devices or organizational datasets. The magic happens when these local models share their updates (e.g., weights or gradients) to create a global model, without exposing raw data.

    But FL isn't without challenges:
    1️⃣ Communication bottlenecks: sharing updates across devices must be efficient.
    2️⃣ Statistical heterogeneity: data across nodes isn't always uniformly distributed.
    3️⃣ Privacy concerns: even sharing updates can leak sensitive information.

    The paper I read today highlights some powerful solutions to these problems:
    - Federated Averaging (FedAvg): combines updates from local models to refine a global model.
    - Differential Privacy: adds noise to updates to protect individual data.
    - Communication efficiency: compression techniques like sparsification reduce data transfer.
    - Homomorphic Encryption: enables encrypted computations, preserving privacy during model aggregation.
    Paper link: https://lnkd.in/gzr-JZ6B

    Enter SyftBox: A Tool for Privacy-Preserving Collaboration
    SyftBox by OpenMined is like a "Dropbox for data and computation." It simplifies the implementation of privacy-enhancing technologies (PETs) like FL. Key features:
    - Distributed network: nodes (called Datasites) store and process data locally.
    - Language agnostic: build APIs in any language and environment.
    - Secure collaboration: Datasites share model updates, not raw data, ensuring privacy.

    I set up SyftBox today and explored its potential for real-world use cases. One exciting application is fraud detection in financial transactions. Imagine banks collaboratively training a fraud detection model without ever sharing sensitive customer data; this is what Federated Learning and SyftBox enable.

    What I accomplished today:
    1️⃣ Studied a foundational paper on Federated Learning and its techniques like FedAvg and Differential Privacy.
    2️⃣ Set up SyftBox and ran a demo sharing CPU usage across a distributed network.
    3️⃣ Planned how to apply FL concepts to credit card fraud detection, using techniques like Homomorphic Encryption to protect sensitive financial data.

    👉 Check out my GitHub repo: https://lnkd.in/gDGcAm9Y
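    To make the FedAvg idea described above concrete, here is a minimal, self-contained sketch (an editorial illustration, not code from the post or the paper): a few simulated clients each run local gradient steps on their own data, and a server averages the returned weights, weighted by local dataset size, so only model parameters ever leave a client. The linear-regression model, learning rate, and synthetic client data are assumptions made purely for demonstration.

        import numpy as np

        def local_update(weights, X, y, lr=0.05, epochs=5):
            """One client's local training: a few gradient-descent steps on its own data only."""
            w = weights.copy()
            for _ in range(epochs):
                grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient computed on local data
                w -= lr * grad
            return w

        def fed_avg(client_weights, client_sizes):
            """Server step: average client models, weighted by how much data each client holds."""
            total = sum(client_sizes)
            return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

        # Simulated non-IID data on three clients (stand-ins for separate Datasites).
        rng = np.random.default_rng(0)
        true_w = np.array([1.0, -2.0, 0.5])
        clients = []
        for shift in (0.0, 1.0, 2.0):
            X = rng.normal(shift, 1.0, size=(50, 3))
            y = X @ true_w + rng.normal(0.0, 0.1, size=50)
            clients.append((X, y))

        global_w = np.zeros(3)
        for _ in range(10):  # communication rounds: local training, then server-side averaging
            updates = [local_update(global_w, X, y) for X, y in clients]
            global_w = fed_avg(updates, [len(y) for _, y in clients])

        print("Global model after 10 rounds:", global_w)  # moves close to [1.0, -2.0, 0.5]

    In a real federated setup the raw X and y would stay on each participant's device or Datasite; only the weight vectors returned by local_update would be communicated, and safeguards such as differential privacy or homomorphic encryption could be layered on top of that exchange.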

  • OpenMined reposted this

    View profile for Patrick Hough, PhD

    Postdoctoral Researcher, Post-Quantum Cryptography, Oxford.

    Very excited to be starting a 30-day course on Federated Learning (FL) with OpenMined. Having worked on the development of various privacy-enhancing technologies (PETs) through my PhD, I'm keen to understand how they're being used in this exciting area. LLMs are chewing their way through public data (the internet), but to harness the power of machine learning on private datasets we have to step carefully where privacy must be preserved. If we get this right, huge possibilities open up in areas such as medical research, personalised education, and more. One of OpenMined's primary goals is to make PETs standard practice when training on private data and to make these tools open and accessible to all. #30DaysOfFLCode #PETs #MPC #FHE #ZKP

  • 🚀 Introducing the #30DaysOfFLCode Challenge!

    Ready to dive into #FederatedLearning and other Privacy-Enhancing Technologies? This is your chance to learn, build, and share over 30 days of community-based coding fun. Whether you're a beginner or a seasoned pro, this challenge is designed to:
    ✅ Expand your skills
    ✅ Connect with like-minded innovators
    ✅ Showcase your work in a supportive community

    How it works:
    • Commit publicly: announce your participation to stay accountable.
    • Study daily: use the free resources and tutorials provided by OpenMined and other challenge participants.
    • Share progress: post daily updates using the hashtag #30DaysOfFLCode.

    Join the movement and start your journey here → 30DaysOfFLCode.com
    Let's code together and shape the future of AI and privacy-enhancing technologies. 💻✨

  • Excellent new research by the Department for Science, Innovation and Technology on the AI assurance market. We are proud to see that our work with the AI Safety Institute and the Christchurch Call to enable external scrutiny of AI systems is featured ↓

    As AI adoption grows, demand to ensure it is trustworthy is greater than ever. The Assuring a Responsible Future for AI report from the Responsible Technology Adoption Unit focuses on three new areas that will unlock £6.5 billion in the next decade:
    ➡️ Develop a platform to drive demand for tools that ensure the trustworthiness of AI
    ➡️ Improve AI assurance quality with industry and the AI Safety Institute
    ➡️ Develop a terminology tool for responsible AI

    Read more ▶️ https://lnkd.in/ej9si7wJ


Funding

1 total round. Last round: non-equity assistance (per Crunchbase).