Yaak

Software Development

We build multimodal data visualization, search, and open-source foundation models for pioneers of spatial intelligence.

About

Yaak is developing a unified workflow to search & uncover trends within petabytes of multimodal data and build spatial intelligence.

Website
http://www.yaak.ai
Industry
Software Development
Company size
11–50 employees
Headquarters
Berlin
Type
Privately held
Founded
2020
Specialties
Machine Learning, AI, Spatial Intelligence, Robotics, Embodied AI, Data Visualization, Search, and Open Source

Locations

Employees at Yaak

Updates

  • Yaak reposted this

    byFounders · 15,143 followers

    A Path Toward Cognitive Robotics 🤖

    In our latest deep dive, Daniel explores how AI is about to make one of its most significant leaps yet—from the digital realm into the physical world through cognitive robots. While we’ve seen GenAI revolutionize digital work, the physical world remains largely untapped. Daniel breaks down why we’re at an inflection point, powered by:

    - Game-changing foundation models like π0 and AutoRT
    - Breakthroughs in imitation learning, a method for teaching robots through human demonstrations
    - Unprecedented capital flowing into robotics ($100B+ in the last decade)

    The applications are vast—from reimagining retail operations to revolutionizing commercial kitchens. But unlike the winner-takes-all dynamics of consumer tech, we expect this market to flourish with specialized players solving specific challenges across industries.

    A must-read for anyone interested in the future of AI, robotics, and their transformative impact on our physical world.

    Are you building the future of robotics? Connect with daniel@byfounders.vc 🤝

    🔗 Read the blogpost here: https://lnkd.in/gnf3nGut

  • Yaak · 1,516 followers

    As the robotics ecosystem matures from prototypes to products, the range of data formats expands accordingly: mcap.dev, Rerun, hdf5.org, good ol' JPEGs and NPYs — you name it. Wider adoption of e2e AI in robotics requires, at a minimum, being able to train models on those formats. To that end we've open-sourced rbyte [https://lnkd.in/d528zryX], our PyTorch-compatible multimodal dataset library. Key features include:

    - Support for a variety of robotic data formats
    - Modality alignment
    - SQL-like filtering
    - TensorDict samples
    - Declarative dataset configs

    A minimal sketch of the general pattern follows below.

    👀 Spoiler — rbyte dataset configs will eventually be curated through natural language search in Nutron.

    • Multimodal dataset library in PyTorch
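
    The sketch: a PyTorch Dataset that yields TensorDict samples from two modalities assumed to be pre-aligned. This is not rbyte's actual API; the class and field names here are hypothetical.

    # Hypothetical illustration only, not rbyte's API: a Dataset that pairs two
    # pre-aligned modalities and returns TensorDict samples.
    import torch
    from tensordict import TensorDict
    from torch.utils.data import DataLoader, Dataset

    class MultimodalDriveDataset(Dataset):
        """Pairs camera frames with vehicle state (steering, brake)."""

        def __init__(self, frames: torch.Tensor, states: torch.Tensor):
            assert len(frames) == len(states)  # assumed aligned to a common timeline
            self.frames, self.states = frames, states

        def __len__(self) -> int:
            return len(self.frames)

        def __getitem__(self, idx: int) -> TensorDict:
            # One sample = one dict of aligned modalities, packed as a TensorDict.
            return TensorDict(
                {"camera": self.frames[idx], "state": self.states[idx]},
                batch_size=[],
            )

    # TensorDicts stack like tensors, so torch.stack works as the collate_fn.
    ds = MultimodalDriveDataset(torch.rand(8, 3, 64, 64), torch.rand(8, 2))
    loader = DataLoader(ds, batch_size=4, collate_fn=torch.stack)
    batch = next(iter(loader))    # TensorDict with batch_size=[4]
    print(batch["camera"].shape)  # torch.Size([4, 3, 64, 64])

    Because every sample is a TensorDict, the same training loop can consume data regardless of which on-disk format it was decoded from.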
  • Yaak · 1,516 followers

    A self-supervised foundation model trained on sensor data (states) and optimal policies (actions) can often uncover bias within the datasets. We computed embedding similarity for states (spatial and temporal position encodings) and actions (steering and braking) learned by our robotics foundation model. From top left: steering angle, brake pedal, image patch position, global time-step.

    👀 Large braking values are clearly underrepresented in our datasets. A generic sketch of this kind of check follows below.

    #AI #robotics #LMM #SpatialIntelligence
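
    As a generic illustration of how embedding similarity can surface such bias, here is a self-contained sketch on synthetic data; the embeddings, the brake distribution and the hard-braking threshold are all made up and are not Yaak's model or data.

    # Generic, self-contained sketch with synthetic data: how rarely-seen
    # hard-braking samples show up when comparing action embeddings.
    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    n, d = 10_000, 128
    brake = torch.rand(n) ** 4   # synthetic brake values, skewed toward light braking
    # Stand-in action embeddings, shifted by the brake value so the effect is visible.
    emb = F.normalize(torch.randn(n, d) + brake[:, None], dim=1)

    hard = brake > 0.8           # hypothetical "hard braking" bin
    print(f"hard-braking share: {hard.float().mean().item():.2%}")  # tiny fraction

    # Cosine similarity of every sample to the centroid of hard-braking samples:
    # hard-braking samples cluster tightly while most of the data sits farther away.
    centroid = F.normalize(emb[hard].mean(0, keepdim=True), dim=1)
    sim = (emb @ centroid.T).squeeze(1)
    print(f"mean similarity: hard={sim[hard].mean().item():.3f}, "
          f"rest={sim[~hard].mean().item():.3f}")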

  • Yaak · 1,516 followers

    We've identified four key components essential for accelerating the adoption of end-to-end (e2e) AI within the robotics ecosystem:

    1. Log replay: sensor log playback, enrichment with additional modalities (e.g. voice/commentary) and tasks
    2. Discovery: auto triage, scenario search and dataset curation
    3. Multimodal data: proprietary/open robotics data formats, alignment of modalities and PyTorch data loaders
    4. Spatial intelligence: hackable sources for training spatial intelligence models

    While building e2e AI for the automotive domain, we found the landscape of tools for 1–4 fragmented or non-existent. In part 2 of our blog series we do a deep dive on each of these topics and how Yaak is addressing them. Blog: https://lnkd.in/dd3N_y6a

    👀 A (nearly) free byproduct of e2e AI trained through self-supervision is powerful embeddings, which capture rich scene/policy semantics without additional annotations (bounding boxes or segmentation masks). These embeddings enable auto discovery of similar scenarios and triage of sub-optimal policies through vector similarity search; a minimal retrieval sketch follows below.

    👇 Vector similarity search visualization with Rerun (looping enabled)

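    A minimal sketch of that vector similarity search in plain PyTorch; the embedding bank and the query are synthetic stand-ins, not Yaak's pipeline.

    # Generic vector similarity search: given per-scenario embeddings from a
    # self-supervised model, retrieve the k most similar scenarios to a query.
    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    num_scenarios, dim = 50_000, 256
    bank = F.normalize(torch.randn(num_scenarios, dim), dim=1)  # stand-in embedding bank

    def top_k_similar(query: torch.Tensor, k: int = 5):
        """Return (cosine similarities, indices) of the k nearest scenarios."""
        query = F.normalize(query, dim=-1)
        scores = bank @ query   # rows are unit norm, so this is cosine similarity
        return torch.topk(scores, k)

    # Triage example: query with the embedding of a known sub-optimal policy segment.
    values, indices = top_k_similar(torch.randn(dim))
    print(indices.tolist(), [round(v, 3) for v in values.tolist()])

    At larger scale the brute-force dot product would typically be swapped for an approximate nearest-neighbour index, but the retrieval interface stays the same.
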
  • Yaak reposted this

    #SpeakerAnnouncement We’re delighted to welcome Søren Halskov Nissen, CEO of Yaak, to the DriveAI Summit! Søren will be speaking on “New Challenges When Developing End-to-End Models,” sharing his perspective on tackling the complexities of autonomous systems from start to finish. 🚗🤖

    With an extensive background in autonomous vehicle and sensor development, Søren is at the forefront of exploring the convergence of robotics, vehicle autonomy, and spatial intelligence. At Yaak, he’s pushing the boundaries of what’s possible in this field, bringing fresh insights into how autonomous systems are evolving and adapting in a rapidly changing tech landscape.

    Come to the DriveAI Summit to hear Søren’s insights on building robust end-to-end models for autonomous systems and the technical hurdles that come with them! 🚘💡

    #AutonomousVehicles #DriveAISummit #InnovationInAI


Similar pages