🎤 Questions to the management - Part 2 🙂 ❓How does our solution benefit your customers? 💬 "Some scenarios can be captured more easily in the real world, while others can be modelled more efficiently. Then there's the human X-factor: we only know to a limited extent how people will behave. We therefore cannot assess, for example, how far a certain human behavior indicates drowsiness or another impairment. Only measurement gives us a reliable reference. So if we want to reconstruct edge cases to test whether the algorithm can cope with, say, sunglasses that occlude the eyes, that is a situation we can enrich synthetically. The rotation of the head, for instance, has already been computed, so the situation can be extended quite easily to include sunglasses. The #advantage for our customers is that a scene shot once can be reused many times over, as there are almost no limits to the expansion of synthetic components. In addition, we can adapt the camera hardware to various other specifications. In other words, our customer avoids having to reshoot the scenes when a new camera setup is required. This saves a lot of time as well as enormous costs, as they receive a huge amount of training data in a very short space of time." #computervision #3d #reconstruction #ar #vr #synthetic #groundtruth #realworld #trainingdata #engineer #gamechanger #precise #benefit
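To make the idea of "extend a shot scene synthetically and re-render it for a new camera" more concrete, here is a minimal sketch. It is not rabbitAI's actual pipeline; the asset offset, camera intrinsics, and scene layout are invented for illustration. The point is simply that once a head pose has been reconstructed from a real recording, a synthetic asset (here, sunglasses) can be attached to it and re-projected under different camera setups without reshooting anything.

```python
# Minimal sketch (not rabbitAI's actual pipeline): given a head pose already
# reconstructed from a real recording, place a synthetic sunglasses asset and
# re-project it with a different camera intrinsic matrix. All values are illustrative.
import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def project(points_h: np.ndarray, K: np.ndarray, world_to_cam: np.ndarray) -> np.ndarray:
    """Project homogeneous 3D points (N x 4) into pixel coordinates for a given camera."""
    cam = (world_to_cam @ points_h.T)[:3]          # 3 x N camera-space points
    px = K @ cam
    return (px[:2] / px[2]).T                      # N x 2 pixel coordinates

# Head rotation/translation recovered from the real shot (here: identity rotation).
head_pose = pose_matrix(np.eye(3), np.array([0.0, 0.0, 1.2]))
# Fixed offset from the head origin to where the glasses sit on the nose bridge.
glasses_offset = pose_matrix(np.eye(3), np.array([0.0, 0.05, 0.08]))
glasses_world = head_pose @ glasses_offset

# Two camera setups: the original intrinsics and a hypothetical new sensor.
K_original = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
K_new      = np.array([[600., 0., 480.], [0., 600., 270.], [0., 0., 1.]])

# A single corner of the glasses mesh in asset-local homogeneous coordinates.
corner_local = np.array([[0.07, 0.0, 0.0, 1.0]])
corner_world = (glasses_world @ corner_local.T).T

world_to_cam = np.eye(4)  # camera at the origin, looking down +Z for simplicity
print(project(corner_world, K_original, world_to_cam))
print(project(corner_world, K_new, world_to_cam))  # same scene, new camera, no reshoot
```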
-
OMG, this is SO COOL! Just came across this project from UIST 2024 called “StegoType: Surface Typing from Egocentric Cameras” and I had to share! The brilliant team — Mark Richardson, Fadi Botros, Yangyang Shi, Pinhao Guo, Bradford J Snow, Linguang Zhang, Jingming Dong, Keith Vertanen, Shugao Ma, and Robert Wang — figured out how to enable touch typing on flat surfaces using only headset cameras and hand-tracking. No physical keyboard, just pure innovation in AR/VR! With a deep learning model and a carefully designed data collection process, they’re solving challenges like self-occlusion and natural sloppy typing. 🚀 This could be a game-changer for productivity in AR/VR environments! 📄 Learn more: https://lnkd.in/e2PeidQR https://lnkd.in/eGJiHwyn 🎥 Original video: https://lnkd.in/e7q-wFk9 #UIST2024 #AR #VR #TechInnovation
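For a rough feel of the general approach (decoding keystrokes from a stream of tracked hand keypoints), here is a tiny illustrative model. It is not the StegoType architecture; the 42-keypoint input, the hidden size, and the key set are all assumptions made for the example.

```python
# Illustrative sketch only (not the StegoType model): decode key events from a
# stream of 3D hand keypoints with a small recurrent network.
import torch
import torch.nn as nn

class KeypointTyper(nn.Module):
    def __init__(self, num_keys: int = 30, keypoints: int = 42, dims: int = 3):
        super().__init__()
        self.encoder = nn.GRU(input_size=keypoints * dims, hidden_size=128,
                              num_layers=2, batch_first=True)
        self.head = nn.Linear(128, num_keys + 1)  # +1 for a "no key pressed" class

    def forward(self, keypoint_seq: torch.Tensor) -> torch.Tensor:
        # keypoint_seq: (batch, frames, 42 keypoints * 3 coords) from hand tracking
        features, _ = self.encoder(keypoint_seq)
        return self.head(features)  # per-frame key logits; decode with CTC or argmax

model = KeypointTyper()
dummy = torch.randn(1, 90, 42 * 3)   # ~1.5 s of hand tracking at 60 Hz
print(model(dummy).shape)            # torch.Size([1, 90, 31])
```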
-
To all #VR #Application developers and designers: Ever wanted to measure the effect of a specific design choice without running another lengthy user study? We have built #SIM2VR, a system that allows you to run biomechanical user simulations directly in a given Unity environment. SIM2VR enables the prediction of user movements, joint angles, muscle activation, fatigue, and differences in performance and effort, and the identification of potential user strategies – all in silico, using state-of-the-art #RL techniques and the advanced MuJoCo physics engine. Feel free to try out SIM2VR, which is now freely available on GitHub! We appreciate your feedback and can't wait to see the applications you come up with. The accompanying #UIST24 paper "SIM2VR: Towards automated biomechanical testing in VR" will be published soon (a pre-print version is available on arXiv). https://lnkd.in/df3S22Zn
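SIM2VR couples such models to Unity; the snippet below is only a minimal, standalone illustration of what stepping a MuJoCo model "in silico" looks like with the official Python bindings. The toy single-joint arm and the random control signal are assumptions for the example, not part of SIM2VR.

```python
# Minimal standalone sketch of in-silico simulation with the MuJoCo Python
# bindings (pip install mujoco). The toy model and random "policy" are illustrative.
import mujoco
import numpy as np

XML = """
<mujoco>
  <worldbody>
    <body name="upper_arm" pos="0 0 1">
      <joint name="shoulder" type="hinge" axis="0 1 0"/>
      <geom type="capsule" fromto="0 0 0 0.3 0 0" size="0.04"/>
    </body>
  </worldbody>
  <actuator>
    <motor joint="shoulder" gear="20"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

for step in range(500):
    data.ctrl[:] = np.random.uniform(-1, 1, size=model.nu)  # placeholder policy
    mujoco.mj_step(model, data)

# Joint angles and actuator effort are directly observable, no user study needed.
print("final shoulder angle (rad):", float(data.qpos[0]))
```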
-
Virtual Reality as a viewer for CFD? Watch me do it live at SPS Nuremberg NEXT WEEK! Grab your free ticket here 👉 https://lnkd.in/eFKJWatJ 👈 🖨️ When it comes to CFD, one result is a black number printed on a brown sheet of recycling paper. This answers the question: 🐟 "HOW MUCH IS THE DRAG?" 🤷‍♀️ For us engineers, the more important question is: WHY? 🤔 Why is there a vortex? Why is the flow separating? Which geometry is causing that? 🤓 To answer these questions, we nerds need to interpret data. Look at colorful images OR: 😁 Just dive into the colorful images. Study three-dimensional results IN three dimensions - not on a 2D screen! #meshedpotato #simcenter #cfd #simulation #virtualreality #vr #spsnuremberg
-
🎤 Questions to the management - Part 1 😊 ❓You call our training data "game changing". What exactly produces this effect? 💬 "When talking about the best training data for computer vision, there is one particular question that comes up again and again: Is real or synthetic data better? Our answer is: both are actually the same. On the one hand, you can put a lot of effort into a realistic-looking synthetic image. Sometimes, however, this effort is simply too much. On the other hand, when you take a photo in the real world with a camera, it's actually digital. It becomes a mathematical matrix. This also makes it 'synthetic', in a sense. So what is really real anyway? We want to show that this dichotomy does not exist, or does not have to exist. With our method, you can slide between the synthetic and the realistic parts of your recordings, like moving a slider, according to the requirements of your training data. And this can indeed be an incredible game changer when it comes to gathering training data." #computervision #3d #reconstruction #ar #vr #incabin #synthetic #groundtruth #realworld #trainingdata #engineer #gamechanger #precise
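To make the "slider" image concrete, here is a toy blend of a real frame with an aligned synthetic render, where a single weight moves the result between the two. This is only an illustration of the idea, not rabbitAI's method, and it assumes the two images are already registered and normalized.

```python
# Toy illustration of the "slider" idea (not the actual method): blend a real
# capture with an aligned synthetic render of the same scene. Inputs are assumed
# to be registered, same resolution, float values in [0, 1].
import numpy as np

def blend(real_img: np.ndarray, synthetic_img: np.ndarray, alpha: float) -> np.ndarray:
    """alpha = 0.0 keeps the real capture, alpha = 1.0 is fully synthetic."""
    assert real_img.shape == synthetic_img.shape
    return (1.0 - alpha) * real_img + alpha * synthetic_img

real = np.random.rand(480, 640, 3)        # stand-in for a registered real frame
synthetic = np.random.rand(480, 640, 3)   # stand-in for the rendered counterpart
training_frame = blend(real, synthetic, alpha=0.3)  # mostly real, some synthetic
```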
-
𝗘𝘅𝗽𝗹𝗼𝗿𝗶𝗻𝗴 𝗥𝗲𝗮𝗹-𝗧𝗶𝗺𝗲 𝗦𝗶𝗺𝘂𝗹𝗮𝘁𝗶𝗼𝗻 𝗼𝗳 𝗖𝗼𝗺𝗽𝗹𝗲𝘅 𝗢𝗽𝘁𝗶𝗰𝗮𝗹 𝗦𝘆𝘀𝘁𝗲𝗺𝘀: 𝗗𝗲𝘀𝗶𝗴𝗻 𝗮𝗻𝗱 𝗔𝗻𝗮𝗹𝘆𝘀𝗶𝘀 𝗣𝗲𝗿𝘀𝗽𝗲𝗰𝘁𝗶𝘃𝗲𝘀
Curious about designing and simulating complex optical systems like automotive lighting, night-vision, ergonomics, electronic devices (displays, VR, AR) and cameras in real-time? Ansys SPEOS offers extensive capabilities to explore these possibilities and beyond!
1. Model complex optical systems inside a realistic and time-dynamic three-dimensional simulation.
2. Analyze the interaction among precise models of lighting, sensors, and human vision.
3. Simulate the entire optical system in action, at any location and at any time, and gain a clear understanding of its behaviour and performance.
#OpticalSystems #RealTimeSimulation #AnsysSPEOS #AutomotiveLighting #NightVision #Ergonomics #ElectronicDevices #VR #AR #CameraDesign #TechInnovation #EngineeringExcellence #CADFEMIndia #Ansys
-
🚀 **Exciting Progress on VR Aerial Refueling Simulation!**
Over the past few weeks, I've been deep into the development of a VR aerial refueling simulation as part of my larger **Meta Fusion Simulation** framework, built using Unreal Engine. The goal: to replicate the complex and precise process of mid-air refueling with high fidelity.
🌐 **What We Tested**:
- **Real-time physics**: Spline-based hose dynamics, including realistic retraction and extension based on aircraft movement.
- **Collision**: Basket-probe interactions, focusing on impact handling and the detailed behavior of the basket when it touches the airframe.
- **Multiplayer and Latency**: Refining formation flying stability using the DIS framework in a Cesium-georeferenced world.
⚙️ **The Results**:
- **Improved Hose Retraction/Extension**: Dynamic spline points ensure the hose moves fluidly, retracting naturally when tension is applied.
- **Collision Effects**: Accurate basket-probe collisions now generate more realistic responses, enhancing immersion.
- **Formation Stability**: Tweaks to dead reckoning have reduced lag-induced jittering in tight formations, though challenges remain in very close proximity due to network latency.
This project has given me valuable insights into the future potential of immersive simulations and how to tackle real-world challenges in VR environments. Onward to more testing and optimizations! 🎮✈️
#VR #AerialRefueling #MetaFusionSimulation #UnrealEngine #SimulationDevelopment #DIS #Cesium
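The jitter reduction mentioned under "Formation Stability" relies on dead reckoning, which extrapolates remote entities between network updates. Below is a minimal first-order sketch of that extrapolation step; the state values are invented, and a real DIS implementation would also smooth the correction when the next entity-state update arrives.

```python
# Minimal dead-reckoning sketch, similar in spirit to DIS entity-state
# extrapolation between network updates (all values illustrative).
import numpy as np

def dead_reckon(last_pos, last_vel, last_accel, dt):
    """Extrapolate an aircraft's position dt seconds after its last state update."""
    return last_pos + last_vel * dt + 0.5 * last_accel * dt * dt

# Last received entity state: position (m), velocity (m/s), acceleration (m/s^2).
pos = np.array([1000.0, 200.0, 8000.0])
vel = np.array([150.0, 0.0, 0.0])
acc = np.array([0.0, 0.0, -0.5])

# Predict where to draw the tanker 120 ms after the last update arrived.
print(dead_reckon(pos, vel, acc, dt=0.120))
```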
-
Complex AR/VR Explained Simply: Depth Perception Algorithms—how do they create 3D objects that feel real without physical measurements? Let’s break it down. 🛠️ #ARVR #TechExplained #DepthPerception #VirtualReality #Innovation
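One concrete example of the depth-perception idea the post teases is stereo triangulation, which recovers metric depth from pixel disparity alone. A minimal sketch follows; the focal length, baseline, and disparity values are invented for illustration.

```python
# Hedged sketch of one classic depth-perception approach: stereo disparity.
# Depth follows from Z = f * B / d for a rectified stereo pair.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Triangulate depth Z = f * B / d from a rectified stereo camera pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_px * baseline_m / disparity_px

# A feature shifted 12.5 px between cameras with 700 px focal length and 6.4 cm baseline:
print(depth_from_disparity(disparity_px=12.5, focal_px=700.0, baseline_m=0.064))  # ~3.6 m
```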
-
📈 #GaussianSplatting is the new buzzword in #AEC, #XR, #Gaming, and #RealityCapture. So what is it and why should you care? Day 3 of Reality Capture Network's #RCON2024 had a couple amazing talks on the subject. Huge thanks to Erik Peterson, Tomas Barnas, and Michal Gula for sharing their knowledge! 💡 🙏 #Gaussian #Splats represent a scene as a collection of points in 3D space, each with a Gaussian function describing its spatial distribution. These Gaussians are rendered by "splatting" them onto a 2D view and blending them. Gaussian Splats are more efficient and can render scenes much faster than #NeRF, making them ideal for real-time applications, like #VR and #AR, where performance is a priority over absolute detail and accuracy. Essentially, if you are an XR developer, you can create detailed reality captures with a limited number of photos, thus making capturing easier. Furthermore, these captures will run in real-time XR apps. #UnrealEngine officially supports Gaussian Splats, and people have figured out how to use them in #Unity. ℹ️ For all the AEC folks out there, groundbreaking work is being done to extract (somewhat) precise measurements out of Gaussian Splats. Follow Erik Peterson and Manifold for more details. 👇 I'll share a few resources in the comments below that go into more detail on what Gaussian Splatting is, how it compares to techniques like NeRF, and how to use splats in Unity. 📢 Look forward to a demo of Gaussian Splats on the QuarkXR platform!
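For anyone who wants a feel for the "splat and blend" step described above, here is a deliberately simplified toy: isotropic Gaussians, a pinhole projection, and front-to-back alpha compositing for a single pixel. Real 3D Gaussian Splatting renderers project full anisotropic covariances, use spherical-harmonic colour, and run tile-based GPU rasterization; everything below is illustrative only.

```python
# Deliberately simplified splat-and-blend sketch: isotropic Gaussians,
# pinhole projection, front-to-back alpha compositing for one pixel.
import numpy as np

def render_pixel(pixel, gaussians, K):
    """Composite one pixel's colour from depth-sorted isotropic Gaussians."""
    color, transmittance = np.zeros(3), 1.0
    for center, radius, rgb, opacity in sorted(gaussians, key=lambda g: g[0][2]):
        # Project the 3D center to 2D with the pinhole intrinsics K ("splatting").
        p = K @ center
        uv = p[:2] / p[2]
        sigma = K[0, 0] * radius / center[2]          # screen-space footprint
        # 2D Gaussian falloff around the projected center.
        weight = opacity * np.exp(-0.5 * np.sum((pixel - uv) ** 2) / sigma**2)
        color += transmittance * weight * np.asarray(rgb)
        transmittance *= (1.0 - weight)
    return color

K = np.array([[500., 0., 64.], [0., 500., 64.], [0., 0., 1.]])
gaussians = [
    (np.array([0.00, 0.0, 2.0]), 0.05, (1.0, 0.2, 0.2), 0.8),  # red splat, nearer
    (np.array([0.02, 0.0, 3.0]), 0.08, (0.2, 0.2, 1.0), 0.9),  # blue splat, farther
]
print(render_pixel(np.array([64.0, 64.0]), gaussians, K))
```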
-
Holograms – those 3D projections from sci-fi movies – have always seemed a little futuristic. Well, the future is here! Scientists have developed a new type called "metaholograms" that can project multiple clear, crisp images at once. This is a huge leap forward from traditional holograms that can be blurry and limited to one image. Metaholograms open a treasure chest of possibilities: imagine VR experiences so real you feel like you're in another world, or AR displays that seamlessly blend digital information with the real world. They could even be used for ultra-secure data storage or next-level image encryption. Buckle up, because metaholograms are about to change the game! #inwider #dubai #metahologram #hologram #augmentedreality #technology #science #innovation #future