Waymo's New 6th-Generation System: A Leap Forward in Autonomous Driving Technology

Waymo, the autonomous vehicle subsidiary of Alphabet, has just launched its 6th-generation self-driving system, setting a new benchmark for the future of autonomous driving. This innovation enhances safety and scalability, addressing two of the most critical factors in realizing the dream of fully autonomous vehicles. What makes this announcement so exciting is not just the technology itself, but the real-world impact it could have on cities and transportation.

Merging Tech with Real-World Application

The 6th-gen system integrates 13 cameras, 4 LiDARs, 6 radars, and external audio receivers (EARs), creating a 360-degree view that detects objects up to 500 meters away, day or night and in all weather. These improvements significantly boost the car's ability to make split-second decisions in complex environments such as dense urban areas, and in severe weather like rain, hail, and fog.

More importantly, Waymo has reduced operational costs by optimizing the placement and number of sensors. The system can swap out sensor configurations depending on the specific climate, allowing for greater scalability across different regions of the world. These advancements make it easier for Waymo to expand beyond test cities like Phoenix and San Francisco to more challenging environments.
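
To make the idea of climate-dependent sensor configurations a little more concrete, here is a minimal Python sketch. Everything in it (the class, the preset names, the trimmed radar count for milder climates) is my own illustrative assumption built around the figures above; it is not Waymo's software or actual configuration logic.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorSuite:
    """Illustrative sensor counts for one vehicle configuration."""
    cameras: int
    lidars: int
    radars: int
    audio_receivers: bool  # external audio receivers (EARs)

# Hypothetical presets: a full suite for harsh climates, a trimmed one
# for milder regions where fewer radars might be deemed sufficient.
CONFIGS = {
    "harsh_weather": SensorSuite(cameras=13, lidars=4, radars=6, audio_receivers=True),
    "mild_weather":  SensorSuite(cameras=13, lidars=4, radars=4, audio_receivers=True),
}

def pick_config(climate: str) -> SensorSuite:
    """Select a sensor configuration for a deployment region (default: fullest suite)."""
    return CONFIGS.get(climate, CONFIGS["harsh_weather"])

print(pick_config("mild_weather"))
```

The point is simply that a swappable configuration can be data rather than code: supporting a new region means adding a preset, not redesigning the stack.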

Faster Testing, Quicker Deployment

By combining real-world driving data with extensive simulations, Waymo has accelerated the rollout of its driverless vehicles. The new platform already has thousands of miles of real-world driving experience, with millions more logged through simulations. These simulations allow Waymo’s AI to learn faster, cutting the time required to bring driverless cars to more cities.
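
As a rough illustration of how simulation multiplies real-world miles, here is a toy Python sketch that takes one logged scenario and replays it with perturbed parameters. The field names and ranges are invented for the example; Waymo has not published its simulation internals.

```python
import random

def perturb_scenario(logged: dict, n_variants: int = 100) -> list:
    """Generate simulated variants of one logged real-world scenario.

    Toy illustration only: each logged event is replayed many times with
    small changes (speeds, positions, visibility) so the driving software
    is tested against far more cases than were ever driven on the road.
    """
    variants = []
    for _ in range(n_variants):
        v = dict(logged)
        v["lead_vehicle_speed_mps"] = logged["lead_vehicle_speed_mps"] + random.uniform(-3, 3)
        v["pedestrian_offset_m"] = logged["pedestrian_offset_m"] + random.uniform(-1.5, 1.5)
        v["visibility_m"] = max(20, logged["visibility_m"] * random.uniform(0.3, 1.0))
        variants.append(v)
    return variants

logged_event = {"lead_vehicle_speed_mps": 12.0, "pedestrian_offset_m": 2.0, "visibility_m": 300.0}
print(len(perturb_scenario(logged_event)), "simulated variants from one real event")
```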


Waymo vs. Tesla: The Battle for Autonomous Dominance

Waymo’s latest announcement stands in direct contrast to Tesla’s "We, Robot" robotaxi event (October 10, 2024). While both companies are advancing toward full autonomy, their approaches highlight the stark differences in strategy and technology.

Waymo: Safety in Sensors and Maps

Waymo continues to rely on a multi-sensor approach, integrating LiDAR, radar, and cameras with pre-mapped environments. This combination provides a safety-first strategy, ensuring high accuracy in mapped cities where the roads are well-known.

The 6th-gen system even allows Waymo to operate in extreme weather, such as freezing rain and snow, through adaptable sensor cleaning systems that can handle harsh conditions.

Tesla: End-to-End Vision, Powered by Dojo

In contrast, Tesla depends entirely on a vision-based system, foregoing LiDAR and radar. Tesla’s Dojo supercomputer processes vast amounts of real-world video data from millions of Teslas on the road, training its neural networks to make driving decisions in real time.
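
For readers new to the term, "end-to-end vision" simply means the network maps raw camera pixels directly to driving outputs, with no separate hand-built perception or mapping stage. The toy PyTorch sketch below shows the shape of that idea and nothing more; the layer sizes and outputs are invented and do not reflect Tesla's actual models.

```python
import torch
import torch.nn as nn

class ToyDrivingPolicy(nn.Module):
    """Toy end-to-end policy: camera frames in, control commands out."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 2)  # [steering_angle, acceleration]

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(frames))

policy = ToyDrivingPolicy()
frames = torch.rand(4, 3, 96, 96)   # a batch of 4 fake camera frames
print(policy(frames).shape)         # torch.Size([4, 2])
```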

This approach allows Tesla to be more flexible, as it doesn’t rely on pre-mapped environments, but it has faced regulatory scrutiny due to safety concerns in certain driving conditions.

Comparing Strategies: Who Will Lead the Autonomous Race?

Safety and Reliability

Waymo’s multi-sensor setup provides redundancies that make it safer in complex environments, while Tesla’s vision-only system is more agile but faces safety challenges in unstructured or poor-visibility scenarios.
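
A toy example helps show what that redundancy buys. The voting rule below is entirely hypothetical, not either company's logic, but it captures the idea that independent sensors can confirm each other when one modality is degraded.

```python
def object_confirmed(camera_hit: bool, lidar_hit: bool, radar_hit: bool,
                     visibility_m: float) -> bool:
    """Hypothetical redundancy rule: confirm an obstacle if at least two
    independent sensors agree, or if a range sensor (lidar/radar) sees it
    while camera visibility is degraded (fog, heavy rain, glare)."""
    votes = sum([camera_hit, lidar_hit, radar_hit])
    if votes >= 2:
        return True
    if visibility_m < 50 and (lidar_hit or radar_hit):
        return True
    return False

# A vision-only stack has a single vote, so the same rule collapses to
# "trust the camera" -- exactly the case where poor visibility hurts.
print(object_confirmed(camera_hit=False, lidar_hit=True, radar_hit=True, visibility_m=40))   # True
print(object_confirmed(camera_hit=False, lidar_hit=False, radar_hit=False, visibility_m=40)) # False
```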

Scalability and Cost

Both companies are striving to make their systems more cost-efficient. Waymo’s reduction in sensors and adaptable configurations could make it more cost-effective in diverse climates. Meanwhile, Tesla’s strength lies in the vast data it collects from its customer base, allowing the company to scale more rapidly.

Technology and Adaptability

Waymo’s reliance on LiDAR, radar, and cameras gives it a distinct advantage in reliability, but it’s confined to specific, pre-mapped areas. Tesla’s camera-only system offers broader adaptability, but at the cost of safety and accuracy under certain conditions, and the company has a history of safety complaints.

The Divas Talk Safety

I belong to a worldwide Tesla owners group. It is a little different because it is made up entirely of women and has the fun and quirky name "The Tesla Divas." We share our stories of all the good and the bad with our cars, and all of us received free FSD yesterday.

One Diva reported that her car did not even slow down last night when a deer ran in front of her Tesla.


With her permission, I am sharing the images from her dashcam. The most powerful image is from the right side camera, where the deer came very close to colliding with the car. The vehicle should have detected the near collision and slowed down or swerved to avoid it. It did neither, and she said it was a very near miss. What if it had been a cyclist or a running child instead of a running deer? These are serious issues with the software.


Another member of my Tesla Divas group reported that her car hit a curb today while in FSD. A far less serious incident, but the car should also be able to avoid curbs. Why didn't it see the curb? It was likely out of sight of the cameras.

A more serious incident happened yesterday when FSD did not notice this mother and child in the street. They were visible from several hundred yards down the road. The car did not slow down at all, a courtesy any driver would extend when a mother and child are in the road. I disengaged the moment the car did not immediately reduce speed.



My car was built on October 18, 2022, at the Tesla factory in Fremont. We live close by, so we received it just three days later. It was so new that we were among the first to receive a car with "Tesla Vision." I did a whole video on that topic.

Was I excited to receive a car with cameras instead of LiDAR? Mmmmm.... I would much rather have received BOTH cameras and sensors. I have a stuffed stormtrooper hung up in my garage (the equivalent of a tennis ball) to help me judge distance. The cameras are not exact enough to provide useful information for getting the car into the tight space of our small garage. I did a whole video on that topic, too.

I'm now documenting my FSD adventures again. Tesla has big plans for autonomous vehicles, and I applaud the vision. It is possible that the Tesla Dojo system can get to Level 5 one day with millions of miles of real-world driving data. But that day is not today, and these are serious safety issues. I will continue to be both a Tesla fan and a Tesla skeptic, because it is dishonest not to report both sides of the issue.

Final Thoughts: The Road Ahead

Waymo’s new system represents a significant milestone in the march toward fully autonomous driving. With weather-proof adaptability, reduced costs, and faster deployment, Waymo is well-positioned to bring driverless technology to more cities. Tesla, however, continues to evolve its vision-based AI, relying on rapid data collection and iteration to improve its self-driving capabilities.

In this battle, Waymo's safety-first approach could set the industry standard, while Tesla's scalability may dominate in the long run. Either way, the future of autonomy is unfolding rapidly, and we are on the cusp of a transformation in how we move through our cities.


I'm a retired educator and freelance writer who loves researching AI and sharing what I've learned.

Stay Curious. #DeepLearningDaily


Additional Resources for Inquisitive Minds:

SAE Blog. SAE Levels of Driving Automation™ Refined for Clarity and International Audience (May 2021).


Vocabulary Key

  • Vision-Based Driver Monitoring: An AI system that uses cameras to monitor the driver’s attentiveness, ensuring they remain engaged even when using autonomous features.
  • SAE Automation Scale: A standardized scale developed by the Society of Automotive Engineers (SAE) to classify levels of vehicle automation, ranging from Level 0 (no automation) to Level 5 (full automation).
  • LiDAR: A laser-based sensor that measures distances and creates detailed 3D maps of surroundings.
  • Dojo Supercomputer: Tesla’s custom-built AI computer that processes large amounts of video data to improve its neural networks for autonomous driving.
  • Neural Networks: AI systems that process data and learn from it, similar to how the human brain functions.
  • Pre-mapped Environments: Highly detailed maps created in advance that help autonomous cars navigate more safely in specific areas.


FAQ:

  • What makes Waymo’s 6th-gen system new? It features a sensor suite with cameras, LiDAR, and radars, offering 360-degree coverage and enhanced adaptability for extreme weather.
  • How does Waymo’s approach differ from Tesla’s? Waymo relies on a multi-sensor approach and pre-mapped areas, while Tesla uses a vision-only system that operates in dynamic environments.
  • Why is scalability important for Waymo? Waymo’s modular design and cost reduction allow it to expand into more cities, offering fully autonomous taxis at a broader scale.
  • What are Tesla’s strengths? Tesla’s vision-based AI and reliance on Dojo allow for faster updates and scalability through real-world data collection.
  • What’s the future of these companies? Both companies are shaping the future of urban mobility, but Waymo’s focus on safety and Tesla’s data-driven scalability could lead to very different outcomes.


Appendix:

I had to share this artwork fail from DALL-E. It helps demonstrate a key difference between Waymo and Tesla. The artwork prompt I requested was: "Now, let's do some really intriguing artwork of a Waymo racing a Tesla down a city street in San Francisco. Who will win? Who will get lost?" (In context, this was just after I finished formatting the article in my "Deep Learning Daily" GPT. I usually then use the GPT to generate the artwork since it already "knows" all of the nuances of what we are working on. This saves a great deal of time in writing a detailed prompt.)

However, there is one thing very wrong with this image. Can you spot it? (You'd have to really know autonomous vehicles to be able to pick it up.)

I asked DALL-E to generate this image again. And again. And every time, it put LiDAR on the Tesla. I then tried generating the images in Ideogram. And every time, Ideogram put LiDAR on the Teslas. It's a hint, maybe?



Prompt: It is a race between a Waymo and a Tesla down the streets of San Francisco to see who is the best autonomous vehicle. Cyclists and pedestrians are in the way as these cars dodge and weave for autonomous vehicle supremacy. #Ideogram1.0



Prompt: It is a race between a Waymo and a Tesla down the streets of San Francisco. The cars effortlessly make their way past challenges such as cyclists, pedestrians, and wandering dogs.


Out of these four images, only one car (bottom left, the grey car on the right) does not have LiDAR. I believe this is because the AI equates autonomous with LiDAR.

(I love the two-headed, two-tailed, five-legged dog. AI is so delightfully weird.)

But, as a Tesla owner who sees the limitations of the cars (fog and rain), I understand what the AI is trying to tell us. LiDAR would be a big help in getting these vehicles to L5 autonomy. Yes, it complicates the software, but redundancy means safety. Waymo has the right idea. And I like their idea of "geofencing."

In Waymo's case, their vehicles are often geofenced to operate only in cities or regions where the roads have been extensively pre-mapped. This means that Waymo cars are limited to specific geofenced zones, ensuring that they can navigate safely within environments that are fully understood by their systems. (A similar approach to the way Tesla demonstrated its autonomous vehicles on a closed course in a controlled, mapped environment for the "We, Robot" event.)
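
For the curious, a geofence check is conceptually simple: the service area is a polygon, and a trip is allowed only if the requested point falls inside it. The sketch below uses a standard ray-casting point-in-polygon test with a made-up rectangle around part of San Francisco; the coordinates are illustrative, not Waymo's actual service boundary.

```python
def inside_geofence(lat: float, lon: float, fence: list) -> bool:
    """Ray-casting point-in-polygon test: is (lat, lon) inside the fenced zone?

    `fence` is an ordered list of (lat, lon) vertices describing the service area.
    """
    inside = False
    n = len(fence)
    for i in range(n):
        lat1, lon1 = fence[i]
        lat2, lon2 = fence[(i + 1) % n]
        # Count how many polygon edges a ray from the point crosses.
        if (lon1 > lon) != (lon2 > lon):
            crossing_lat = (lat2 - lat1) * (lon - lon1) / (lon2 - lon1) + lat1
            if lat < crossing_lat:
                inside = not inside
    return inside

# Hypothetical rectangular service area around part of San Francisco.
sf_zone = [(37.70, -122.52), (37.70, -122.38), (37.82, -122.38), (37.82, -122.52)]
print(inside_geofence(37.77, -122.42, sf_zone))  # True: inside the mapped zone
print(inside_geofence(37.33, -121.89, sf_zone))  # False: outside, ride not offered
```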


#Waymo #AutonomousDriving #Tesla #AI #FutureOfMobility #DeepLearning
