How Tesla’s Promises Compare To Reality
Howdy👋🏾. Last Thursday was Tesla's highly anticipated “We, Robot” event. Elon Musk envisioned fully autonomous cars, robot taxis, and household robots straight out of The Jetsons—everything from watching our kids to cleaning our homes.
I came away from the event wanting more clarity, with Musk repeating the same promises and pushing timelines further into the future while competitors make headway and begin delivering on the promises Tesla still hasn't met. So, for this week's newsletter, I thought I'd dig into why it's getting harder to take Tesla at its word.
Let's start with Full Self-Driving (FSD), a feature Tesla has long promised for every vehicle. To understand FSD, it helps to know the levels of driving automation defined by the Society of Automotive Engineers (SAE), which run from Level 0 (no automation) through Level 2 (partial automation that the driver must supervise) and Level 3 (conditional automation) up to Levels 4 and 5, where the car does the driving itself.
By that definition, every Tesla with FSD is a Level 2 vehicle: the human driver remains responsible for vehicle control and must be ready to intervene at any time. The system is designed to augment the driver, not replace them.
By contrast, Waymo, Alphabet's self-driving taxi service (available in Austin, Phoenix, and San Francisco, with Atlanta coming soon), operates at Level 4: the car is expected to handle every driving task without human assistance, although a remote operator can take control if necessary.
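To make the Level 2 versus Level 4 gap concrete, here's a minimal Python sketch of the SAE scale. It's my own illustration (the enum names and the who_is_driving helper are mine, not anything from SAE's spec or Tesla's software):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, simplified for illustration."""
    NO_AUTOMATION = 0           # human does all the driving
    DRIVER_ASSISTANCE = 1       # steering OR speed support
    PARTIAL_AUTOMATION = 2      # steering AND speed support; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives, human must take over on request
    HIGH_AUTOMATION = 4         # system drives itself within a defined area
    FULL_AUTOMATION = 5         # system drives itself everywhere

def who_is_driving(level: SAELevel) -> str:
    """Rough rule of thumb for who is responsible at each level."""
    if level <= SAELevel.PARTIAL_AUTOMATION:
        return "the human driver (the system only assists)"
    if level == SAELevel.CONDITIONAL_AUTOMATION:
        return "the system, but the human must be ready to take over"
    return "the system (no human attention needed in its operating domain)"

print(who_is_driving(SAELevel.PARTIAL_AUTOMATION))  # where Tesla FSD sits today
print(who_is_driving(SAELevel.HIGH_AUTOMATION))     # where Waymo's robotaxis sit
```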
And before you take Tesla's latest timeline as fact, let's not forget that the company has been promising Level 4 or 5 FSD since as far back as 2016.
These days, the competition has gotten much tougher. Waymo is the obvious one, but G.M. has introduced Cruise, Ford has BlueCruise, and Mercedes has released a limited Level 3 system capable of driving itself on some U.S. highways, with a Level 4 vehicle in testing in China. In China, meanwhile, Tesla faces stiff competition from several domestic manufacturers already shipping self-driving vehicles while it is still waiting for FSD approval.
If Tesla's steering-wheel-free car feels groundbreaking, remember that Waymo showed us its concept more than two years ago.
Fast forward to last week's event, and once again, Tesla's FSD promises feel more like wishful thinking. That said, Tesla has some believers, like Waymo co-founder Anthony Levandowski, who was quick to remind detractors how valuable the driving data from a fleet that has been on the road for years really is. Who knows, Tesla might still surprise us, especially after SpaceX stuck the landing in a remarkable show of its tech chops.
Now my thoughts on tech & things:
🤖 Tesla's Optimus Robots Are Fake
Elon Musk's Optimus robots, which he claimed could be a top-selling product, are far from autonomous. They were under human control, raising doubts about Tesla's advancements in robotics.
🔐 Backdoors Don't Work
U.S. communications have been compromised by backdoors mandated for government access. These vulnerabilities were exploited by China, showing that backdoors pose risks to everyone, not just intended users.
🚗 Who's Really Driving?
Many people think they're riding in robotaxis when, in reality, humans are still behind the wheel. Uber and Lyft drivers using Tesla's Autopilot are sparking serious safety concerns in this regulatory gray area.
🎥 Meta's AI Video Models Keep Advancing
Meta's AI video and audio models just got major updates, but it's unclear when these tools will be publicly available. As these models improve, we may soon see AI-generated video content go mainstream.
As a Tesla Model 3 owner, I appreciate its FSD features, but as someone who's also ridden in the back of a Waymo, I'm still not convinced that Tesla will deliver on its lofty promises anytime soon. But I'd love for them to prove me wrong.
One of the biggest unanswered questions about FSD is liability—who's at fault when something goes wrong? From my experience with Tesla's Autopilot and FSD, it seems like the system is designed to disengage if a collision is imminent, shifting responsibility to the driver.
However, in Level 3 and beyond, the manufacturer should assume responsibility unless the driver is actively in control during the incident. So, who's liable in Tesla's vision of a world where cars might operate like franchises, and owners send them out to work as robotaxis?
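To spell out that reasoning, here's a tiny sketch of how I'd expect responsibility to fall by level. It's purely illustrative of my argument, not legal guidance or Tesla's actual policy:

```python
def presumed_liable_party(sae_level: int, driver_was_in_control: bool) -> str:
    """Illustrates the rule of thumb above: at Level 2 the driver supervises,
    so responsibility stays with them; from Level 3 up, the manufacturer
    should own the outcome unless the human had actively taken over."""
    if sae_level <= 2:
        return "driver"              # the system only assists; the driver must supervise
    if driver_was_in_control:
        return "driver"              # the human had taken over at the time of the incident
    return "manufacturer/operator"   # the automated system was doing the driving

print(presumed_liable_party(2, driver_was_in_control=False))  # Tesla FSD today -> driver
print(presumed_liable_party(4, driver_was_in_control=False))  # robotaxi fleet -> manufacturer/operator
```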
A few weeks ago, Innovate DMV, U.S. Ignite, and the Black Meta Agency invited me to speak to their first cohort for "A.I. for DMV Startups." In the coming weeks, I'll be speaking at D.C. Startup & Tech Week, rebooting WTCI's AGILE series in November, and wrapping up the year at NYC's A.I. Summit. I'd love to see you at any of these events! Visit my site for more information on my upcoming speaking engagements.
📕 Remember to pre-order my book, The A.I. Evolution.
Thanks for reading, and please forward or share this newsletter with anyone you think would enjoy it!
-jason
P.S. While we're on the subject of self-driving cars, remember that some real-world scenarios still need to be worked out. Case in point: a Waymo robotaxi recently got stuck in Vice President Kamala Harris's motorcade route. I bet the engineers didn't program for that one!
It was perplexing when Musk first unveiled a fake robot in 2022, and even more so that he just did it again. It's not clear whether Tesla understood how obvious this was. Were they hoping to fool everyone? (Some of the giddy coverage suggests a few people were.) I'm not sure what all this says about the tech, but it's definitely suspect marketing.