We Have To Talk About Tesla: A Look At Recent Legal and Regulatory News Surrounding The World's Most Valuable Car Maker

Dear Reader,

February has been a somewhat ambiguous month for me. There's lots of tech news within automotive that I want to celebrate, but it is contrasted by events that make me less enthusiastic.

This month, my thoughts are increasingly occupied by the plentiful news surrounding Tesla: the deposition of their Director of Autopilot Software, the NHTSA FSD recall and the implications for automotive as a whole. I figure there's enough going on to warrant our full attention this month - so let's go:


The Elluswamy Deposition

Ashok Elluswamy is Director of Autopilot Software at Tesla. He features in a lawsuit filed by the family of a Tesla driver killed in an Autopilot-related accident in 2018. Elluswamy was deposed by the family's lawyers in June 2022, and the transcript of that deposition recently became public. Let me walk you through five parts that particularly caught my eye ...

It appears that Elluswamy regards human-supervised automation as safe by default. He claims to be unfamiliar with the concept of perception-reaction time, and unable to recall having ever discussed it within Tesla. Same for 'functional specifications' and 'behavioral competencies' in relation to Autopilot:

Screenshot from the Elluswamy deposition transcript
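For readers who don't work in road safety: perception-reaction time is a standard quantity in crash analysis, and a back-of-the-envelope calculation (my own sketch, not anything from the deposition) shows why it matters for supervised automation:

```python
# Perception-reaction time (PRT): the interval between a driver perceiving
# a hazard and beginning to act on it. During that interval the vehicle
# keeps moving at its current speed -- which is why the concept matters
# for any system that relies on a human supervisor taking over in time.

def perception_reaction_distance(speed_kmh: float, prt_s: float = 1.5) -> float:
    """Distance (m) travelled before the driver even starts to brake.

    1.5 s is a commonly cited value for alert drivers; road-design
    guidelines often assume 2.5 s for the general driving population.
    """
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * prt_s

# At highway speed, the car covers roughly 42 m before braking even begins:
print(round(perception_reaction_distance(100.0), 1))  # ~41.7 m
```

A supervising driver who is surprised by system behaviour therefore travels a considerable distance before any correction takes effect, which is exactly why the concept comes up in depositions like this one.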

Elluswamy claims not to know the term ODD (Operational Design Domain), and denies familiarity with the concept it describes. He also says he's unaware of ever having seen a document describing the ODD for Autopilot:

Screenshot from the Elluswamy deposition transcript

The answer to the last question there (whether the Autopilot SW team ever made a decision about where/when Autosteer is supposed to operate) is also "I do not know."

Regarding Tesla's DMS (driver monitoring system), Elluswamy says he is unaware of any internal discussions about its flaws. This concerns the torque-based 'steering wheel nag', which Elon Musk later tweeted he wants to disable for drivers with more than 10k miles of FSD-active driving:

Screenshot from the Elluswamy deposition transcript

Regarding safety, Elluswamy shows two interesting notions. He is not aware of a dedicated safety team existing at Tesla - and he seems to equate system safety with the performance of sub-systems:

Screenshot from the Elluswamy deposition transcript

This notion of safety as a direct result of performance (which safety professionals will disagree with) comes through even more clearly when Elluswamy speaks about how Tesla decides whether to ship a new Autopilot software release:

Screenshot from the Elluswamy deposition transcript

It's important to remember that these are not the statements of an individual engineer who might excusably lack an overall perspective. Elluswamy is the Director "leading the autonomy software team for the Tesla Autopilot" - what he says he is or isn't aware of matters: it speaks to an entire development culture and to the role industry and safety best practices play within it.

So is any of that likely to change, now that NHTSA has issued a recall for FSD? Well ...


The NHTSA Recall

On February 16th, news broke that Tesla would recall 362,758 cars outfitted with FSD because the system may cause crashes. This is technically a voluntary recall, the details of which have been jointly agreed on by Tesla and NHTSA.

Some people were surprised to learn that the measure qualifies as a recall, given its proposed remedy - an OTA software update. That in itself is neither unusual nor does it make this any less of a recall; this is how software-defined functions are regularly updated and upgraded these days, after all. However, we need to look at the implications in the context of Tesla's development/safety culture:


  1. FSD is deemed unsafe and a recall is announced.
  2. The recall is to be performed within 60 days - during which FSD does not have to be disabled in the 362,758 affected cars.
  3. The remedy is an OTA update, the adequacy of which Tesla can self-assess: There is no requirement for external pre-approval when introducing or updating an AD system for Tesla or any other OEM in the US.
  4. Bonus: There is no mention of computer system safety issues in the underlying FMVSS (Federal Motor Vehicle Safety Standards).


Now recall how Elluswamy describes Tesla's approach to Autopilot software updates, equating performance and safety:

"We evaluate the net -- the total system performance and then compare that against the previous release. And if the total system performance is better than the previous release, the[n] we are obligated to release it because it's net safer."

Releasing software updates/upgrades for Autopilot and FSD is a routine exercise for Tesla. With no outside audit required, no FMVSS requirements for computer system safety to adhere to, and the recall not extending to Tesla's development/validation processes, its consequences seemingly boil down to 'keep doing what you're doing'.

I've been chatting with people who have far more experience and insight into NHTSA's workings than I ever will; none of them has a clear opinion yet on what happened here. It's worth noting that while NHTSA employs capable career professionals, the head of the organization is a political appointee - and currently, acting head Ann Carlson is hoping to win President Biden's nomination for the permanent position, as well as the required US Senate approval.

What this may or may not mean for how NHTSA directs its handling of Tesla is food for speculation, but public pressure has been building for the agency to act. One thing I'm concerned about is what it may mean for the industry as a whole:


The Implications

Tesla is an outlier in automotive. Elon Musk's tolerance for risk and the OEM's resulting push for the limits of what's technically and legally possible are unique. Traditional OEMs have taken a much more conservative approach to automated driving, supervised or not: 'Safety first' is a credo followed in earnest across most companies in this industry.

However: Virtually all mass-market OEMs are publicly traded companies and owe profits to their shareholders. This means that a commitment to safety must be weighed against the potential profitability - and risks - of ... a little less safety. If Tesla can keep pushing these limits and face the kind of non-consequences we're seeing this month, the risk calculus has to change for its competitors as well.

It's much too early to tell if this will lead to any re-assessments among other automotive players. But safety professionals, already often accused of being innovation blockers, might have a harder time making their case in the future - and EU legislators could face more pushback from a powerful automotive lobby about what does and does not constitute 'unreasonable risk'.

Bottom line: I don't know what will come from this but I think it's a conversation we all need to have. Tesla is the world's most valuable automotive brand, and their intersections with legislation and regulation matter - as do our reactions to them.

As for the ambiguity: Nothing in this world is defined by one thing alone. Personally, I remain in awe of the high-performance data engine Tesla has built as well as of their impact on industry-wide electrification - while at the same time I find myself sceptical of their approach to safety ...

---

That's all I have for you today; thank you for reading. And as always, please feel free to share your thoughts in the comments!

All the best


Tom Dahlström

Hi Tom, thanks for your article. Sure Elluswamy knows about ODD. Nevertheless, his reaction shows that Tesla thinks it would be critical to disclose the safety architecture of the Autopilot. Switching to an open-source-like approach would make such a system safe. Hiding is the wrong concept, except for legal reasons. Regards, Christian

Mike Allocco, Emeritus Fellow ISSS

System Safety Engineering and Management of Complex Systems; Risk Management Advisor...Complex System Risks

1y

What do you expect, system safety?

Jaya K.

Striving for safe and sustainable mobility!

1y

Great Post! Thanks for consolidating and highlighting and initiating this discussion! I share your concern about the traditional OEMs modifying their risk tolerance looking at the way the FSD recall was handled. As the EU has stricter type approval processes and regulations compared to US, I wonder this may even lead to systems being deployed in various geographies with different risk tolerances.

Gulroz Singh

Semiconductor Safety Architect at NVIDIA | Technical Speaker | Author | Mentor 🌱

1y

Thanks for making the effort to go through the deposition! Insightful!

Ralph Grewe

Developing innovative L4 perception products for Continental - with a strong commitment to agile methods.

1y

Reading through Elluswamy's answers, there is one thing that makes me a bit careful: I have the feeling that they are highly strategic, with the goal of putting him in a better position. So the answer "I'm not aware of an ODD" might not mean that there is no such thing at Tesla, but that this answer puts him in a better position for some reason.

Second, there have of course been many short questions using a lot of "automotive slang". My experience is that discussing with experts in other domains can mean days of just finding common nomenclature (even within different fields of automotive). Arguing for days just to figure out that you're talking about the same thing in the end is not uncommon. So the next question would be whether Tesla really doesn't understand what they are doing, or whether they have to be considered a different domain than automotive.

Third, if we want to see certain problems addressed, we tend to look at the corresponding teams in the organization. So if you want "Safety", you have a "Safety Team". In practice, you then run the risk of building up strong borders between the teams: then it's safety against development, which also doesn't lead to good results in the end.
