Just took a look at a Tesla discussion forum about this post. It is interesting that some folks there make reasonable discussion points, which is great! But some Tesla owners seem to be unaware that they are still beta testers.
The Tesla Model 3 owner's manual says "Autosteer is a BETA feature." and "Navigate on Autopilot is a BETA feature." The Model Y 2025+ manual says the same thing. Yep, Tesla still says they have beta software deployed on public roads...
Source: https://www.tesla.com/ownersmanual/model3/en_us/GUID-20F2262F-CDF6-408E-A752-2AD9B0CC2FD6.html#:~:text=lane%20(see%20%20).%20Note%20Autosteer%20is%20a%20BETA%20feature.%20%20%3A%20When%20you%20engage
FSD does not say "beta" explicitly, but does say: "Full Self-Driving (Supervised) and its associated functions may not operate as intended".
I'm going to say that "may not operate as intended" is either an admission that it has defects, or an implicit statement that it is beta / not full series production quality.
Source: https://www.tesla.com/ownersmanual/modely/en_us/GUID-E5FF5E84-6AAC-43E6-B7ED-EC1E9AEB17B7.html#GUID-4EE67389-5F55-46D0-9559-90F31949660A
In my country (in Europe), the 'pre-homologation prototype' approval framework requires you to obtain a test permit before testing any supervised / full automation system on public roads.
It's actually been inspired by SAE J3018 and the AVSC publications. Hence, we address some key elements like system maturity according to the TRL scale, a safety driver training and qualification program, driver monitoring, and other risk-mitigating controls such as a limited test fleet size, limited test periods, and weekly reporting as part of the oversight.
Thanks for the comparison to European approaches. We do not have type approval in the US.
However, California has a permitting system for autonomous driving, defined as SAE Levels 3-5. Tesla tells regulators that its FSD autonomous driving prototype software is Level 2, evading that regulation.
While I completely agree with this article, one other factor I keep thinking about whenever I see a Waymo engineer in the back of an autonomous vehicle is that our roads and infrastructure are not designed for safe autonomous driving. We can perfect the software and test it thoroughly and completely, but roadway safety and infrastructure differ from state to state. Continuing to invest in driving assistance, like turning your headlamps on when it’s dark or foggy (which happens all the time) or alerting the driver to a potential collision, is paramount. If our country wants more autonomous driving, then we need to make some drastic changes to our roadways and infrastructure.
Fair point, but this is a tough one. Where I live in Pittsburgh a bridge collapsed not that long ago due to deferred maintenance. It is unclear where the money is going to come from to upgrade infrastructure for autonomous vehicles.
For now I think it is realistic that robotaxi companies are thinking they need to work in the as-built environment. And that means it will be challenging to operate safely at scale.
I too like the idea of investing in support and active safety features.
I agree with part of your argument, though I disagree with its general crux. Any and all testing of driverless vehicles will always, by definition, be a beta test.
Thank you for the article. It is kinda sad to see how practices applicable to mobile apps are applied to less controlled and far more dangerous environments. As an owner of one of the betas, I wish they at least had a kill switch in every “beta” car. One of the things I am always afraid of is that the car will go nuts and I will have no reliable way to stop its madness. So I stopped using Tesla FSD after a few unpleasant attempts.
Thanks for your comments Anton. Not only is it possible you would not be able to react in time to stop a crash, but there is a chance you would be held personally liable. Good call on not being a public beta tester.
Thank you for highlighting this important topic. It is also important to emphasize that companies in serious industries (e.g., logic-bearing medical devices, aircraft, navigation infrastructure, and industrial machinery including offshore oil rigs) that expose the public to serious harm from developmental software embrace detailed test protocols and/or physical barriers to assure safety before release. Technology readiness levels published by the US DOD and DOT [1] and CMMI software maturation practices from Carnegie Mellon University [2] (I think you have heard of this institution, Phil) provide useful guidance. Beta testers are demonstrably competent, indemnified, and paid. While not perfect, quality escapes from those protocols are rare and subject to extensive root cause determination before testing is resumed. AVs and consumer firearms are the only products that rely on customer or third-party liability, injury, or death as surrogates for competent test protocols. That needs to change, in favor of public safety.
[1] https://www.gao.gov/products/gao-20-48g
[2] https://cmmiinstitute.com/learning/appraisals/levels
Thanks for spelling this all out. Indeed, it's amazing how computer drivers have altered the meaning and the intent of "beta testing" over time. It's time to go back to basics: public roads, shared by conventional human drivers, pedestrians and cyclists, shouldn't be playgrounds for computer drivers.
I was an official beta tester for software back in about 1981. Had to sign a testing agreement, NDA, US mail weekly bug logs to the company, etc. Times have really changed.
Wow... reporting weekly bug logs?! Companies did have moral, ethical, and technical "standards" back then.
Yup.... agree...