And it's not like you're unaware of it, as that has been discussed.
I had a really fantastic driving experience with the latest City FSD last week. I was honestly surprised at how good it was. It also requires hands on the wheel & eyes on the road now, which is an improvement over the previous software, as well as over most competing systems that don't require proactive driver attention. Incremental changes are better than no changes, imo!
Part of the bigger picture is how do we get from Point A (where we are now) to Point B (safer driving & safer self-driving). I wrote my high school thesis paper on automotive safety back in the day (thanks, Nader!) & have had a keen interest in it ever since, as I was absolutely SHOCKED by the vehicle death rate in America! People are not exactly much better than Autopilot software:
* Over 34,000 people are killed in motor vehicle crashes each year in America
* That's more than 100 people per day
* There are over 5 million crashes a year with more than 2 million injuries
Nearly half of the people killed were not wearing seatbelts:
Nearly half of all car passengers killed on US roads in 2021 were not wearing a seat belt, according to newly released data from NHTSA.
abcnews.go.com
Within that context are the alcohol-related statistics:
Get resources on ways to prevent drunk driving and alcohol-impaired crashes along with national drunk driving statistics and facts.
www.nhtsa.gov
* Roughly 37 of those 100+ daily deaths come from drunk-driving crashes
* That's one person every 39 minutes
* In 2022, a total of 13,524 people died in alcohol-impaired driving crashes, all of which were preventable
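Those last two bullets hang together, by the way; here's a quick sanity check on the quoted NHTSA figures (just arithmetic on the numbers above, nothing more):

```python
# Sanity-check the quoted 2022 NHTSA drunk-driving figures
annual_deaths = 13_524                    # alcohol-impaired driving deaths in 2022
deaths_per_day = annual_deaths / 365      # works out to ~37 per day
minutes_between = 24 * 60 / deaths_per_day

print(round(deaths_per_day))    # 37
print(round(minutes_between))   # 39 -> "one person every 39 minutes"
```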
So the questions are:
1. Do we need a perfect Autopilot system to move forward?
2. From a purely statistical perspective, if the system is imperfect and kills people, but lowers the overall crash-related deaths & injuries, is that net positive still worth investing in?
3. Can we expect instant perfection without a massive amount of training data?
The conservative numbers you linked to represent only about 1/4 of publicly-tracked data. To date, we actually know of 44 fatal Autopilot crashes. The crashes are tracked in detail on this website:
Tesla Deaths is a record of Tesla accidents that involved the death of a driver, occupant, cyclist, motorcyclist, or pedestrian.
www.tesladeaths.com
In addition, the 44 fatalities are not monolithic: Autopilot was enabled in each of them, but it did not necessarily cause the crash. There are 3 basic categories:
1. Autopilot system malfunctions (faulty programming, unable to cope with different situations, etc.)
2. Unavoidable situations
3. Intentional driver misuse
The reality is that there IS no perfect system. Was the Autopilot system rushed to the market? I absolutely think so. Does it have the potential to reduce the overall automotive deaths & injuries on the road before the system gets perfected? I believe so. But then you get people doing nonsense like this:
Tesla says Autopilot and FSD system require driver to pay attention at all times
www.the-independent.com
I'd wager that the current City FSD is an improvement over a drunk driver driving themselves home. We've had over 40 documented fatal Autopilot crashes, but imagine if self-driving technology were a government-mandated standard on all new cars & imagine the impact (economics of implementation aside) that would have on the 13,000+ annual deaths from drunk driving.
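For a sense of scale, here's the comparison in rough numbers (and ONLY scale; exposure miles differ enormously between the two, so this is not an apples-to-apples risk comparison):

```python
# Rough scale comparison, NOT a risk-rate comparison (exposure miles differ hugely)
fatal_autopilot_crashes = 44        # documented fatal Autopilot crashes, ~2015-2024
autopilot_years = 9                 # years Autopilot has been on the market
drunk_driving_deaths_2022 = 13_524  # NHTSA figure quoted earlier

per_year = fatal_autopilot_crashes / autopilot_years
print(round(per_year, 1))                              # ~4.9 fatal crashes per year
print(round(drunk_driving_deaths_2022 / per_year))     # ~2766, i.e. drunk driving
# kills on the order of 2,700x more people per year than documented fatal
# Autopilot crashes
```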
In addition, I automatically assume that anything released by a company is "murky data," as it is intentionally slanted to make them look good. The most recent data release from Tesla:
Tesla has finally decided to release its Autopilot safety data report after taking a break of more than a year....
electrek.co
"The automaker is only now releasing the data as Q1 2024 shows a significant improvement for Autopilot:
* In the 1st quarter, we recorded one crash for every 7.63 million miles driven in which drivers were using Autopilot technology.
* For drivers who were not using Autopilot technology, we recorded one crash for every 955,000 miles driven.
* By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every 670,000 miles."
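Taking Tesla's quoted figures at face value (and again, they deserve plenty of salt, since Tesla's definition of a reportable crash may not match NHTSA's), the implied ratios work out like this:

```python
# Ratios implied by the quoted Q1 2024 report (figures as reported, not verified)
autopilot_miles = 7_630_000   # miles per crash with Autopilot engaged
no_autopilot_miles = 955_000  # miles per crash in Teslas driven manually
us_average_miles = 670_000    # miles per crash, NHTSA/FHWA 2022 national estimate

print(round(autopilot_miles / us_average_miles, 1))    # 11.4x the national average
print(round(autopilot_miles / no_autopilot_miles, 1))  # 8.0x Teslas without Autopilot
```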
From a big-picture perspective, we live in a dangerous world filled with impaired drivers, pickup trucks with man-sized grilles, and huge vehicles like semi-trucks, buses, RVs, and dump trucks around us all the time. I think anything we can do to improve the odds of safety is a Good Thing. I don't think we can do it perfectly or have a perfect rollout. I think Tesla could have done a LOT better with their Autopilot rollout over the years, including making those safety assists a requirement from day one & not mis-marketing the capabilities, but I also feel like they triggered a storm of driver-assistance technologies. Subaru's EyeSight came out in 2015, Tesla's Autopilot came out in 2015, etc.
Do I think Tesla was irresponsible in their initial rollout of the technology? Yup! Could they have put stricter safety measures in place right off the bat? For sure! Remember this poor guy?
But per your statement here:
"We, the public, are being subjected to Level 2 autonomous driving in perpetual beta."
This is not wrong, but that's also not the only factor to consider in overall vehicular safety. For example, "according to the National Highway Traffic Safety Administration, about 1.5 million people were arrested in a given year for driving under the influence of alcohol or drugs." So it's not just beta self-driving vehicles we're up against...it's poorly-maintained vehicles, impaired drivers, people with anger issues, bad weather, etc.
I think anything that can give us a net positive improvement is better than wishing for a perfect future that will never come. Autopilot has been out 9 years with 44 documented fatalities, which averages out to about 5 a year. I don't know the statistics of Autopilot usage, but they've shipped 5.8 million Tesla vehicles to date. Surveys show that about half the people who have access to FSD technologies (not just from Tesla) trust the system "too much":
When Tesla sells a Full Self Driving package for $15,000 US, people seem to trust the name and use it pretty much as such, according to a new survey, even if they only have the less capable Autopilot system.
www.forbes.com
This is what I've referred to in other threads as the "lull of complacency", i.e. because it works pretty well MOST of the time, we get complacent about it. I feel like Tesla has always over-sold Autopilot as a mature solution, when it's really very far from it. My recent City FSD experience was absolutely fantastic, but it's still a beta system with beta issues & needs to be babysat to ensure accident avoidance.
But why are habits & the "lull of complacency" important? Because out of the 211 crashes the NHTSA recorded (resulting in 14 Autopilot-related deaths & 49 serious injuries), they found that in 78 of those incidents, the drivers had enough time to react as the human driver...but failed to do so, despite having 5 full seconds to do something about it:
The National Highway Traffic Safety Administration has concluded a lengthy investigation into Tesla’s Autopilot system. It found 13 fatal crashes due to misuse and software that doesn’t prioritize driver attentiveness.
www.engadget.com
The most serious were 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path” and these were often linked to Autopilot or FSD. These incidents led to 14 deaths and 49 serious injuries. The agency found that drivers had enough time to react, but didn’t, in 78 of these incidents. These drivers failed to brake or steer to avoid the hazard, despite having at least five seconds to make a move.
I don't necessarily disagree with your take on it. The ultra-fanboys over-hype everything about the technology. It IS scary to be on the road with beta software. But I also don't know if avoiding the technology altogether would raise or lower the overall death rate on the road, and whether that net gain in human lives is worth, statistically-speaking, the short-term risk to get it out on the road & collecting data. Many of the earlier reports I've seen disagree with Tesla's data & say that their numbers are WORSE for safety statistics than a human being behind the wheel.
Either way, the cat has been out of the bag for nearly a decade at this point & there's not really anything we can do about it. Other manufacturers are shipping their own implementations of self-driving (and safer ones, imo, such as Ford's BlueCruise, which also has the benefit of radar sensors). There are still things I don't like about it, like how the vehicle would drive right over potholes & deep manhole covers, so the technology definitely has a long way to go.
Overall, however, I was VERY impressed with the latest City FSD update. It put my perception of city self-driving about 5 years ahead of where I had previously pegged it. However, I think it still has a LONG way to go. I don't know if I would personally ever trust it to act as a robotic taxi or to take a child alone to school or soccer practice. But again...VERY impressed with my experience in it. The self-driving capabilities in the city & on back roads were nothing short of amazing!