So here's an interesting article with a different take on self-driving cars:
The customer is always wrong: Tesla lets out self-driving car data – when it suits
"The luxury car maker is quick to divulge data to suggest its technology was not responsible for crashes but refuses to let drivers themselves see the data logs."
Several issues at play here:
1. Tesla will basically throw you under the bus, in order to defend its image, if you publicly lie about what you did. Not a bad thing & perfectly within its rights, but it turns out a lot of people do lie to the media:
In a statement to the Guardian, Tesla defended this practice. “In unusual cases in which claims have already been made publicly about our vehicles by customers, authorities or other individuals, we have released information based on the data to either corroborate or disprove these claims. The privacy of our customers is extremely important and something we take very seriously, and in such cases, Tesla discloses only the minimum amount of information necessary.”
Tesla indemnifies itself extensively in its privacy policy, granting itself the right to “transfer and disclose information, including personal and non-personally identifiable information … to protect the rights, property, safety, or security of the Services, Tesla, third parties, visitors to our Services, or the public, as determined by us in our sole discretion”. The legal status of user agreements varies by jurisdiction.
2. Tesla will not give you your car's raw data. The best I've seen is Tesla sending a customer a copy of the detailed log post-accident. Per the article, Tesla collects a truckload of data from its vehicles, including "whether a customer’s hands were on the wheel, when a door was opened, which of its self-driving processes were active at the moment and whether or not they had malfunctioned." Two sides to that coin...yes, it can make driving safer, identify problems, and so on, but on the flip side, Tesla is basically the NSA of automotive manufacturers. afaik, none of my current or previous cars records when I open the door, haha. Wonder what the data access policy on your vehicle's information will be in the future...
3. Tesla, imo, mis-markets their automation features. "Autopilot" sounds like self-driving, not driver-assist. I think they jumped the gun on giving the feature that name. iirc there's only been one instance when they took the heat for it, although they cited the software not having that particular feature (lateral detection) & also blamed the driver for not paying attention...both valid excuses, but the word "Autopilot" still suggests that should NOT have happened:
In only one case – the May death of Canton, Ohio, Tesla driver Joshua Brown – has the company publicly admitted that its software made a mistake. In that case, the Autopilot software did not “see” the white side of a tractor-trailer as it moved in front of the car against the white sky. The driver was reportedly watching one of the Harry Potter movies at that moment and did not see the vehicle, either.
Tesla takes issue with the characterization of Autopilot’s performance in the crash as a failure and told the Guardian that it only distributes detailed information from the site of auto accidents to the press when it believes someone quoted in the media is being unfair.
https://www.recode.net/2016/7/26/12285930/tesla-mobileye-self-driving-cars
When asked why the two companies parted ways, Shashua pointed to the companies’ respective responses to the fatal accident. Mobileye’s technology is only capable of helping to avoid accidents with cars in front of it, not trucks crossing the highway laterally, as was the case in this accident.
“This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon,” the company wrote.
Tesla said Autopilot, which combines proprietary and third-party technology, is supposed to be able to recognize “any interruption of the ground plane in the path of the vehicle” but “the high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire.”
To me, "Autopilot" doesn't sound like "oh yeah, the current-gen Autopilot doesn't see lateral obstacles". Then you have the self-parking "Summon" feature:
http://www.roadandtrack.com/new-car...3/tesla-self-driving-crash-summon-autonomous/
Tesla claims that Overton activated Summon three seconds after he stepped out of the car and closed the door, saying the action "was initiated by a double-press of the gear selector stalk button, shifting from Drive to Park and requesting Summon activation." The automaker also pointed out to Overton that use of Summon requires that the driver agree to a terms of use that specifically mentions that the vehicle "may not detect certain obstacles" that are too low or too high for the car's sensors to see—perhaps why the car didn't stop before impacting the high-riding trailer.
However, I think this is a good point:
http://www.nbcnews.com/tech/tech-news/it-s-your-fault-tesla-sends-owner-detailed-log-after-n572926
"Even if he did forget to cancel the 'self-parking' feature and left, why is this car not smart enough to know its own clearance and the clearance it is heading towards?" wrote one Facebook user. "Self-parking means 'self-parking.' Not assisted parking or part-auto parking, SELF-PARKING."
Yes, I understand Tesla's POV, especially with users agreeing to the legal terminology about usage, but when you put the words "Autopilot" and "self-parking" in your advertising and then your car runs into a semi-truck on the highway or shifts itself out of Park and drives into a trailer...hmm. Being a beta-tester for self-driving software sounds kinda risky, especially given that certain features aren't always well thought out & require media pressure to change. For example:
Currently, Summon is controlled by an app on the vehicle owner's smartphone. In order to activate Summon, the user holds down a forward or reverse button on the Summon screen of the Tesla app. If you take your finger off either of the buttons, the Tesla will stop immediately (in case you drop your phone). Originally, the feature was controlled by holding a button on the key fob, but after Consumer Reports raised concerns over what happens if you drop your keys, Tesla switched to smartphone control. It's unknown whether Overton's car uses the key fob or the app.
Scary video...imagine if there was a little kid in front of the car, instead of a bicycle or a bag:
Implementing a dead man's switch was a good idea, but I would hesitate to release a feature like that to the public without a 360-degree infrared camera or something that would reliably keep the car from hitting obstacles. CR's test model even scraped a wheel in their testing.
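For the curious, the hold-to-move behavior described above is a classic dead man's switch. Here's a toy sketch of the idea in Python (all names and the timeout value are my own invention, not anything from Tesla's actual software):

```python
class SummonController:
    """Toy dead man's switch: the car only creeps while the app keeps
    reporting that the forward/reverse button is held; the moment the
    hold signal stops arriving (finger lifted, phone dropped), it stops."""

    HOLD_TIMEOUT = 0.5  # seconds without a "still held" signal before stopping (made-up value)

    def __init__(self):
        self.last_hold_signal = None
        self.moving = False

    def on_hold_signal(self, now):
        # Called each time the app reports the button is still held down.
        self.last_hold_signal = now
        self.moving = True

    def tick(self, now):
        # Called periodically by the vehicle's control loop.
        if self.moving and (self.last_hold_signal is None
                            or now - self.last_hold_signal > self.HOLD_TIMEOUT):
            self.moving = False  # hold signal went stale: stop immediately
        return self.moving

ctrl = SummonController()
ctrl.on_hold_signal(now=0.0)
print(ctrl.tick(now=0.3))  # True: button held recently, keep creeping
print(ctrl.tick(now=1.0))  # False: no hold signal for >0.5 s, stop
```

The key design point is that movement requires a continuously refreshed positive signal; silence (for any reason) defaults to stopping, which is exactly why it handles the dropped-phone case.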
On the flip side, Musk claims Autopilot is twice as safe as a human driver:
http://www.telegraph.co.uk/technolo...s-autopilot-makes-accidents-50pc-less-likely/
"The probability of having an accident is 50 per cent lower if you have Autopilot on," said Musk, speaking at an energy conference in Oslo, Norway. "Even with our first version, it's almost twice as good as a person."
Drawing on early data from Tesla's cars, Musk said that the average number of kilometres driven by a car before an accident was almost double when Autopilot was switched on.
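Note that those two statements are really the same claim: if accidents per kilometre are halved, the mean distance between accidents doubles. Quick sanity check of the arithmetic (the rates are made up for illustration, not Tesla's actual figures):

```python
# Hypothetical accident rates (accidents per million km) -- illustrative only.
manual_rate = 2.0                     # Autopilot off
autopilot_rate = manual_rate * 0.5    # "50 per cent lower" per Musk's claim

mean_km_manual = 1_000_000 / manual_rate        # 500,000 km between accidents
mean_km_autopilot = 1_000_000 / autopilot_rate  # 1,000,000 km between accidents

print(mean_km_autopilot / mean_km_manual)  # 2.0 -- i.e., "almost double"
```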
It's kind of hard to gather enormous amounts of data without, well, gathering enormous amounts of data. So beta-testing with actual drivers is really the fastest way to gather all sorts of information about different road types, weather conditions, driver behavior, and so on. The downside is that it comes with some risks, along with some currently mis-marketed feature names. In my own testing of the Gen1 Autopilot on a Model X, I did feel very confident with it on the highway. Plus, there's now a software update that requires you to wiggle the wheel every 30 seconds or so, which makes it harder to not pay attention like the guy in the semi-truck accident was doing. I think the wiggle is a dumb implementation; they should have some kind of touch-detection sensor on the wheel, because wiggling the steering wheel is a bit annoying, but since the car doesn't already have that hardware, they went with a solution using what's on board.
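The wheel-wiggle nag is essentially a watchdog timer: if no steering input is detected within some interval, the car alerts the driver. A rough sketch of that pattern (the 30-second interval comes from my own experience above; everything else here is a guess, not Tesla's actual logic):

```python
class HandsOnWatchdog:
    """Toy watchdog: expects steering-wheel input within NAG_INTERVAL
    seconds; otherwise raises an alert (in the real car, escalating
    warnings and eventually Autopilot disengagement)."""

    NAG_INTERVAL = 30.0  # seconds -- based on my seat time, not a published spec

    def __init__(self):
        self.last_input = 0.0
        self.alert = False

    def on_steering_input(self, now):
        # Any detected wheel torque resets the timer and clears the alert.
        self.last_input = now
        self.alert = False

    def tick(self, now):
        if now - self.last_input > self.NAG_INTERVAL:
            self.alert = True  # nag the driver to touch the wheel
        return self.alert
```

A capacitive touch sensor would just replace the torque-based `on_steering_input` events with a continuous "hands detected" signal; the watchdog structure itself would stay the same.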
Props to Tesla for pushing this technology forward, however. I am curious to see how other automakers deal with self-driving systems. Tesla has an enormous lead as far as data collection goes; also curious if they'd ever be willing to share that data to make other brands of self-driving vehicles safer (legal issues aside).