You have to learn how to use it properly and pay attention. I use it a lot and it can drive from San Francisco to LA pretty much without stopping. But every once in a while it does mess up, and you need to make sure you're watching and ready to take over quickly. I agree that it's good enough that people might stop paying attention, but they need to realize that they have to hold the software's hand in these early stages. As a matter of fact, being in the driver's seat and able to take control makes me much more comfortable in a self-driving Tesla than in the back seat of a much more advanced Waymo self-driving car.
Trust me, it's still a huge relief. A night and day difference. Try it on a long drive, like LA to San Francisco, and you'll see what I mean. It kind of feels like you're in the passenger seat rather than the driver's seat, but you can still grab the controls if you need to.
If it allows the driver to e.g. reply to a text message on his phone, then it's dangerous.
Either it requires the driver to occasionally take over and handle a situation on short notice, in which case the driver should have hands on the wheel and eyes forward (and the car should enforce this by alerting and slowing down immediately if the driver isn't paying attention).
Or the driver isn't required to pay attention and take over occasionally, in which case it's fine to reply to that email on the phone while driving. I see no future in which there is a viable middle ground between these two levels of autonomy.
Regular manual automotive controls, in practice, "allow[] the driver to...reply to a text message". Illegal or not, it's done by millions of people every day, and for the most part, they don't crash. You'd have to actually add control instability to the car to force the kind of attention you want out of drivers, and nobody would buy a car with this deliberately annoying handling.
There is a double standard here: cars with autonomous features are held to the new standard, meat drivers are held to the old standard.
Just like we will probably never accept (neither socially nor legally) autonomous drivers that aren't at least an order of magnitude safer than human drivers, we will likely continue to accept drivers not paying attention to their manual controls - but we do not accept drivers not paying enough attention to take over after their AI driver.
> nobody would buy a car with this deliberately annoying handling.
Exactly. And since this is the only way of making a reasonably safe level 3 car, this is also why many car manufacturers have actively chosen not to develop level 3 autonomous cars (they are either not safe or annoying - and either way it's a tough sell).
This isn't a black and white scenario. I use AP for probably 50 miles of my 60 mile round trip commute. Here's how it breaks down for me:
In the morning, I leave at 5:30am. There is constant moving traffic at 60-80mph on the first leg of my trip (6 miles). I drive manually, or with AP, and I pay full attention 100% of this time. If I'm using AP, it's because it's at least as reliable as I am at staying in my lane.
I then change interstates. I do this in manual mode almost every time. AP isn't great at dealing with changing multiple lanes quickly and tightly like traffic often requires. Once I'm on the new interstate, I get into my intended lane (second from the left), put the car in AP, and we have what you'd consider stop and go traffic for a few miles. At this point, I open the can of soda I brought. I keep a hand on the wheel, and I pay attention, but I'm more relaxed than I was earlier, when I was doing 60-80mph. At this point, the only thing that I need to do is to respond to someone jumping into my lane and cutting me off (which the AP deals with, but I have more faith in my ability to slam the brakes), or road debris, which at this speed, is not a problem that needs less than a second of response time.
There's a slow steady 40mph drive that I'm in full AP for, drinking my soda and paying attention, and then the traffic thins out, and I notice that I'm starting to lag behind cars, because my AP has a set max of 70MPH from back where the speed limit was 65mph (even though I was only going 20-40). At this point, both hands on the wheel, full attention, and I increase the AP max to 75 or 80, depending on how much traffic I'm in. I switch it back over to manual to make the lane changes necessary to hit my exit, and I'm manual until I park my car at work.
On the way home, I'm in stop and go traffic for an hour. When I'm 'going', I'm going 5-10mph. I am in full AP mode 95% of this time, and I could take a nap at this point and it wouldn't actually be unsafe. I'm safer with AP than manual here, because otherwise my attention fades and I could drift out of my lane or bump the car in front of me. I've seen that happen to other people countless times, and it just increases the amount of time everyone else spends in traffic, too.
Even the most egregious lane jumper can't get into my lane too fast for AP at this stage of my commute. I just set the follow distance to 2, and listen to audio books while I browse twitter or facebook. I look at what's going on out the windshield, but it's virtually unchanging. Like the thousands of people surrounding me, I'm slowly creeping forward, waiting on the 20-30 miles to pass. For an hour and a half.
This is the same non-argument - people don't merely want safer (if they did, a crappy autopilot that's only slightly better than a human driver would be a viable product) - they just don't accept any notion of unsafety in new tech.
The bottom line is that people don’t accept any risk at all involving autonomous driving - regardless of whether the alternative/old tech was worse. So, to put it very bluntly, people accept being hit by a texting person not paying attention for 1 second. People don’t accept being hit by a person in a level 3 autonomous vehicle not paying attention for 10 seconds - and that’s regardless of the relative safety of the two systems.
Obviously if you use Autopilot in bumper to bumper traffic this is an improvement, and a huge improvement over texting while manually driving. But texting at highway speed is thankfully rare when manually driving and should be just as outlawed with AP.
You could achieve the same level of comfort today on many common cars with adaptive cruise control. Autopilot gives you a false sense of security when it works great 90% of the time.
Again, you're making the same mistake op talked about - that because Autopilot works 80-90% of the time, you'll keep letting it drive you while you relax, which means it's just a matter of time until you crash.
I don't trust you to be able to snap out of a distracted state and take control of a car in a situation you may or may not have been paying attention to. I don't trust you to do that, I don't trust the driver behind me to do that, and I don't trust the drivers to the left and to the right of me to do that.
I should not need to trust you -- if I trusted you, why would we need autopilot at all? Humans are either qualified to drive cars or they aren't. If they are qualified, we don't need autopilot. If they aren't qualified, autopilot should never require human intervention. What we have now is a half-measure that assumes that neither humans nor autopilots can be trusted, but that some combination of those two untrustworthy parties can somehow be trusted. It doesn't make any sense.
Please. You should see me when the car isn't on AutoPilot. I speed excessively, weave through LA traffic switching lanes frequently, and generally exhibit unsafe driving behavior. I can't help it, it's like I get bored or something. I'm a decent driver and have never gotten in a crash, but I have gotten a lot of tickets. When I use AutoPilot, the car automatically maintains a reasonable distance at a fixed speed. You need to trust people driving other cars today, just as you always have -- that hasn't changed yet. AutoPilot-related accidents are new, but accidents aren't. So yes, people will die using AutoPilot, but the solution is to know the system's limits, and maybe add enhanced attention-detection systems to the car, not to ban AutoPilot or anything like that. I do believe AutoPilot can already reduce the number of crashes today.
The 3 Autopilot deaths so far involved relatively young men, all in relatively elite positions: a former Navy SEAL, an Apple engineer, and the son of a Chinese business owner [0]. They fit the profile of men fairly confident about driving and tech, perhaps too confident. 3 fatalities over 320 million miles for Teslas equipped with Autopilot hardware is not much better than the 1.16 per 100 million miles fatality rate of all American drivers and vehicles.
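The rate comparison above works out as follows (a back-of-envelope check using only the figures quoted in the comment, which are not independently verified here):

```python
# Rough check of the fatality rates quoted in the comment above.
# All input figures come from the comment itself, not from a primary source.
autopilot_deaths = 3
autopilot_miles = 320e6   # miles driven by Autopilot-hardware-equipped Teslas

us_average_rate = 1.16    # US fatalities per 100 million vehicle miles

# Normalize to the same unit: fatalities per 100 million miles.
autopilot_rate = autopilot_deaths / autopilot_miles * 1e8

print(f"Autopilot-equipped Teslas: {autopilot_rate:.2f} per 100M miles")
print(f"US average:                {us_average_rate:.2f} per 100M miles")
# 0.94 vs 1.16 -- the same ballpark, which is the comment's point
```

Note that this compares all miles driven by Autopilot-equipped cars (including manual miles) against the national average, so it understates nothing about Autopilot specifically; it only supports the weaker claim that the rates are comparable.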
I know, it definitely gives me pause to see people dying and is definitely a reminder to be safe. I remember when I was using AutoPilot on the 101 in the bay area this weekend, I noticed I was in the far left lane and decided to switch to a middle lane. So it's not that I don't think AutoPilot could kill me, as a matter of fact there have been several times when AutoPilot was definitely about to kill me but I took control in time to course correct. I think I've used the system enough to get a sense of what it can and can't do and feel that I'm less likely to get in a collision due to not paying attention while AutoPilot is on than I am to get into a collision because I was driving.
Also, in response to the statistics you cite: I wouldn't expect the statistics to be that far off the average, because AutoPilot is limited in its possible use cases at the moment. Even in cars equipped with AutoPilot, people are still driving the car manually for at least some part of the trip. Therefore I wouldn't expect the impact of AutoPilot to lead to a significant deviation from the average. Plus, I'd speculate that the crash rate per mile is probably higher for a Tesla than for an average car -- I'm thinking of something like a Honda Civic. Faster cars probably get in more crashes, right? Maybe not, who knows. Regardless, it should be possible to control for this and assess the effect of AutoPilot on per-mile crash rates by comparing rates for Teslas with and without AutoPilot. This is somewhat complicated by the fact that even Teslas without AutoPilot have automatic collision avoidance, but it should shed some light on whether AutoPilot is making people crash more or less.
What about non-fatal accidents though? Where Teslas (and other assisted driving vehicles) prevent stuff like hitting pedestrians, random bikers, animals, parked cars etc.?
1.3 million people a year die in car crashes. Humans are woefully unqualified to pilot heavy machinery on a daily basis. Tesla’s Autopilot reduces crashes by 40% according to the NHTSA, so I can’t agree with your binary argument.
Whether you trust others is immaterial; statistics will be the final arbiter. If Autopilot still causes fatal accidents, but fewer fatal accidents than humans alone, how could you argue against such a safety system? What of the lives saved that wouldn't have been if we demand an entirely fault-proof system prior to implementation? Who are you (not you specifically, but the aggregate) to take those lives away because of irrationality?
> Tesla’s Autopilot reduces crashes by 40% according to the NHTSA
This claim was immediately called into question when it was first published, and the NHTSA is currently facing a FOIA lawsuit for refusing to release the underlying data to independent researchers.
As noted in your citation, Tesla requested the data they provided to be confidential (which is not an uncommon request), and the NHTSA granted the request. Whether the statement from the regulatory agency can be independently verified is immaterial.
OK, and all I said in my comment was that the NHTSA was asked for elaboration and proof -- because its findings seemed curious with respect to other study results -- and so far they have declined further explanation. This may be relevant information for anyone who sees you using the NHTSA's claim as a premise.
I think I normally would have given Tesla the benefit of the doubt. But after the misleading, weasel-worded data they presented to defend AutoPilot in light of the recent fatal accident [0], I think the onus is now on them to provide more concrete proof.
>> Tesla’s Autopilot reduces crashes by 40% according to the NHTSA
So does any car with automatic emergency braking and/or forward collision warning (links below).
The problem here is, this Tesla drove head-on into the gore point. And previously, it drove right into the side of a huge truck.
So... can it be trusted? Your call.
I am human, and any product built for me will have to take into account my idiosyncrasies. This includes my unwillingness to drive on the same road with a car that might at any moment swerve into me because of weird software.
It may be irrational, but I can forgive a human, I cannot forgive an AI.