A motorist in a self-driving Tesla was killed when his vehicle failed to spot a lorry pulling into its path, the first known fatality in an automated car.
A federal investigation is under way into whether the $80,000 Model S car’s semi-autonomous driving system was to blame for the death of Joshua Brown, 40, a businessman so impressed with the vehicle’s Autopilot feature that he had posted videos online demonstrating its efficacy.
“What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor-trailer drove across the highway perpendicular to the Model S,” Tesla said in a statement posted online yesterday, revealing the May 7 accident for the first time.
“Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield.”
Drivers are told to keep their hands on the wheel even when the system, which uses sensors to detect lane markings and other vehicles, is engaged. “You need to maintain control and responsibility for your vehicle,” users are told when they activate it.
Reaction: LT readers
I had occasion recently to test out the "hands free" mode on my Tesla Model S and was incredibly impressed with the hands-free driving on motorways, where the car automatically followed the curves in the road without hesitation and kept a safe distance from the car in front. It was reassuring to be instructed, with a bong and a message, to take over when the computer systems deemed it necessary.
I have no doubt whatsoever that the car's computer systems can make better judgement calls than the average human being, or indeed most human beings, but it is prudent, and indeed recommended by Tesla, to keep your hands on the wheel so that you can take control as and when needed.
Some unfortunate accidents cannot be avoided by either humans or computer systems, but the odds weigh heavily in the computers' favour. Human error is well known to be the cause of most motor accidents.
Tesla's Autopilot system is meant to be used exactly as aeroplane pilots use their autopilot systems: staying aware and alert, and taking over control when the computerised flying or driving systems require it.
Autonomous driving is so important for our future; we must press on in spite of unfortunate incidents such as this.
Continual pessimism from luddites ignores the huge benefits the old and disabled will gain from this exciting technology!
Good job electricity has never killed anyone or there would be calls to have it banned.
I look forward to when they can be fully automated, so I can drive to a country pub, then have a sleep in the back while the car drives me home.
Perhaps you would like to reintroduce a man with a red flag at the same time? Your supposition is nothing more than whimsy. Human error is far more likely to result in an accident. No computer is going to be distracted by answering a phone call or typing a text for example.
Presumably all self driving car makers will address this now, if they haven't already.
That's how technology develops in the real world, once it's free of test tracks and other controlled environments - one accident at a time. Unfortunately this one proved fatal.
What really matters is whether self-driving cars cause more injuries than human drivers over the same number of miles and driving conditions. I expect the figure will be lower.
The technology can only be perfected by using it in real conditions. That is what has happened with aircraft which used to be much more dangerous in the early days than they are today.
The evidence shows that human error rather than equipment failure is a much more common cause of accidents. That is why in hazardous industries, automation is usually preferred to human control.
What? You mean that you don't want this silly little Computer Geek's fantasy plaything to fail? There is no point in driverless cars. They are inherently dangerous and here is the proof.
And yet, how many fatal road accidents are caused by human drivers? There's no point in cars - they are inherently dangerous, according to this point of view.
Tragic - yes; preventable by smarter design - yes. There may be some hysterical over-reaction by his wife, children and other close relatives. But as one renewable energy newsletter today said comfortingly, 'deaths of this sort can be statistically expected during the initial introduction of autonomous vehicles'.
I was once behind a car which drove into a JCB with a bright evening sun behind it. I hadn't seen the JCB either, so I believe it's entirely possible.
Reaction: Fark readers
FTFA: Joshua D. Brown, of Canton, Ohio, died in the accident May 7 in Williston, Florida, when his car's cameras failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn't automatically activate its brakes, according to government records obtained Thursday.
So a truck made such a close turn in front of him that unless the driver reacted, an accident was bound to happen? All that "driver was watching TV" stuff is the truck driver covering his ass.
You realize it's not a self-driving or automatic-driving car, right? If you honestly didn't, now you do.
It's an assisted-driving car. There is a world of difference.
Look, I don't care how good these systems are or how long they develop them. Just like humans, they will be unable to detect and avoid every possible collision scenario. They will do some things better than humans, but they will be forever prone to these sorts of misinterpretations of their surroundings. The combination of automation and human piloting can make the safest system yet, but it has to be both working together at the same time, and the autopilot has to be free of Toyota-like acceleration bugs and other stupid losses of manual override for it all to work.
Do you know how awfully incomplete the first autopilot systems were for flight?
It took them time, but the autopilot we have today can fly itself. Even in high-traffic skies the autopilot works very, very well. Computers are machines driven by logic, unlike us humans, who are driven by emotion. And trusting them has saved sooooo many lives, far more than the lives lost when the shiat farks up.
Yes, I agree; however, in a plane the primary function of the autopilot is simply control of heading, control surfaces, engines, altitude, aux systems, etc. Obstacle avoidance other than terrain is largely a non-issue in planes when you compare it to the task of simply driving to the corner store. Airspace is a lot more homogeneous than the roadway. There will always be roads, corners, intersections, merges, you name it, that a car autopilot will fail on intermittently. In this case it was simply a contrast issue with the camera. Things also go wrong much faster in a car. One or two seconds of inattention or miscalculation and BAM. Planes have much longer timeframes to adjust. Of course I am generalizing, but I think I have correctly framed the complexity of the task of getting a car to never wreck in any given situation.
Good. Hopefully this will kill the stupid idea of self-driving cars. (Mock26)
Of course it will. Totally. Just like aircraft crashes have killed off planes and auto crashes forced everybody to go back to riding horses.
This is totally in tune with American ideals of banning things that are dangerous, like firearms.
Silly farker, guns are not dangerous.
Jayne was NOT decapitated in the car crash that killed her.
No, she was scalped. But the car did go under a truck, hence the introduction of the Mansfield bar.
Ironically, it was on this date in 1967.
This is a real setback for blind people's hopes of driving on blind dates.
One death so far in the program versus how many years of no deaths for the other testers?
Trying to fly was insane. You were mentally ill to think you could fly like a bird. But not only did we do that, we went into outer space, and along the way not everyone came back.
Autopilot has been around so long in the aviation industry that the number of people scared of self-driving cars is rather low, thanks to the lack of deaths in the program and the trust people place in aviation.
130 million miles with autopilot active and only 1 fatality thus far? I get the feeling that is already safer than having a person handle all the driving, considering that is typically estimated at 1.5 deaths per 100 million miles driven.
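Taking the figures in that comment at face value, the comparison is easy to check; a minimal sketch, assuming the quoted numbers (130 million Autopilot miles, 1 fatality, 1.5 human deaths per 100 million miles), neither of which is verified here:

```python
# Back-of-the-envelope fatality-rate comparison using the figures
# quoted in the comment above (not independently verified).
autopilot_deaths = 1
autopilot_miles = 130e6        # miles driven with Autopilot engaged
human_rate = 1.5               # quoted deaths per 100 million miles

autopilot_rate = autopilot_deaths / (autopilot_miles / 100e6)
print(f"Autopilot: {autopilot_rate:.2f} deaths per 100M miles")   # ~0.77
print(f"Humans (quoted): {human_rate:.2f} deaths per 100M miles")
```

That said, with only one event in the numerator the Autopilot figure is statistically fragile; a single additional fatality would put the rate above the quoted human baseline.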
Reaction: Reddit readers
YouTube is full of videos of people in Teslas who seem to think they have a fully self-driving car. In reality Autopilot is supposed to be an assist mechanism, but they're acting like it's capable of driving completely without them. They've got a car that has maybe 1/3 of what would be required for fully autonomous driving and they're acting like all the smarts and sensors are there.
This particular crash is blamed on a lack of contrast between sky and truck - that's because they're using a visible-light camera facing forward (on the back of the rear-view mirror). The car also has forward radar and 360-degree ultrasound. The range of the latter is pretty limited. To have avoided this particular crash it would have needed 360-degree lidar mounted on the roof - the lidar wouldn't have been fooled by the lack of contrast.
tl;dr Tesla shouldn't be calling it Autopilot, since that seems to be giving some owners the impression that this is a self-driving car; it's not. Call it Driver Assist or something like that instead.
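Purely as an illustration of the failure mode that comment describes, here is a toy sketch of a braking decision gated on camera confidence; the sensor model, fusion rule and thresholds are all invented for illustration and are not Tesla's actual control logic:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    detected: bool     # did this sensor report an obstacle ahead?
    confidence: float  # 0.0 to 1.0

def should_brake(camera: SensorReading, radar: SensorReading) -> bool:
    """Toy fusion rule: brake only on a confident camera detection.

    A white trailer against a bright sky can drive camera confidence
    to near zero, and radar-only detections are deliberately ignored
    here (an assumed design choice, to avoid false alarms from
    overhead signs and bridges), so the brake never fires.
    """
    return camera.detected and camera.confidence > 0.5

# Low-contrast trailer: the camera sees nothing it trusts, so even a
# strong radar return does not trigger braking.
print(should_brake(SensorReading(False, 0.1), SensorReading(True, 0.9)))  # False
```

A lidar measures range directly instead of inferring an obstacle from image contrast, which is why the commenter argues a roof-mounted unit would not have been fooled by the lighting.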
Here is a quote from the driver that was killed in the autopilot crash. "There are weaknesses. This is not autonomous driving, so these weaknesses are perfectly fine. It doesn't make sense to wait until every possible scenario has been solved before moving the world forward. If we did that when developing things, nothing would ever get to fruition." - Joshua Brown
"Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert."
Is it just me, or isn't this obvious to everyone?
It's the worst of all worlds. Not good enough to save your life, but good enough to train you not to save your life.
There was a TED talk from a Google car engineer that talked about this: you can't take baby steps towards autonomy, you have to jump from very little to nearly perfect or it will never work.
As someone who knows the Williston area, I'm not sure which "highway" this guy was on. Other than US 41, there is one four-lane highway that is not limited access (traffic crossing all the time).
It's not an interstate and it's very much in the backwoods. I'm not sure why this driver would have felt comfortable using autopilot in that area.
Very near to Gainesville, FL.
Edit: After reading the 2nd article, it's US 27, which is the road I was thinking of, but it's further west than I thought. It's west of the US 41 junction, but as I said, it's not a highway I'd be using a beta autopilot on.
As somebody from Europe, why do you have level crossings on a 4-lane highway? That sounds like utter madness.
Much of the US is so rural as to make this a necessity. Many intrastate highways linking the interstates have segments like this. You're right, it seems exceedingly dangerous. There were often fatalities at ones near where I grew up. But above- or below-grade crossings are expensive and these are in low-traffic areas, so it's hard to get funding for them when the rest of the highway system is crumbling due to lack of funding.
Why do you have four-lane highways in low-traffic areas?
Likely as passing lanes. Most truck routes are four lanes, even in rural areas. Not sure if this is a major truck route though.
EDIT: just to clarify, a four-lane highway is two lanes in both directions.
In Los Angeles, and most of California (north/south at least), the Interstate 5 truck routes are one lane in each direction, then very briefly two lanes before merging back into one. Though most of Interstate 5 has no truck route and trucks just keep right as per law.
This is in the city with the second-highest population (second to New York City), the state with THE highest population, and the city (LA) with the (statistically proven) worst traffic in the United States.
TL;DR: We envy your rural infrastructure.
Europe has plenty of 4-lane highways with level crossings. The difference is that they generally use traffic circles, which are much safer.
From the accident info it seems like an unguarded crossing, not one with traffic lights, stop signs (... on a highway?) or a roundabout.
The truck probably had a stop sign and was waiting but thought he could make that gap. He didn't.
Exactly. In Europe a divided highway with level crossings would have traffic circles. On the other hand, I bet in Europe the equivalent highway to this one (in terms of traffic volume, importance of the route, etc.) would probably be a simple two-lane undivided highway with normal, unguarded crossings for secondary roads. So it's debatable which is safer.
The reality is that for the next 50, 60, 70 years human and automated drivers will coexist and not that much will change in terms of roads and traffic.
50, 60, or 70? You have an unrealistic idea of how long it takes for tech to develop. You're right about the free cars/transition period and I bring that up in another post, but the transition period is going to be closer to 20 years. The amount of time is determined by how long people keep their cars for. There aren't many cars on the road that are older than 20 years, so it's a reasonable figure.
Though that is 20 years from when we start seeing it implemented in an official capacity, not from prototype-phase tech.
I'm not saying the technology won't be viable until then. I'm merely stating that human and autonomous drivers will coexist for the foreseeable future and that doesn't have to be the technology's fault. Even when all cars sold 20 years from now have the ability to drive autonomously millions and millions of people will opt for the "manual override".
Oh, okay. You mean in their entirety. I was referencing specific regions. I imagine the downtown areas of cities will be the first to be regulated, with major public events (or anything with complex parking issues) following closely behind.
I'm not sure if we'll ever do a full transition.
I think that will change much faster than you are estimating. It will happen in phases, but here is how I think it could go:
1. Autonomous features available for limited situations that are equal to or better than human drivers. With adaptive cruise control, collision avoidance, lane assist and similar features, I think it is fair to say that we are there now.
2. When fully autonomous vehicles become available to the public, drivers will be required to maintain control over the vehicle at all times, managing/supervising the autopilot. Companies (e.g. BMW) are promising this type of autonomy within 5 years.
3. After a time, autonomous control systems will prove themselves equal or superior to human control, and drivers will be allowed to let the system drive with less or no supervision. This could be in all areas, or may start in specifically designated places like low-speed-limit zones or special highway lanes. Perhaps 7-10 years?
4. If full autonomy in #3 was limited to certain areas, the next step is that it is allowed everywhere. This may be the point at which fully autonomous-capable vehicles start gaining wide adoption, due to greater utility as well as affordability as the features work their way down from higher-end models to mid-tier and perhaps even economy cars. Now it is possible to have driverless vehicles on the road.
5. Full, driverless autonomy may lead to a shift away from car ownership in favor of more commoditized transportation services (this is where Uber is looking).
6. A tipping point. It is hard to say where this might be (30%, 40%, 50% of cars on the road?), especially with the effects #5 could have, but autonomous driving starts noticeably changing traffic and driving patterns.
7. The safety record of fully autonomous vehicles leads to legislation requiring more and more of the features that comprise autonomous systems to be standard, eventually resulting in all new cars having full autonomous capability.
The further we go into the future, the hazier the possible outcomes, but I think it's reasonable to predict that -- perhaps as soon as 20-25 years out -- there will be increasing barriers to manual driving, which will probably come in many forms -- higher cost, and perhaps higher standards (harder skills test, more stringent vision requirements, etc...) to be licensed, more expensive insurance, restricted roads.
Steps 5 and 6 are where significant changes occur, and I expect that they will be closer to 15 years out than 50.
I get the feeling we are quite a lot closer than that. When it comes to roads a lot of very weird things can happen, but it hardly matters if it's an elephant crossing the road or a burst water main - the answer is usually to avoid it.
I think they will hit fully autonomous within 5 years.
The real fun happens when cities start saying manual drivers aren't allowed in - just wait for the screams.
There was a guy on here a few years ago that said he worked on software for self driving cars and that we're further away than people think. He could have been a phony though. He made an analogy to the 1950s when jetpacks first appeared and people thought everyone would have one in 25 years or so.
The intelligent cruise control, braking and lane/side radar on my Infiniti has saved my ass several times when my attention has slipped on my blind spot or closing speeds. Partly because it gives increasingly audible feedback when a car tries to change lanes into you, or vice versa. Eventually it fights back on the steering wheel with opposing braking. It really fights side collisions. In front, the same thing: if I get too close to a vehicle at too high a speed, the gas pedal physically pushes back, then eventually it starts to brake and beep like hell. The combination of physical force feedback, visual lights near the wing mirrors and audible alarms has made me very comfortable letting the car be my wingman.
I see why people trust the Autopilot system so much, but I'd never take my foot off one of the pedals or my eyes off the road. This really was a corner case. I'm sure a software update will be sent to achieve a better balance between panicking about overhead signs where there is clearly enough clearance and trucks that will shear off the roof of the car. Yikes.
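To make the escalation pattern that comment describes concrete, here is a toy sketch of a tiered driver-assist response keyed to time-to-collision; the stages and thresholds are invented for illustration, not Infiniti's actual tuning:

```python
# Hypothetical escalation ladder for a forward-collision assist,
# loosely modeled on the behaviour described above.
def assist_response(time_to_collision_s: float) -> str:
    if time_to_collision_s > 3.0:
        return "no action"
    if time_to_collision_s > 2.0:
        return "warning light near the wing mirror"
    if time_to_collision_s > 1.2:
        return "audible alarm + gas-pedal pushback"
    return "automatic braking / counter-steering"

for ttc in (4.0, 2.5, 1.5, 0.8):
    print(f"TTC {ttc:.1f}s -> {assist_response(ttc)}")
```

The design point is that each stage is cheap to ignore at long range and progressively harder to ignore as the margin shrinks, which keeps the driver in the loop rather than replacing them.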
This is actually a huge problem with automated systems and something the airline industry has struggled with. As automation increases, the human mind not only has a hard time concentrating, but our skills also atrophy quickly.
This is an interesting article by The New Yorker that looks at how automation indirectly caused some modern aircraft disasters and how these effects (humans failing to pay attention inside an automated system) could impact self-driving cars: http://www.newyorker.com/science/maria-konnikova/hazards-automation
What I don't get is why people are holding this tech to impossible standards. We let people who've totalled cars because of cellphone distractions continue driving, and drunk drivers get multiple chances. Give wall-e a shot.
This is terrible. According to that diagram and the story reported by the authorities, 10/10 times if an accident like this occurred the truck would be at fault; the only reason it isn't in this case is that the other car happened to be a very publicized autopilot car, and the driver of that car was extremely irresponsible. It's a freaking highway and that huge-ass tractor is making an unprotected left turn.
The Wisdom of the Crowd: Who Controls Self-Drive Cars?