The technology behind the Tesla crash, explained

The autopilot consists of a forward-facing camera and radar system, as well as a dozen ultrasonic sensors mounted around the car for situational awareness. The camera can read speed limit signs and watch lane markings to prevent a driver from drifting. The ultrasonic sensors detect when other cars get too close and have a range of 16 feet.
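
To make that concrete, here’s a toy sketch of how such a sensor suite might be represented in software. Only the three sensor types, the count of a dozen ultrasonics and the 16-foot range come from the reporting above; the other numbers, names and the fusion rule are invented for illustration.

```python
# Hypothetical sketch of the Model S sensor suite described above.
# Only the sensor types and the 16-ft ultrasonic range come from the
# article; everything else (class names, camera/radar ranges, the
# fusion rule) is illustrative, not Tesla's implementation.
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    range_ft: float       # maximum useful detection distance (made up
                          # for camera and radar)
    sees_obstacle: bool   # latest reading, simplified to a flag

def obstacle_ahead(sensors: list[Sensor], distance_ft: float) -> bool:
    """Naive fusion: trust any sensor whose range covers the distance."""
    return any(s.sees_obstacle for s in sensors if s.range_ft >= distance_ft)

suite = [
    Sensor("forward camera", 500.0, False),
    Sensor("forward radar", 525.0, False),
] + [Sensor(f"ultrasonic {i}", 16.0, False) for i in range(12)]

# A truck 100 ft ahead is far beyond the ultrasonics' 16-ft range, so
# only the camera and radar can possibly report it.
print(obstacle_ahead(suite, distance_ft=100.0))   # False unless camera/radar fire
```

The point of the toy fusion rule: at highway distances, the short-range ultrasonics are out of the picture, so detecting something like a crossing truck falls entirely to the camera and radar. That matters for the crash analysis below.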

Tesla’s approach to autopilot is a lot like the rest of the auto industry’s: an incremental step toward fully driverless cars. In that respect, Tesla’s autopilot is similar to other automated features already in vehicles today, such as assisted parking and automatic collision avoidance. Tesla has described its autopilot as a kind of advanced cruise control that drivers can override whenever they want, and has said in the past that the feature is designed to make driving more comfortable "when conditions are clear."

Apparently, conditions weren’t clear this time. Here’s how Tesla said the crash occurred: As the truck turned left, crossing the Tesla’s path, neither the human nor the machine could distinguish the truck’s white body from the sky. As a result, the Model S never slowed down, punching through the gap between the truck’s wheels and getting crushed.

The system didn’t detect the truck, and we don’t know precisely why. Tesla said only in its blog post that "neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky," suggesting a lighting or other imaging issue may have prevented the computer from detecting the obstruction ahead. Indeed, Tesla’s owners’ manual highlights "bright light (oncoming headlights or direct sunlight)" as a factor that can affect the autopilot system. Here are some other things that can confuse the system, according to the manual; a rough illustration of how a system might screen for them follows the list:

- Poor visibility (due to heavy rain, snow, fog, etc.)
- Damage or obstructions caused by mud, ice, snow, etc.
- Interference or obstruction by objects mounted on the Model S (such as a bike rack or a sticker)
- Narrow or winding roads
- A damaged or misaligned bumper
- Interference from other equipment that generates ultrasonic waves
- Extremely hot or cold temperatures
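
To make the manual’s list concrete, here’s a toy sketch of how a driver-assist system might refuse to engage when such conditions are flagged. It is purely illustrative; Tesla’s actual checks aren’t public, and every name here is made up.

```python
# Toy illustration of gating a driver-assist feature on the manual's
# list of confounding conditions. Entirely hypothetical -- Tesla's
# real logic is not public.
DEGRADING_CONDITIONS = {
    "poor_visibility",         # heavy rain, snow, fog
    "sensor_obstruction",      # mud, ice, snow on the sensors
    "mounted_object",          # bike rack or sticker over a sensor
    "narrow_winding_road",
    "damaged_bumper",
    "ultrasonic_interference",
    "extreme_temperature",
}

def may_engage_autosteer(active_conditions: set[str]) -> bool:
    """Allow engagement only when no known confounder is present."""
    return not (active_conditions & DEGRADING_CONDITIONS)

print(may_engage_autosteer({"clear_weather"}))              # True
print(may_engage_autosteer({"poor_visibility", "night"}))   # False
```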

Even if the camera was defeated by the bright light, one wonders why the radar failed to detect the obstacle. Tesla did mention the "high ride height of the trailer," which might have kept the radar from registering it correctly if the system was tuned to look for objects closer to the ground.
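
One plausible mechanism, sketched below with invented numbers, is elevation filtering: to avoid slamming on the brakes for overpasses and overhead signs, a radar pipeline may discard returns above a certain height, and a high-riding trailer bed could land in that discarded band. This is a guess at the failure mode, not Tesla’s published logic.

```python
# Hypothetical elevation filter on radar returns. The height cutoff
# and structure are invented for illustration; Tesla has not published
# how its radar classifies returns.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float
    height_m: float   # estimated height of the reflecting surface

MAX_OBSTACLE_HEIGHT_M = 1.2   # made-up cutoff: above this, assume an
                              # overhead sign or bridge and ignore it

def braking_targets(returns: list[RadarReturn]) -> list[RadarReturn]:
    """Keep only returns low enough to be treated as obstacles."""
    return [r for r in returns if r.height_m <= MAX_OBSTACLE_HEIGHT_M]

# A trailer bed riding ~1.3 m off the ground would be filtered out,
# even though the gap beneath it is not survivable for a sedan.
trailer = RadarReturn(distance_m=60.0, height_m=1.3)
print(braking_targets([trailer]))   # [] -- no braking target reported
```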

But that raises questions about reaction time. What if you’re paying attention to the road but have no time to do anything about an impending accident? We do have some anecdotal cases of autopilot appearing to prevent crashes, so the system seems to do a better job than human drivers at least some of the time.

Statistically, Tesla’s autopilot may even be better than humans most of the time. Tesla claims this is the first time such a crash has happened in about 130 million total miles of autopilot driving. The United States suffers a death on the roads about once every 100 million vehicle miles traveled, according to the Insurance Institute for Highway Safety. By that crude comparison, cars driven exclusively by humans are associated with road deaths slightly more often. We’d need much more data on crashes involving partly autonomous vehicles to confirm this, but it’s a start.
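
Here’s the back-of-the-envelope math, using only the two figures cited above, plus a standard statistical caveat showing why a single event settles very little:

```python
# Back-of-the-envelope comparison of the two fatality rates cited above,
# with a standard Poisson interval to show why one event proves little.
AUTOPILOT_MILES = 130e6    # miles before the first known fatality (article)
HUMAN_RATE = 1 / 100e6     # ~1 death per 100 million vehicle miles (IIHS)

autopilot_rate = 1 / AUTOPILOT_MILES
print(f"autopilot: {autopilot_rate * 1e8:.2f} deaths per 100M miles")   # 0.77
print(f"human:     {HUMAN_RATE * 1e8:.2f} deaths per 100M miles")       # 1.00

# The exact 95% confidence bounds on a Poisson count of 1 observed event
# are about 0.025 to 5.57 events, so the plausible autopilot rate spans:
low, high = 0.0253 / 1.3, 5.572 / 1.3   # converted to per-100M-mile units
print(f"95% interval: {low:.2f} to {high:.2f} deaths per 100M miles")    # 0.02 to 4.29
```

Because that interval comfortably includes the human rate of 1.0, one fatality in 130 million miles can’t establish that autopilot is safer, which is exactly why more data is needed.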

We should be careful not to conflate Tesla’s autopilot with full self-driving capability. Tesla’s autopilot is markedly different from Google’s self-driving car, which uses not only radar and cameras but also laser sensors (lidar) and sophisticated map models to pinpoint the car’s exact location relative to the world around it.
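
A loose illustration of the map idea: if the car carries a map of landmarks and a sensor that reports the landmark pattern around it, matching the two pins down its position. The example below is a deliberately tiny invention, nothing like Google’s real pipeline.

```python
# Toy localization-by-map-matching, illustrating (very loosely) why a
# prebuilt map plus a ranging sensor can pin down position. Invented
# example; Google's actual system is far more sophisticated.
# The "map" marks landmark cells along a 1-D road; the "scan" is the
# pattern of landmarks the car currently sees around itself.
MAP  = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0]   # 1 = landmark (sign, wall, ...)
SCAN = [1, 1, 0]                         # what the sensor sees right now

def best_position(world: list[int], scan: list[int]) -> int:
    """Return the map offset where the scan matches best."""
    def score(offset: int) -> int:
        window = world[offset:offset + len(scan)]
        return sum(a == b for a, b in zip(window, scan))
    return max(range(len(world) - len(scan) + 1), key=score)

print(best_position(MAP, SCAN))   # 4 -- the only spot matching
                                  # "landmark, landmark, gap" exactly
```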

Google’s self-driving car would be an example of level 4 automation. Tesla’s autopilot falls somewhere lower on the scale, at level 2 or level 3, because it makes driving a little easier and can take over certain "safety-critical functions" from the human.
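
For reference, the scale in question is the five-level classification federal highway regulators laid out in their 2013 policy statement. Here it is as a compact summary, with the descriptions paraphrased:

```python
# NHTSA's 2013 automation levels, as referenced above (descriptions
# paraphrased from the agency's preliminary policy statement).
from enum import IntEnum

class AutomationLevel(IntEnum):
    NO_AUTOMATION = 0         # the driver controls everything at all times
    FUNCTION_SPECIFIC = 1     # a single automated function, e.g. cruise control
    COMBINED_FUNCTION = 2     # two or more functions work in unison, e.g.
                              # adaptive cruise plus lane centering
    LIMITED_SELF_DRIVING = 3  # the car drives itself under some conditions;
                              # the driver must be ready to take back control
    FULL_SELF_DRIVING = 4     # the car performs the entire trip unaided
                              # (Google's self-driving prototype)

tesla = AutomationLevel.COMBINED_FUNCTION
print(tesla.name, tesla.value)   # COMBINED_FUNCTION 2
```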

The task for policymakers, analysts say, is to square our legitimate technology jitters with the societal benefits that vehicle automation could bring, and that’s not going to be easy. The government recently declared that Google’s driverless car can be viewed as a driver in the eyes of the law, a move that will have repercussions for state governments, insurance companies and automakers. Federal highway officials are also devising policies for automated vehicles; some of that work can be viewed online.