Another week, another Tesla crashing into a stationary vehicle that just came out of nowhere.
But that's the thing. The electric car's semi-autonomous driver assistance feature, known as Autopilot, has this very situation — parked cars seemingly coming out of nowhere — written into the manual. It's a known limitation of the tool.
This week it was a Tesla driver in Laguna Beach, Calif., who hit a parked police car. Luckily, no one was seriously hurt (and no one was in the police vehicle), but it was another ding for Autopilot's reputation. It keeps crashing.
The National Transportation Safety Board won't be investigating this crash, but it's still looking into fatal self-driving crashes from earlier this year, including a Tesla Model X that hit a road barrier with Autopilot engaged.
This morning a Tesla sedan driving outbound Laguna Canyon Road in “autopilot” collides with a parked @LagunaBeachPD unit. Officer was not in the unit at the time of the crash and minor injuries were sustained to the Tesla driver. #lagunabeach #police #tesla pic.twitter.com/7sAs8VgVQ3— Laguna Beach PD PIO (@LBPD_PIO_45) May 29, 2018
This situation in Laguna Beach, while certainly scary, is exactly what Tesla lays out in its manual. Autopilot's cruise control tracks the vehicle ahead of the Tesla. So if that car suddenly moves out of the way for, say, a traffic accident up ahead, the automated guidance pretty much ends there, and the driver needs to snap to attention and take over.
Here's what the manual says about this driving situation:
"Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death."
The issue at play here isn't so much that the car is bad at detecting parked cars and stationary objects (that's a separate Autopilot shortfall), but that Tesla isn't making it clear to its drivers that the car can't handle or anticipate this fairly typical driving scenario.
In an emailed statement, a Tesla spokesperson defended Autopilot when used appropriately and reminded drivers that the onus is always on them, even when the semi-automated tool is engaged.
"When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and before a driver can use Autopilot, they must accept a dialogue box which states that ‘Autopilot is designed for use on highways that have a center divider and clear lane markings,’" the statement read.
Autopilot doesn't mean you can just sit back, relax, and take your hands, eyes, and mind off the road and wheel.
Some people think Autopilot is "more sophisticated than it really is" and some companies (and media outlets) oversell what these semi-autonomous features can do, University of Iowa engineering professor Daniel McGehee, director of the National Advanced Driving Simulator, told me in a conversation about eroding faith in self-driving cars last week. "So people tend to think this technology is here today," he said.
It's still very much partial assistance, with lane keeping, automatic braking, and other features that work only in certain environments. The warning system needs to improve if Autopilot is going to be a reliable driving tool. "We’re going to have to build safety measures that do more than shake the wheel" — or similar warnings about oncoming danger — said Jeremy Bennington, corporate strategy lead for transportation at test solutions company Spirent Communications. "We just can’t have these cars running into stationary emergency vehicles."
Bennington called for much more autonomous testing so that eventually "we can move where the driver really is out of the loop." But until then, Autopilot isn't fully autonomous, even if it's treated like it is. That's what happened when a driver in Utah was on her phone with her Tesla in Autopilot and hit a parked fire department vehicle earlier this month.
Back in January, a Tesla Model S crashed into a parked Culver City fire truck on a freeway in Southern California. Same situation: fast freeway driving with Autopilot on, when a truck responding to a traffic incident up ahead was parked and blocking the lane. Autopilot couldn't handle the situation quickly enough.
Consumer groups raised the limitations of Autopilot's capabilities in a letter to the Federal Trade Commission last week, calling Tesla's autonomous feature "dangerously misleading and deceptive." Instead of focusing on and educating drivers about its partial capabilities, Tesla makes Autopilot seem like a fully autonomous tool through marketing, advertising, company statements, and online content, the consumer advocates say.
That's not the first time Tesla's been under fire for pumping up expectations of Autopilot. Back in 2016, German regulators called out the company for suggesting the cars could drive themselves in Autopilot mode more than they actually could.
As with most new technology, our expectations are too high, says Washington University in St. Louis engineering professor Sanjoy Baruah. "Users are still trying to get a feel for what it’s supposed to be doing for us," he said of Autopilot and other self-driving tools. And while Autopilot may not be able to handle this basic driving scenario with emergency vehicles blocking the road, Baruah sees how automated tech can be a life-saver for sleepy, distracted, or inebriated drivers. It's a balancing act that we'll eventually get the hang of — and the technology will improve, too. "It's new things we are learning to come to terms with," Baruah said.