Investigators reveal Tesla didn’t respond to safety recommendations as Autopilot, driver distraction blamed for fatal crash
The investigators of a fatal crash have revealed Tesla didn’t respond to recommendations that could have prevented it.
An Apple engineer who died after his Tesla crashed in Autopilot mode had just dropped his young son off at preschool and was playing a video game at the wheel, according to the US government agency investigating the crash.
Walter Huang died when his Tesla Model X slammed into a concrete barrier in Mountain View, California.
He was making the 40-minute trip to his job at Apple when the crash occurred.
Mr Huang was using the car’s driver assistance feature Autopilot, which maintains lane position and speed, but is not a fully autonomous driving feature.
Tesla said it makes this clear to customers before enabling the feature and requires them to regularly have their hands on the wheel.
Autopilot on Mr Huang’s Model X had reportedly swerved towards the same concrete barrier that eventually killed him several times before the fatal crash, something he had mentioned to friends and family.
On the day of his death the car swerved again, speeding up to 115km/h before slamming into the barrier.
The automatic emergency braking didn’t activate, and Mr Huang didn’t apply the brakes or steer away from the barrier before the collision.
The barrier was also missing a crash cushion that had been removed after a separate collision weeks earlier, which might have saved Mr Huang’s life had it been replaced in time.
The US National Transportation Safety Board (NTSB) has been investigating the March 2018 crash and delivered its official findings on Tuesday.
Chairman Robert Sumwalt said the investigation made it clear that “self-driving” features on cars don’t mean drivers can stop paying attention to the road.
“If you own a car with partial automation, you do not own a self-driving car,” Mr Sumwalt said in his opening statements.
“This means that when driving in the supposed ‘self-driving’ mode, you can’t read a book, you can’t watch a movie or TV show, you can’t text and you can’t play video games.”
But the Board didn’t let Tesla entirely off the hook.
NTSB staff determined that Tesla’s system doesn’t make sure drivers are paying attention and recommended stronger driver monitoring systems be required.
Mr Sumwalt said the Board had made recommendations to six automakers in 2017 to address the problem.
Five of the automakers responded. Tesla did not.
Tesla’s Autopilot senses when you apply force to the steering wheel, and if that doesn’t happen, it will flash warnings and beep at you until you put your hands back on the wheel.
But monitoring steering wheel torque “is a poor surrogate measure” of monitoring the driver, the NTSB’s human performance and automation highway safety expert Ensar Becic told the Board.
Mr Huang’s habit of mobile gaming also didn’t escape blame for the crash.
Mr Sumwalt said the NTSB had been asking for technology to stop drivers using their phones at the wheel for almost a decade, but no action had been taken.
Apple’s iPhone and Google’s Pixel smartphones have features to stop you getting distracted while driving, though the feature is yet to roll out more widely to other Android phones.
The NTSB wants to see smartphone makers go further.
The Board’s project manager for highway safety Don Karol wants mobile phone companies to program phones to automatically lock out distracting functions such as games and phone calls while someone is driving.
The Board also recommended that companies enact policies to prevent the use of company-issued phones while workers are at the wheel.
“Lockout mechanisms should be the default setting and should automatically lock out distracting functions,” Mr Karol told the Board.
The Board is only able to make recommendations; the National Highway Traffic Safety Administration (NHTSA) is in charge of regulations.
NTSB director of highway safety Robert Molloy said the NHTSA is taking a “misguided”, hands-off approach to automated driving systems and “they need to do more”.
The NHTSA said it’s investigating 14 Tesla crashes and will review the NTSB report.
“Distraction-affected crashes are a major concern, including those involving advanced driver assistance features,” the NHTSA said in a statement.
Tesla has been contacted for comment.
– with AP