Robin Geoulla had doubts about the automated driving technology equipped on his Tesla Model S when he bought the electric car in 2017.
“It was a little scary to, you know, rely on it and just, you know, sit back and let it drive,” he told a US investigator, describing his initial feelings about Tesla’s Autopilot system.
Geoulla made the remarks to the investigator in January 2018, days after his Tesla, with Autopilot engaged, slammed into the back of an empty fire truck parked on a California interstate highway. Reuters could not reach him for additional comment.
Over time, Geoulla’s initial doubts about Autopilot softened, and he found it reliable when tracking a vehicle in front of him. But he noticed that the system sometimes seemed confused when facing direct sunlight or when a vehicle in front of him changed lanes, according to a transcript of his interview with a National Transportation Safety Board (NTSB) investigator.
He told investigators that he was driving into the sun before he rear-ended the fire truck.
The NTSB found that Autopilot’s design allowed Geoulla to disengage from driving during his trip, and that his hands were off the wheel for the entire period of approximately 30 minutes in which the technology was activated.
The US agency, which makes recommendations but lacks enforcement powers, has previously urged regulators at the National Highway Traffic Safety Administration (NHTSA) to examine Autopilot’s limits, the potential for driver misuse and possible safety risks following a series of accidents involving the technology, some of them fatal.
“The past has shown the focus has been on innovation over safety and I’m hoping we’re at the point where that tide is turning,” Jennifer Homendy, the new NTSB chair, told Reuters in an interview. She said there is no comparison between Tesla’s Autopilot and the more rigorous autopilot systems used in aviation, which involve trained pilots, rules addressing fatigue, and testing for drugs and alcohol.
Tesla did not respond to written questions for this story.
Autopilot is an advanced driver-assistance feature whose current version does not make vehicles autonomous, the company says on its website. Tesla says drivers must agree to keep their hands on the wheel and maintain control of their vehicles before enabling the system.
Limited Visibility
Geoulla’s 2018 crash is one of 12 accidents involving Autopilot that NHTSA officials are investigating as part of the agency’s most far-reaching probe since Tesla Inc. introduced the semi-autonomous driving system in 2015.
Most of the accidents under investigation occurred after dark or in conditions creating limited visibility, such as glaring sunlight, according to an NHTSA statement, NTSB documents and police reports reviewed by Reuters. That raises questions about Autopilot’s capabilities during challenging driving conditions, autonomous-driving experts say.
“NHTSA’s enforcement and defect authority is broad, and we will take action when we detect an unreasonable risk to public safety,” an NHTSA spokesperson said in a statement to Reuters.
Separately, since 2016, US auto safety regulators have sent 33 special crash investigation teams to review Tesla accidents involving 11 deaths in which advanced driver assistance systems were suspected of being in use. NHTSA has ruled out Autopilot use in three of those crashes, which were nonfatal.
The current NHTSA investigation of Autopilot reopens the question of whether the technology is safe. It represents the latest significant challenge for Tesla Chief Executive Elon Musk, whose advocacy for driverless cars has helped his company become the world’s most valuable automaker.
Tesla charges customers up to $10,000 for advanced driver assistance features like changing lanes, with the promise of eventually giving its cars autonomous driving capability using only cameras and advanced software. Other carmakers and self-driving firms use not only cameras in their current and upcoming vehicles but more expensive hardware, including radar and lidar.
Musk has said that a Tesla with eight cameras will be far safer than human drivers. But camera technology is affected by darkness and sun glare as well as inclement weather conditions such as heavy rain, snow and fog, experts and industry executives say.
“Today’s computer approach is not perfect and will not be for the foreseeable future,” said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University.
In the first known fatal US crash involving Tesla’s semi-autonomous driving technology, which occurred west of Williston, Florida, in 2016, the company said both the driver and Autopilot failed to see the white side of a tractor trailer against a brightly lit sky. Instead of braking, the Tesla collided with the 18-wheel truck.
Driver Abuse, Failed Braking
In January 2017, NHTSA closed the Autopilot investigation stemming from that fatal accident, finding no defect in Autopilot’s performance after some contentious exchanges with Tesla executives, according to documents reviewed by Reuters.
In December 2016, as part of that investigation, the agency asked Tesla to provide details on the company’s response to any internal safety concerns raised about Autopilot, including the potential for driver misuse or abuse, according to a special order sent by regulators to the automaker.
After an NHTSA lawyer found Tesla’s initial response lacking, Tesla’s then-general counsel, Todd Maron, tried again. He told regulators that the request was overly broad and that it would be impossible to catalog all the concerns raised during Autopilot’s development, according to correspondence reviewed by Reuters.
Still, Tesla wanted to cooperate, Maron told regulators. During Autopilot’s development, company employees or contractors had raised concerns, which Tesla addressed, about the potential for unintended or failed braking and acceleration; undesired or failed steering; and certain kinds of misuse and abuse by drivers, Maron said, without providing further details.
Maron did not respond to messages seeking comment.
It is not clear how regulators reacted. A former US official said Tesla generally cooperated with the investigation and promptly produced the requested material. Regulators closed the investigation just before former US President Donald Trump’s inauguration, finding that Autopilot performed as designed and that Tesla had taken steps to prevent it from being misused.
Leadership Vacuum at NHTSA
The NHTSA has been without a Senate-confirmed head for nearly five years. President Joe Biden has not yet nominated anyone to run the agency.
NHTSA documents state that regulators want to know how Tesla vehicles attempt to see flashing lights on emergency vehicles, or detect the presence of fire trucks, ambulances and police cars in their path. The agency has also sought similar information from 12 rival automakers.
“Tesla has been asked to produce and verify the data, as well as interpret that data. NHTSA will conduct its own independent verification and analysis of all information,” NHTSA told Reuters.
Electric-car pioneer Musk has fought hard to protect Autopilot from critics and regulators. Tesla has used Autopilot’s ability to update vehicle software over the air to sidestep the traditional vehicle-recall process.
Musk has repeatedly promoted Autopilot’s capabilities, sometimes in ways that critics say mislead customers into believing Teslas can drive themselves, despite warnings to the contrary in owners’ manuals that tell drivers to remain attentive and that outline the technology’s limits.
Musk continues to launch beta — or unfinished — versions of the “full self-driving” system via over-the-air software upgrades.
“Some manufacturers are going to do what they want to sell cars and it’s up to the government to rein in that,” said NTSB’s Homendy.