The National Highway Traffic Safety Administration (NHTSA) has joined auto industry safety advocates and lawmakers in investigating the misuse of advanced driver-assistance technology. About three dozen crashes involving cars with these systems are under review, the first of which was a Tesla crash in June 2016.
The National Transportation Safety Board (NTSB) went so far as to accuse the NHTSA of contributing to the accidents because it failed to ensure automakers used safeguards to limit the use of electronic driving systems. The federal safety agency told automakers they must begin reporting and tracking crashes involving driver-assistance systems such as General Motors’ Super Cruise and Tesla’s Autopilot. The analysis covers an estimated 765,000 vehicles produced since 2014.
Keeping Technology in Check
The investigative action is a welcome sign that the safety implications of these systems are being taken more seriously. “Today’s action by NHTSA is a positive step forward for safety,” NTSB Chair Jennifer L. Homendy said in a statement on Monday, August 16. “As we navigate the emerging world of advanced driving assistance systems, it’s important that NHTSA has insight into what these vehicles can, and cannot, do.”
The order requires all carmakers to report serious accidents within one day, starting in June. It covers incidents that result in a fatality, a hospital visit, deployed airbags, or the vehicle being towed away. The NHTSA will also begin establishing performance standards for advanced driver-assistance systems (ADAS), including:
- Adaptive Cruise Control (ACC)
- Lane Keeping Assist
- Blind Spot Monitoring
- Automated Emergency Braking
- Pedestrian Detection
- Hands-Free Systems
- Driver Engagement
Autopilot Does Not Equal Autonomous Driving
Advanced driver-assist systems are designed to assist the human driver. Unfortunately, many people let their attention wander when using a system that functions correctly most of the time. It seems that adding a caveat that drivers must remain alert while their cars are in motion is not enough, and a more sophisticated way of ensuring driver attention is needed.
For now, the driver is still responsible for controlling the vehicle, even if their hands are not on the steering wheel. The new NHTSA investigation is an opportunity to examine the design of these systems and to require more testing and validation before beta versions are deployed. Because these vehicles share the road with everyone else, driver inattention and overreliance need to be addressed before they make the problem worse.
Lawmakers and Safety Advocates Focus on Tesla
The NHTSA found 11 crashes since 2018 involving Teslas operating on Autopilot and emergency vehicles. In January 2018, a Tesla crashed into a parked fire truck that was attending to a different accident. Although the rescue vehicle used flashing lights and flares, Tesla’s system did not detect the fire truck. The investigation will cover Tesla’s complete lineup going back to 2014, including the Model S, X, 3, and Y.
“NHTSA should get to the bottom of the issue as quickly as it can and demonstrate it will hold Tesla accountable if the company won’t put people’s safety first on its own,” says William Wallace, manager of safety policy at Consumer Reports.
Most of the incidents involved Teslas, and the crashes under review resulted in ten deaths in total; one of those fatalities was a pedestrian struck by an Uber test vehicle. If the examination finds that a safety defect was not reported promptly, it could lead to product recalls and fines for automakers.