The United States’ National Highway Traffic Safety Administration (NHTSA) is investigating issues with Tesla’s Autopilot that may have led to a series of crashes involving emergency vehicles. The investigation covers 765,000 vehicles sold in the United States since 2014.
The NHTSA says that since 2018 there have been 11 crashes involving emergency vehicles and Tesla vehicles that had Autopilot or Traffic Aware Cruise Control activated; the crashes injured 17 people and killed one. According to the agency, the emergency vehicles were using hazard warnings such as flashing lights, flares, illuminated arrow boards, or cones.
A recent update to Tesla vehicles’ onboard software makes Tesla Vision better at recognizing emergency lights, hazard lights, and (according to one owner of an Autopilot-equipped vehicle who noticed the change) brake lights.
Tesla CEO Elon Musk recently said that Tesla prioritized work on the first version of Autopilot after a wreck in which a driver who admitted to falling asleep at the wheel killed a cyclist. Autopilot has since demonstrated that it can handle a driver falling asleep or losing consciousness by pulling the vehicle over to the side of the road. (It probably didn’t hurt that Tesla recently activated an interior camera to track a driver’s alertness.)
There have still been occasional safety issues involving Autopilot. Tesla recently had to recall nearly all of the Autopilot-equipped vehicles it had sold in China to fix an issue with drivers accidentally activating cruise control.
However, some Tesla watchers have accused the media of blaming Autopilot and the Full Self-Driving software for every accident involving a Tesla vehicle for the sake of views. Tesla maintains that these programs are meant to assist a driver rather than fully replace one (though it is definitely working toward full autonomy). Autopilot was initially blamed for fatal wrecks like a recent one in Texas, where Tesla’s logs later revealed that the software had not been engaged at the time.
Despite the challenges involved in both developing and promoting fully autonomous vehicles, Tesla maintains that Autopilot and Full Self-Driving will save lives once they become fully functional and their use on the road is normalized. According to a report issued by the NHTSA, driver error is a major factor in over 90% of vehicle wrecks. Once the software for fully autonomous vehicles matures and sees wide use, it could drastically cut down on wrecks and fatalities.
The Biden Administration still seems intent on taking a harder line with companies like Tesla that are developing driver-assist programs, however. Some observers have suggested that Biden plans to use the NHTSA to crack down on Tesla, especially after the company didn’t receive an invitation from his administration to a recent meeting with automakers that manufacture electric vehicles or plan to.
The National Transportation Safety Board (NTSB), which serves as a “watchdog” for transportation regulation, partially blamed lax regulatory enforcement for some crashes in which Autopilot was activated. Although the NHTSA may have let some things slide in the past, that hasn’t stopped Musk from occasionally grumbling about regulators slowing down the development of his companies’ products. The investigation may lead to more recalls, but it could also end with the NHTSA concluding that the issue has been fixed by the recent Autopilot updates that make it better able to recognize emergency vehicles.