In response to an NHTSA investigation of crashes involving Tesla vehicles and emergency vehicles, two U.S. Senators have asked the FTC to examine Tesla’s claims about Autopilot and Full Self-Driving (FSD). They want the FTC to determine whether the naming and marketing of Tesla’s driver-assist programs amount to deceptive marketing practices.
The move was sparked by incidents in which Autopilot failed to recognize the presence of an emergency vehicle, which may have contributed to crashes. The NHTSA says there have been 11 reported crashes since 2018 involving an emergency vehicle and a Tesla with Autopilot or Traffic-Aware Cruise Control engaged.
Recent updates to Tesla Vision include the ability to recognize an emergency vehicle with its lights activated. One Tesla owner reports that, since the update, his vehicle can recognize when other vehicles are stopped with their brake lights on. However, these updates alone are unlikely to satisfy regulators and lawmakers.
“We fear that Tesla’s Autopilot and FSD features are not as mature and reliable as the company pitches to the public. Tesla drivers listen to these claims and believe their vehicles are equipped to drive themselves — with potentially deadly consequence,” Senators Richard Blumenthal (D-CT) and Ed Markey (D-MA) said in a letter to FTC chair Lina Khan.
The full text of the letter was published on Senator Blumenthal’s page on the Senate.gov website.
Private consumer watchdogs like Consumer Reports have criticized Tesla’s driver-assist programs, though Consumer Reports did restore the Model 3’s “Top Pick” rating after careful safety testing. The organization had previously expressed concern about Tesla’s decision to drop radar in favor of a purely camera-based vision system for its less expensive models, and the restored rating does not amount to a free pass for Autopilot. In a report published in May, Consumer Reports warned that consumers “should be wary” of claims made by Tesla about the capabilities of its driver-assist programs.
Tesla’s website carries a disclaimer that Autopilot and FSD are not yet capable of fully autonomous driving and that drivers should remain alert. Autopilot has recently demonstrated that it can act when a driver is not alert, pulling over to the side of the road after a driver lost consciousness. Shortly after that incident, Elon Musk said he had Tesla prioritize work on Autopilot after an earlier Tesla vehicle, one without Autopilot, hit a cyclist when its driver fell asleep at the wheel.
According to the NHTSA’s own data, driver error is a critical factor in more than 90% of vehicle crashes. If Tesla or a similar company can develop fully autonomous vehicles and they become widely adopted, driver error could be taken out of the picture.
Going by the data alone, it would be easy to conclude that regulators and lawmakers are simply picking on Tesla, especially after President Biden declined to invite the company to a summit on the future of electric vehicles. On the other hand, some software developers working for Tesla have acknowledged in communications with California’s DMV that statements from Tesla, and especially from CEO Elon Musk, overstated the software’s capabilities. Tesla shows no sign of giving up on its AI-backed driver-assist programs, however, and continues working to expand their ability to handle most situations on the road.