In the wake of a fatal crash in Texas in which Tesla’s Autopilot was initially blamed, and a few reported incidents of California Tesla owners riding in the back seat of a driverless vehicle, Tesla has activated an interior camera that can track driver alertness while Autopilot is active.
According to data from the National Highway Traffic Safety Administration (NHTSA), Tesla’s Autopilot has been involved in three fatal crashes since 2016. Tesla says the move is being made for safety reasons and is not meant to be an invasion of privacy. In a release note to owners, the company said:
“The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled.”
Some privacy activists may call this an unwarranted invasion of privacy, even though Tesla claims that it doesn’t intend to transmit data from the cabin camera. The company says it simply intends to address the safety issues made obvious by people engaging in dangerous stunts like riding in the back seat of a driverless Tesla with Autopilot activated.
On the other hand, the cameras have been used as part of “Sentry Mode” to capture evidence of vandalism or theft targeting Tesla vehicles. Tesla recently patched a security flaw involving the key fob that could have enabled the theft of a vehicle in only a few minutes. Onboard cameras also helped crack a recent case in Springfield, Missouri, in which a suspect in an arson and hate crime case was caught on camera attempting to remove a Tesla vehicle’s tires.
Tesla’s website includes a disclaimer that its self-driving software is not yet ready for full autonomy and that drivers should remain alert and in control at all times. California’s DMV backed that up with an internal memo saying that CEO Elon Musk has overstated the capabilities of Tesla’s Full Self-Driving software, after anonymous engineers reported to the agency that the company may be exaggerating the software’s capabilities and rate of development. Musk has said his goal is to bring the software out of beta by the end of the year, though that may prove to be another of his characteristically unrealistic deadlines.
Musk had previously dismissed eye tracking as “ineffective” for gauging alertness; Autopilot instead relied on detecting that a driver was in the driver’s seat with hands on the wheel. Now the notes for the most recent update say that the cabin camera mounted above the rearview mirror will “detect and alert driver inattentiveness while Autopilot is engaged.”
Tesla appears to be leaning ever more heavily on cameras despite recent actions by the Chinese government, which banned Tesla vehicles from government and military facilities over concerns that the cameras could capture footage of sensitive activities. The company recently ended the use of radar in Model 3 and Model Y vehicles in North America in favor of a camera-only system. Since the change, Consumer Reports has removed the Model 3’s “Top Pick” status for safety, and the Insurance Institute for Highway Safety has withdrawn its Top Safety Pick+ award.
Meanwhile, a Tesla with manufacturer’s plates was spotted on Florida streets fitted with the lidar system that Elon Musk once dismissed as a “crutch.” This appears to be part of a deal between lidar maker Luminar and Tesla to test the technology for Tesla’s driver-assist programs. It remains to be seen whether Tesla intends to integrate lidar into vehicles equipped with Autopilot or Full Self-Driving software; the sensor does take up space on the roof, which Tesla owners who haul kayaks or camping gear up there might not like too much.