Tesla vehicles have been involved in fatal crashes since 2016 while Autopilot was engaged. The crashes have raised concerns about the driver assistance system's capabilities, and concern is rising globally about systems that can handle driving for extended periods with little or no human intervention.
Senator Markey has called Autopilot "an inherently misleading name"
U.S. Senator Edward Markey issued a press release raising concerns over Tesla's driver assistance system Autopilot and called on the automaker to rebrand it, citing the term as "an inherently misleading name". Markey's press release included a copy of a December 20 statement from Tesla addressing some of the Democratic senator's concerns.
Markey sits on the Commerce, Science, and Transportation Committee.
The senator took issue with the name of the driver assistance system, saying that calling it Autopilot encourages users to over-rely on the technology and take their hands off the wheel completely. Markey is urging Tesla to remarket Autopilot and make clear that it is a driver assistance system, not an autonomous driving capability.
“Rebranding and remarketing the system to reduce misuse, as well as building back up driver monitoring tools that will make sure no one falls asleep at the wheel,” he said.
Markey also pointed out that the system's safeguards, which include safety alerts and an automatic shutoff feature that disengages Autopilot when the driver is inattentive, can be circumvented by users. He asked Tesla to build a backup monitoring system, arguing that if the system can be tricked, there should be more redundancy built in.
“I have been proud to work with Tesla on advancing cleaner, more sustainable transportation technologies. But these achievements should not come at the expense of safety. That’s why I’m calling on Tesla to use its resources and expertise to better protect drivers, passengers, pedestrians, and all other users of the road. I urge Tesla to adopt my commonsense recommendations for fixing Autopilot, which include rebranding and remarketing the system to reduce misuse, as well as building back up driver monitoring tools that will make sure no one falls asleep at the wheel. Tesla can and must do more to guarantee the safety of its technology.”
- U.S. Senator Edward Markey.
Tesla Defends Its Autopilot
Tesla, in a response letter, rebutted the senator's claims, saying that the company has taken steps to enhance its safety features. The automaker had introduced new warnings for red lights and stop signs to lessen the risk of running red lights as a result of temporary driver inattention.
In the letter, Tesla said it has revised the steering wheel monitoring, which means in most cases “a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver.”
"Devices marketed to trick Autopilot may be able to trick the system for a short time, but generally not for an entire trip before Autopilot disengages," it said.
Responding to the videos cited by Senator Markey, Tesla said it believes many of them are fake.
"While some online videos show that there are a few bad actors who are grossly abusing Autopilot, these represent a very small percentage of our customer base. We believe that many of these videos are fake and intended to capture media attention. Nonetheless, we continually monitor for and review these videos and correlate fleet data to determine whether we can eliminate actions that lead to irresponsible and unsafe driving."
- Alexandra N. Veitch, Senior Director, Government Relations & Policy at Tesla
The U.S. National Highway Traffic Safety Administration (NHTSA) has launched an investigation into another Tesla crash suspected to have occurred due to Tesla's Autopilot or another advanced driver assistance system. The NHTSA is already probing the Model S crash that took place in Gardena, California on December 29. That crash occurred after the Model S ran a red light and hit a Honda Civic, killing two people.