American electric car manufacturing giant Tesla has been under fire over its Autopilot feature, which has been linked to a number of accidents and remains a controversial technology. Recently, Tesla submitted new data to the National Highway Traffic Safety Administration (NHTSA) as part of a report covering the vast majority of crashes involving advanced driver assistance systems (ADAS). According to NHTSA, Tesla accounts for 273 such crashes between July 2021 and May 15 of this year, and a good number of them are linked to Autopilot. However, Tesla executives maintain that the feature prevents accidents.
Tesla claims autopilot can prevent 40 crashes daily
Tesla's Autopilot software director, Ashok Elluswamy, claims that the Autopilot driving system can prevent about 40 accidents every day. The accidents he refers to are those caused by sudden unintended acceleration (SUA). In roughly 40 incidents a day, he said, human drivers mistakenly press the accelerator instead of the brakes. Autopilot recognizes that a collision is imminent, automatically cuts the acceleration, and applies the brakes to prevent a crash.
Autopilot, Tesla's branded driver assistance technology, works in essentially the same way as systems from rivals such as GM's Super Cruise and Ford's BlueCruise. The system has been blamed for past crashes involving Tesla vehicles. Back in June, the NHTSA stepped up its investigation into whether Autopilot is flawed. The agency also revealed that it had reviewed no fewer than 191 vehicle accidents involving the use of Autopilot.
In reality, the system is only partially automated, handling tasks such as keeping the car in its lane and maintaining a safe distance from the vehicle ahead. It is meant only to assist the driver, who must be ready to intervene at all times. The system is also under investigation in other countries, including Germany.