The testimony could be used in a lawsuit against Tesla.
Back in 2016, Tesla posted a video touting the "fully autonomous" capabilities of its vehicles. But in recent testimony, the head of Tesla's Autopilot software claimed that this promotional video was staged: Tesla simply pre-programmed the car's path, and humans stepped in whenever things went wrong.
As reported by Reuters, this testimony was given by Ashok Elluswamy during a July deposition for a lawsuit against Tesla. The lawsuit stems from a 2018 car accident in which a Model X steered into a traffic barrier while Autopilot was enabled, killing the driver.
We are not legal experts, nor do we know how this testimony will be used in court. Still, the fact that Tesla faked this 2016 self-driving video is troubling. Tesla misled the public by advertising an Autopilot capability that did not exist, and the staged nature of the video has been kept under wraps for half a decade, hopefully without affecting public policy or the regulation of autonomous vehicles.
According to Elluswamy's testimony, Tesla cars were unable to stop at red lights when this video was made. He also stated that "the intent of the video was not to accurately portray what was available for customers in 2016." That may be true, and Tesla's website explicitly states that Enhanced Autopilot and Full Self-Driving features "require active driver supervision and do not make the vehicle autonomous."
But Tesla didn't treat this video as a proof of concept. In fact, Elon Musk shared the video on Twitter as evidence that "Tesla drives itself." He even gushed that the Tesla automatically recognized and avoided handicapped parking spaces. (Musk has a long history of overstating Full Self-Driving capabilities; he previously claimed that it would perform "at a safety level substantially greater than that of the average driver" by the end of 2021, for example.)
In response to the 2018 accident, Tesla published a blog post claiming that the driver "had received several visual and one audible hands-on warning earlier in the drive and the driver's hands were not detected on the wheel for six seconds prior to the collision." The automaker also acknowledged that "Tesla Autopilot does not prevent all accidents," though it discouraged lawmakers from cracking down on self-driving systems, because Autopilot "unequivocally makes the world safer for the vehicle occupants."