
Tesla says Elon Musk’s statements on self-driving ‘might have been deep fakes’ in bizarre defense

Tesla has decided to use a bizarre defense in a lawsuit brought by the family of a Tesla owner who died in an accident while using Autopilot a few years ago.

The automaker claimed that CEO Elon Musk shouldn’t be made available to explain some of his statements on self-driving because some of the public comments might have been “deep fakes.”

The lawsuit revolves around the death of Walter Huang, an Apple engineer who died in his Tesla Model X while driving to work in 2018.

As we previously reported, the Model X was driving on Autopilot when it entered the median of a highway ramp as if it were a lane and hit a barrier about 150 meters after entering the median.

The impact was quite severe because the crash attenuator had already been destroyed in a previous accident. The driver was rushed to the hospital, but he died of his injuries.

NHTSA investigated the accident and confirmed that the vehicle was using Autopilot at the time of the crash, but it blamed the crash on the driver, who according to phone data was playing a video game at the time, and on the missing crash attenuator.

Tesla instructs drivers to always pay attention and be ready to take control when using Autopilot.

The Huang family decided to sue anyway, arguing that some of Tesla’s statements, and more specifically some of CEO Elon Musk’s comments about Autopilot and self-driving, led Huang to believe he could use Autopilot in the manner that resulted in the crash.

The lawsuit is set to go to trial in Santa Clara County Superior Court this year, but Tesla has tried to keep Musk and his statements out of the case with a quite bizarre defense.

The automaker is claiming that some of the statements that Musk is believed to have made might have been “deep fakes,” and therefore he shouldn’t be questioned on them.

Deep fakes are generally synthetic media that have been digitally manipulated to convincingly replace one person’s likeness with another’s, but the term is also used for CGI videos made to make someone appear to say something they never actually said.

Judge Evette D. Pennypacker didn’t buy the argument. She said in her judgment (via The Telegraph):

Their position is that because Mr Musk is famous and might be more of a target for deep fakes, his public statements are immune. In other words, Mr Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do.

She has ruled that Musk should be made available for an interview of up to three hours to discuss his statements about Tesla Autopilot and Full Self-Driving.

Electrek’s Take

That’s bizarre. If Tesla believes some of the statements are deep fakes, it should identify exactly which ones and try to prove it. The mere capability to create deep fakes certainly doesn’t make anyone immune to scrutiny of their statements.


Also, it’s not as if we don’t know for a fact that Musk has made some fairly ambitious statements about Tesla Autopilot and Full Self-Driving.

Is he now going to claim that he never said, three years ago, that Tesla would have 1 million robotaxis on the road by the end of the year? Was that a deep fake? Was it also a deep fake when he said it again the next year? It’s ridiculous and worrying that Tesla would try such a defense. I guess that’s Tesla’s new “hardcore litigation team” at work.

However, in this case, the Huang family is facing an uphill battle because, despite Musk’s comments about what he believes Tesla could achieve with self-driving in the future, Tesla has always been clear about how drivers should use Autopilot.

Every time Autopilot is activated, the car tells the driver to keep their hands on the steering wheel and be ready to take control at all times. The data indicates that Huang was playing a video game, not paying attention, and had plenty of time to react after the car entered the median and before it hit the barrier. He was clearly not using Autopilot as intended.




Author

Fred Lambert

Fred is the Editor in Chief and Main Writer at Electrek.
