Houston driver sues Tesla after Cybertruck on Autopilot crashes into overpass

March 13, 2026

HOUSTON – A Houston driver is suing Tesla after a crash involving a Cybertruck that she says suddenly veered toward the edge of an overpass while operating in Autopilot mode.

According to a lawsuit filed in Harris County district court, Justine Saint Amour was driving her Tesla Cybertruck on Aug. 18, 2025, along the Eastex Freeway when she says the vehicle’s driver-assist system malfunctioned.

The petition claims the vehicle attempted to drive straight off an overpass while Autopilot was engaged, forcing Saint Amour to try to regain control before crashing into a barrier.

The lawsuit says Saint Amour suffered serious injuries in the crash, primarily affecting her shoulder, neck and back.

Attorneys representing Saint Amour argue the crash highlights broader safety concerns surrounding Tesla’s driver-assistance technology and how it is marketed to consumers.

“This company wants drivers to believe and trust their life on a lie: that the vehicle can self-drive and that it can do so safely,” attorney Bob Hilliard said in a statement provided by his firm. “It can’t and it doesn’t.”

Claims about Tesla’s driver-assist system

The lawsuit alleges Tesla sells its “Full Self-Driving” package as a premium feature even though the system remains SAE Level 2 automation, meaning drivers must remain fully attentive and ready to take control of the vehicle at all times.

Level 2 systems are classified as advanced driver-assistance technology, not autonomous driving, according to the Society of Automotive Engineers’ widely used automation scale.

Tesla has repeatedly defended its driver-assist technology by pointing to its own safety data.

In its quarterly Vehicle Safety Report, the company says vehicles operating with Autopilot engaged recorded one crash for every 7.63 million miles driven.

By comparison, Tesla reports one crash every 955,000 miles when its vehicles are driven without Autopilot, and the U.S. national average is about one crash every 670,000 miles, according to federal data cited in the report.

Tesla says the figures show vehicles using Autopilot crash significantly less often per mile driven than typical U.S. drivers.

Tesla calculates those numbers based on crashes per mile driven rather than per trip and includes incidents involving airbag deployment or structural damage.

Critics have argued the comparison may not fully reflect real-world conditions because Autopilot is typically used on highways, which are statistically safer than city streets.

Saint Amour’s attorneys claim Tesla’s branding creates the impression that vehicles can operate independently when they cannot.

The suit also alleges Tesla made design decisions that reduced safety, including relying primarily on camera systems rather than additional sensor technology and lacking sufficient safeguards to ensure drivers remain alert.

In addition, the petition claims Tesla vehicles lack adequate backup safety systems to override automated driving functions if the software fails.

Federal scrutiny of Autopilot

Tesla’s driver-assist technology has faced increasing scrutiny from federal regulators.

In December 2023, Tesla recalled more than 2 million vehicles in the United States to address concerns that drivers could misuse the Autopilot system if they were not adequately monitored, according to the National Highway Traffic Safety Administration.

The recall involved a software update designed to add stronger driver alerts and monitoring.

Federal regulators are still reviewing whether that fix fully addresses the safety concerns.

Federal regulators have also been tracking crashes involving Tesla’s driver-assist systems.

Data collected by the National Highway Traffic Safety Administration under its automated driving crash reporting program identified 736 crashes involving Tesla vehicles operating with Autopilot or Full Self-Driving systems, including 17 deaths, through April 2024.

The agency has also opened investigations into crashes involving Teslas using Autopilot that collided with emergency vehicles.

What the lawsuit argues

Saint Amour’s attorneys argue the crash was not random but instead the result of decisions Tesla made about how the system was designed and marketed.

“What happened to Justine wasn’t an accident,” Hilliard said. “It was the foreseeable result of choices Tesla made knowingly, repeatedly, and without regard for the people on the road.”

The lawsuit seeks damages related to Saint Amour’s injuries and alleges Tesla failed to adequately warn drivers about the risks associated with its automated driving features.

Tesla response

As of publication, Tesla had not responded to requests for comment about the lawsuit.

Tesla has previously said its driver-assistance systems are designed to make driving safer but require drivers to remain attentive and ready to intervene at all times.

Debate about Autopilot-type technology

The case adds to a growing debate about how advanced driver-assistance technology should be marketed to the public.

Transportation safety officials have warned that terms such as “self-driving” can create confusion among drivers about the capabilities of current systems, which still rely heavily on human oversight.

Independent safety researchers have also studied advanced driver-assist technology.

In its evaluations of partially automated driving systems, the Insurance Institute for Highway Safety found Tesla's system performed well in crash avoidance but said its driver-monitoring safeguards were weaker than some competing systems.
