Tesla Autopilot Under Fire: New Lawsuit Blames System for Deadly New Jersey Crash
The promise of self-driving cars has captivated the public imagination for years, but the reality of autonomous vehicle technology is proving to be far more complex and fraught with peril. Tesla, a leading innovator in the electric vehicle and autonomous driving space, is facing increasing scrutiny over its Autopilot system. A recent deadly crash in New Jersey has ignited fresh concerns, resulting in a new lawsuit that could have significant implications for the future of self-driving technology and liability.
The New Jersey Tragedy: A Family Lost
On September 14, 2024, a Tesla Model S veered off the Garden State Parkway in Woodbridge Township, New Jersey, and crashed into a concrete bridge support. The accident claimed the lives of three family members: David Dryerman, 54; his wife, Michele Dryerman, 54; and their daughter, Brooke Dryerman, 17. The family was reportedly returning from a music festival. Max Dryerman, Brooke’s older brother, who was not in the car, is also listed as a plaintiff in the suit.
The lawsuit, filed in the U.S. District Court for the District of New Jersey, alleges that the vehicle’s Autopilot and Full Self-Driving (FSD) features were defective and directly caused the crash. The suit claims the vehicle failed to stay in its lane and did not activate emergency braking, either of which might have prevented the tragedy. The estates are seeking unspecified compensatory and punitive damages.
Autopilot Under Scrutiny: A History of Concerns
The New Jersey crash is not an isolated incident. Tesla’s Autopilot system has been linked to numerous accidents, injuries, and fatalities, raising serious questions about its safety and reliability. According to data from the National Highway Traffic Safety Administration (NHTSA), Tesla’s Autopilot has been involved in 736 crashes since 2019, including 17 fatalities. The agency has separately launched more than 40 investigations into Tesla crashes that resulted in 23 deaths.
One of the key concerns is the potential for “driver disengagement.” The NHTSA concluded in a 2023 filing that “in certain circumstances, Autopilot’s system controls may be insufficient for a driver assistance system that requires constant supervision by a human driver.” This can lead to “foreseeable driver disengagement while driving and avoidable crashes.” The agency also expressed concern that Tesla’s “weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities,” meaning the system didn’t adequately ensure driver attentiveness.
Tesla’s Defense: Driver Responsibility
Tesla maintains that Autopilot is an advanced driver-assistance system (ADAS) that requires full driver attention and does not make vehicles autonomous. The company states that Autopilot is intended for use only with a fully attentive driver who keeps their hands on the steering wheel at all times and is prepared to take control of the vehicle. Tesla also points to data suggesting that Autopilot is safer than human drivers. In the first quarter of 2025, Tesla reported one crash for every 7.44 million miles driven while Autopilot was engaged, compared to one crash for every 1.51 million miles driven when Autopilot was not in use. By comparison, federal crash data puts the U.S. average at roughly one crash every 702,000 miles.
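To put those figures on a common scale, the rates can be normalized to crashes per million miles. The sketch below uses only the numbers cited above and takes Tesla’s reporting at face value:

```python
# Miles driven per reported crash (Tesla Q1 2025 safety report figures, as cited above)
miles_per_crash = {
    "Autopilot engaged": 7_440_000,
    "Autopilot not engaged": 1_510_000,
    "U.S. average (federal data)": 702_000,
}

# Invert each rate to crashes per million miles for an apples-to-apples view
for label, miles in miles_per_crash.items():
    print(f"{label}: {1_000_000 / miles:.2f} crashes per million miles")

# Output:
# Autopilot engaged: 0.13 crashes per million miles
# Autopilot not engaged: 0.66 crashes per million miles
# U.S. average (federal data): 1.42 crashes per million miles
```

This normalization inherits every limitation of the underlying figures; as the critics quoted below note, it says nothing about crash severity, causes, or driving conditions.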
However, critics argue that Tesla’s safety reports lack crucial details, such as crash severity, causes, and driving conditions. They also point out that Tesla’s data excludes incidents involving its more advanced “Full Self-Driving” (FSD) system.
Legal and Regulatory Landscape: Navigating Uncharted Territory
The legal landscape surrounding autonomous vehicle accidents is complex and evolving. In cases involving Autopilot, liability can depend on various factors, including driver actions, system performance, and potential third-party involvement.
- Driver Negligence: Even with Autopilot engaged, drivers are legally responsible for operating their vehicles safely. If a driver is distracted, impaired, or misuses Autopilot, they may be held liable for an accident.
- Product Liability: If an accident is caused by a defect in Autopilot’s design or manufacturing, Tesla may be held liable under product liability laws. This could include software glitches, sensor malfunctions, or inadequate warnings about the system’s limitations.
- Comparative Negligence: New Jersey follows a “modified comparative fault” rule, meaning that a driver can still seek compensation even if they were partially at fault for an accident, as long as they were not more than 50% responsible. However, their compensation will be reduced by their percentage of fault.
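The arithmetic behind that last rule is straightforward. As a rough illustration (a minimal sketch with hypothetical numbers, not legal advice for any actual case):

```python
def nj_recovery(damages: float, plaintiff_fault_pct: float) -> float:
    """Sketch of New Jersey's modified comparative fault rule: recovery
    is barred if the plaintiff is more than 50% at fault; otherwise the
    award is reduced by the plaintiff's percentage of fault."""
    if plaintiff_fault_pct > 50:
        return 0.0  # more than 50% responsible: no recovery
    return damages * (1 - plaintiff_fault_pct / 100)

# Hypothetical: $1,000,000 in damages, driver found 30% at fault
print(nj_recovery(1_000_000, 30))  # 700000.0
# At 51% fault the claim is barred entirely
print(nj_recovery(1_000_000, 51))  # 0.0
```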
The NHTSA has been actively investigating Tesla’s Autopilot system and has pressed the company to address safety concerns. In December 2023, under pressure from the agency, Tesla recalled more than 2 million vehicles, issuing an over-the-air software update intended to add driver-attention safeguards to Autopilot. The agency is also evaluating whether that recall fix is effective enough.
The Road Ahead: Balancing Innovation and Safety
The New Jersey lawsuit highlights the challenges and risks associated with autonomous vehicle technology. As self-driving cars become more prevalent, it is crucial to strike a balance between innovation and safety. This requires:
- Enhanced Safety Standards: Regulatory agencies like the NHTSA must establish clear and comprehensive safety standards for autonomous vehicle technology.
- Improved Driver Monitoring Systems: Automakers need to develop more effective driver monitoring systems to ensure that drivers remain attentive and engaged while using ADAS features.
- Greater Transparency: Tesla and other companies should provide greater transparency about their safety data and the limitations of their autonomous driving systems.
- Clear Legal Framework: Courts and lawmakers need to establish a clear legal framework for determining liability in autonomous vehicle accidents.
The New Jersey crash serves as a stark reminder of the potential consequences of relying too heavily on unproven technology. While autonomous vehicles hold great promise for the future of transportation, it is essential to proceed with caution and prioritize safety above all else.
Have you or a loved one been injured in an accident involving Tesla Autopilot? Contact our firm today for a free consultation to discuss your legal options.