ADAS ethics: Who is responsible for the crash?

By Pam Oakes, MACS Senior Technical Trainer and Curriculum Developer – Published on 10/28/2025

Even before ADAS technologies crossed into the Level 3 realm, the ethics of autonomy were at the forefront. When a semi-autonomous vehicle crashes, who is responsible?

Traditionally, drivers have borne full responsibility for vehicle operation. But with systems like lane centering, automatic braking (Figure 1), and adaptive cruise control now intervening in real time, the line between human and machine decision-making is increasingly blurred.

All manufacturers require drivers to remain attentive, even in semi-autonomous mode. Yet consumer overreliance is a known issue – and a big problem. A 2021 Massachusetts Institute of Technology (MIT) study showed that users of Level 2/Level 3 systems exhibited reduced situational awareness over time – a troubling trend for systems that still depend on the driver as a ready backup.

Take, for example, a Tesla on Autopilot involved in a highway collision. If the driver had hands on the wheel but failed to react to an obstacle that the system misclassified, is it human negligence or software failure? Courts and regulators are grappling with these questions. Tesla’s stance: the driver is responsible, and the driver pays for damages. Other manufacturers, however, are still trying to locate the fine line of responsibility – and some are taking the high road when it comes to footing the bill.

Volvo shoulders the accident costs while its vehicle is operating in autonomous mode. In fact, the Swedish automaker publicly committed to footing the bill as far back as 2015. CEO Håkan Samuelsson said: “…the company would take full responsibility for any incidents caused by its self-driving technology, including crashes resulting from system failures or even hacking.”

To study self-driving technology, Volvo is building a test center in Sweden (Figure 2).

Liability is beginning to shift: In 2023, Germany approved Level 3 autonomous driving with conditional driver disengagement and assigned accident liability to the manufacturer when the system is in control. In contrast, the U.S. still holds the driver accountable in virtually every scenario (Volvo’s voluntary pledge being the exception).

And there are concerns about bias and decision-making in AI models. If a vehicle must choose between two collision paths, what standards guide it?

As ADAS systems progress toward Level 5 – full autonomy – everyone within the industry needs to work to define driver-to-vehicle responsibilities going forward.

Figure 1: Automatic emergency braking systems apply the vehicle’s brakes automatically in time to avoid or mitigate an impending forward crash with another vehicle. NHTSA believes AEB systems represent the next wave of potentially significant advances in vehicle safety. Dynamic brake support and crash imminent braking are AEB systems that potentially save lives and reduce moderate and less severe rear-end crashes that are common on our roadways. Credit: nhtsa.gov

Figure 2: Volvo’s Mobility Innovation Destination Torslanda is a tailored test bed of automotive development near their operations in Gothenburg, Sweden. The innovation centre allows them to test emerging technologies and self-driving capabilities in a city-like environment. Credit: media.volvocars.com

About the author: Pam Oakes is a Senior Technical Trainer and Curriculum Developer at MACS. Pam provides automotive training at all levels, including train-the-trainer, professional technicians, and scholastic programs, with over 20 years of hands-on experience running a 12-bay shop in Florida. She has been a MACS Section 609 proctor since 2016 with additional expertise including ADAS calibration, fleet training, and technical curriculum development for major corporations. You can reach Pam at poakes@macsmobileairclimate.org
