Autonomous Vehicles: Who’s At Fault When There’s A Robot At The Wheel?
We are fascinated by the issue of liability when autonomous vehicles are involved in accidents. Let’s set individual state-by-state rules for autonomous vehicles aside for a moment (we will cover that beast in the future). The issue of who is to blame when the inevitable accident occurs is sticky, at best. Automotive News did a piece a couple of years ago that posed three different scenarios to a law professor, and it is an interesting read.
It is my belief that traditional liability disputes against the parties who own or insure the vehicles will take a backseat to large product liability cases against autonomous vehicle manufacturers and suppliers. But as human-on-human accidents eventually begin their slow decline, the question of who is at fault when a robot is at the wheel will have to be decided.
If an SAE Level 5 (fully autonomous) vehicle is involved in a crash, the human should be out of the mix, right? Sue the manufacturer, the software developers, the hardware developers, etc., because Joe Smith didn’t even have a steering wheel! Hypothetically, the defendants will say that the accident was unavoidable, that no human could have avoided the mishap, and that the robot did everything it could – but, alas, the accident still occurred. Does the collective driving record of human drivers become the baseline standard, with defendants pointing to it as proof that the robot’s actions were more than reasonable?
SAE Level 4 (remember geo-fencing?) could be a liability nightmare. Handing off emergency situations to drivers (who now have fewer driving miles under their belts because the robot takes the wheel most of the time) doesn’t make sense. Sure, there could be situations where advances in vehicle-to-vehicle communications give human drivers 12 seconds to take over and acclimate to road conditions. But what happens if a child runs out into the street during that 12-second robot/human takeover period and is killed? Does the robot point the finger at the human, and vice versa?
Mercedes-Benz has already proclaimed that it would save its occupants before a pedestrian. Will other manufacturers be so bold? Will MB’s logic hold up in front of a jury? It seems that while we will eventually take blame out of the hands of the human, the question of who was at fault will become an ethical one. Will regulators mandate that autonomous vehicles must respond to Hazard X with Response Y? If the NFL can’t even define a catch, how will the government decide what is right and wrong in a collision scenario?
How all of this eventually plays out will be incredibly interesting. While serious injuries and fatalities are sure to decline over the long term, the liability game for the accidents that do occur will be a swiftly moving digital target. Stay with us as we explore issues like these in greater depth in the future.
Subscribe Here to receive future issues of CrashAxe, our Autonomous Vehicle Newsletter