The future of auto accident litigation

Are autonomous vehicles safe?

Self-driving cars and trucks are the future of transportation. These vehicles are also referred to as autonomous vehicles. That future may be arriving too soon for the safety of the motoring public and pedestrians.

Autonomous vehicles use a combination of radar, lidar (complex sensors that use laser light to map the environment), and high-definition cameras to map their surroundings. When a vehicle encounters a new object, images of the object are rapidly processed by the vehicle’s artificial intelligence (AI), which compares them against a massive collection of labeled reference images. The AI is supposed to process this data and instantly figure out how to react. Sound scary? It should.
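
To make the idea concrete, here is a deliberately simplified, hypothetical sketch of that classify-then-react loop. The category names and rules are invented for illustration only and do not reflect any manufacturer’s actual software.

    # Hypothetical sketch of a classify-then-react loop (illustrative only)
    KNOWN_OBJECTS = {"vehicle", "bicycle", "pedestrian"}  # labels the AI was trained on

    def classify(detection):
        """Match a sensor detection against the library of labeled reference images."""
        label = detection.get("best_match")
        return label if label in KNOWN_OBJECTS else "unknown"

    def react(label):
        """Decide how the vehicle should respond to the classified object."""
        if label == "pedestrian":
            return "brake"
        if label in ("vehicle", "bicycle"):
            return "predict path and adjust speed"
        return "no explicit rule"  # an unrecognized object may get no planned response

    # A detection the system cannot place in a known category gets no rule:
    print(react(classify({"best_match": "unclear"})))  # -> "no explicit rule"

The point of the sketch is simple: the system can only respond to categories its designers anticipated, and anything outside those categories may produce no planned response at all.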

As the Washington Post reported on November 12, 2019:

The reality of how little some of these [autonomous vehicles] know came to light last week as an investigation by the National Transportation Safety Board (NTSB) of a deadly Uber crash revealed that the self-driving Uber car was unable to distinguish a person from a vehicle or bicycle and that it wasn’t programmed to know that pedestrians might jaywalk.

The driver “supervising” the self-driving Uber car was looking at her phone while a pedestrian crossed the street outside the crosswalk. The car killed the pedestrian even though its radar detected her six seconds before the crash. The problem is that the car’s system did not properly classify the pedestrian or know how to react. The NTSB said that a “pedestrian outside a …crosswalk” was “not assigned [by the Uber algorithm] an explicit goal,” meaning the self-driving car could not predict the path the pedestrian might travel, as it might have if she had been in the crosswalk.
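
In plain terms, if the designers never defined a goal for a pedestrian outside a crosswalk, the planner has no trajectory to anticipate. The following hypothetical fragment (invented names, not Uber’s code) illustrates that gap:

    # Illustrative only: a planner can only predict paths for goals it was given
    GOALS = {
        "pedestrian_in_crosswalk": "crossing toward the far curb",
        "vehicle_in_lane": "continuing in its lane",
    }

    def predict_path(actor):
        """Return the expected path for a classified actor, if one was defined."""
        return GOALS.get(actor)  # returns None when no explicit goal exists

    print(predict_path("pedestrian_outside_crosswalk"))  # -> None: nothing to avoid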

Sally A. Applin, who studies the intersection of people, algorithms, and ethics, told the Washington Post that engineers appear to be “programming for what should be, not what actually is.” Companies that currently dominate the artificial intelligence space in the self-driving car field include Waymo (owned by Alphabet), Aurora, Cruise (a division of GM), Ford, and Aptiv (affiliated with Lyft). In voluntary safety reports filed in 2018-2019, these companies all mention jaywalkers and pedestrians walking outside of marked crosswalks. By contrast, Uber’s safety reports filed in November 2018 do not. It is clear that the timeline to safely roll out this technology is much longer than the industry has led Congress to believe.

It’s easy to say that Congress should quickly put the brakes on autonomous vehicles and require all of these companies to comply with uniform standards before another self-driving vehicle is allowed on the road, but it is too late. The vehicles are already on the road, and Congress has very little understanding of what a comprehensive regulatory scheme should look like. Silicon Valley has effectively lobbied Congress to stay out of its way. At a time when greater caution and regulation are needed, the current administration is disinclined to regulate. The day will come too soon when we see a long line of self-driving tractor-trailer trucks barreling down the right lane of most interstate highways. What happens when a deer jumps onto the highway, traffic ahead comes to an abrupt stop, or another vehicle breaks down on the shoulder?

The future of automobile accident litigation

What does that mean for the victims who will predictably be maimed or killed by computer glitches? I predict that, in the absence of some sort of statutory fix that assigns strict liability to automobile manufacturers, future automobile accident litigation will become prohibitively expensive for victims of self-driving vehicle malfunctions. There will be no human driver to sue, so the traditional claim based on negligent driving may be unavailable. The victim will be forced to file an expensive product liability claim against a host of well-funded defendants, including, among others, the automobile manufacturer, the radar, lidar, and camera manufacturers, and the artificial intelligence companies. This will require hiring a host of technical experts to prove the product defect. Victims with mild to moderate injuries will not be able to afford to seek just compensation.

Injured in an automobile accident? Call Phelan Petty

At Phelan Petty, we are experienced in handling all types of automobile accident cases, including car accidents, motorcycle accidents, and truck accidents. We are also experienced product liability attorneys and are well-positioned to litigate against manufacturers of self-driving vehicle technology.

If you or a loved one were injured or killed in an automobile accident, please contact us today for a free legal consultation.  Please call (804) 980-7100 or complete this brief online form.  We will respond promptly.

References

“What self-driving cars can’t recognize may be a matter of life and death,” Washington Post (November 12, 2019), https://www.washingtonpost.com/technology/2019/11/11/what-self-driving-cars-cant-recognize-may-be-matter-life-death/

About Michael Phelan

Michael Phelan has been consistently recognized for his excellence as a trial lawyer, his commitment to research, his outstanding communication skills, and his sincerity and dedication. As one of his valued clients said, “Mike puts his heart into it.”