Uber test car accident
In March of this year an Uber robot test car hit and killed a pedestrian in Arizona. The driverless car was a 2017 Volvo XC90. According to a report issued yesterday by the National Transportation Safety Board (NTSB), the automated emergency braking system on the Uber test car was turned off when the car crashed into the pedestrian. The car’s sensor system was otherwise operating normally and a test driver was behind the wheel.
The NTSB listed three reasons why the brakes were not applied before the fatal autonomous vehicle crash:
- Uber had chosen to switch off the collision avoidance and automatic emergency braking systems that are built into commercially sold versions of the 2017 Volvo XC90 test vehicle. Uber did that whenever its own driverless robot system was switched on — otherwise, the two systems would conflict.
- However, Uber also disabled its own emergency braking function whenever the test car was under driverless computer control, “to reduce potential for erratic behavior.”
- Uber had expected the test driver to take control of the vehicle at a moment’s notice while also monitoring the robot car’s performance on a center-console mounted screen. Dashcam video showed the driver looking toward the center console seconds before the crash. The driver applied the brakes only after the woman had been hit at 39 mph.
Are we ready for autonomous vehicles?
This car accident shows why there may be no substitute for an attentive human driver behind the wheel. The crash occurred at night. The victim, a 49-year-old woman, emerged from behind a bush on a divided highway median strip and into traffic. The victim may have been impaired. However, the dashcam video showed her emerging from the bushes, which an attentive driver should have noticed as well. Driverless technology is simply not ready for prime time.
The fact that many vehicles have effective driver-assist technology, such as back-up cameras and lane-change warnings, does not mean that we are ready for fully automated vehicles. Our country cannot even pass an infrastructure bill to fix dangerous roads and bridges. We are a long time from having the necessary road sensors that will enable vehicles to interact with the landscape.
Moreover, the technology used by various car companies in their driverless systems is far from uniform. While the basics of these systems are similar, they differ greatly in the details, from how the software is written to which sensor systems are used. Tesla, for instance, bucks the industry in general by dismissing the need for lidar, an expensive technology that uses laser pulses to draw images of physical objects. NTSB is investigating several fatalities involving Tesla’s semi-autonomous Autopilot system.
The NTSB report said the Uber system first identified the woman crossing the road as a vehicle, then as a bicycle. Accurate classifications are essential. Driverless systems not only “see” objects, they make judgments on matters like direction and speed based in part on whether an object is a car, a pedestrian, a bicycle or something else.
Object recognition and the resultant decisions made by a robot system have come far over decades of research but are still being refined. The “erratic behavior” described in the report apparently refers to the problem of false positives – misidentifying objects or seeing something that isn’t there. The result can be too much braking — which, if it occurred in a commercial driverless taxi, would make passengers uncomfortable.
The point is that driverless car technology is not ready yet. Programmers must find a balance between too much braking and not enough, and until they do, there is no substitute for an attentive human in control of the vehicle.
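The balance described above can be pictured as a simple threshold problem. The toy sketch below is purely illustrative and is not based on Uber’s actual software; the labels, confidence scores, and threshold values are all invented assumptions. It shows why a single sensitivity knob cannot eliminate phantom braking without also risking missed pedestrians.

```python
# Toy illustration (NOT Uber's actual system): a single confidence
# threshold trades phantom braking against missed pedestrians.
# All labels and numbers below are hypothetical.

def should_brake(detections, threshold):
    """Brake if any detection's confidence meets the threshold."""
    return any(conf >= threshold for _, conf in detections)

# Hypothetical sensor frames: (label, confidence score between 0 and 1)
phantom_frame = [("shadow", 0.55)]          # nothing actually in the road
pedestrian_frame = [("pedestrian", 0.70)]   # a real person crossing

# A cautious (low) threshold brakes for the real pedestrian,
# but also brakes for the phantom -- the "erratic behavior" problem.
assert should_brake(phantom_frame, threshold=0.5) is True
assert should_brake(pedestrian_frame, threshold=0.5) is True

# A permissive (high) threshold suppresses the phantom braking,
# but then fails to brake for the real pedestrian as well.
assert should_brake(phantom_frame, threshold=0.8) is False
assert should_brake(pedestrian_frame, threshold=0.8) is False
```

In a real system the decision depends on far more than one score, but the underlying tension is the same: lowering sensitivity to avoid false positives also raises the risk of missing a real hazard.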
The scary thing is that lobbyists for the driverless car industry have convinced the politicians to rush forward with bills approving driverless cars and tractor trailer trucks. Driverless car and truck deaths will be inevitable. The question then will become, who is liable? If the crash is caused by a software malfunction, should the victim have to undergo the expense of suing a software company or should the vehicle occupant’s automobile insurance cover the claim?
The fairest result would be for the victim to be able to make an automobile insurance claim against the robot car’s occupant and/or owner. The problem is that consumers don’t have powerful lobbyists. It should come as no surprise that our bought-and-paid-for representatives in Congress are already working on bills for their cronies in the driverless car industry that would immunize everyone involved and leave the injured or killed victims with little or no recourse.
Phelan Petty: Fighting for Car Accident Victims
At Phelan Petty, we specialize in handling serious car accident personal injury cases. Because we only take on a small number of challenging cases, we can spend more of our time and resources on our clients, giving them the best possible chance of getting fair compensation for the injuries and losses they’ve suffered.
To get more information about our approach to personal injury claims and schedule your free consultation with one of our attorneys, call us at 804-980-7100 or fill out our convenient online contact form.