Automated vehicles raise many legal questions

SanTan Sun News, Opinion

By Michael Medina

Guest Writer


Automated driving systems and crash avoidance technology are in the fast lane to becoming our daily driving reality. 

While the aim of these technologies is to make travel from point A to point B simpler and safer, the legal and liability issues are anything but simple.

In addition to self-driving systems and crash avoidance, there are even vehicle-to-vehicle systems that allow equipped vehicles to exchange information as they travel.

Although each of these systems is aimed at safety, the question remains: Who is liable when something goes wrong?

Crash avoidance technology includes systems with features that may provide forward collision warning, forward collision brake assist, lane departure prevention, adaptive headlights and more. These technologies are in broad use in newer cars today.

Companies like Waymo have dozens of autonomous vehicles throughout the East Valley helping to make Chandler the “self-driving capital of the world,” as WIRED Magazine recently dubbed the city.

Autonomous vehicles offer a wide range of automation, from a computer taking a limited role all the way to full control of the car. With so many different levels of control available, it becomes harder to ascertain who is truly in control of the vehicle.

Drivers who purchase vehicles equipped with these systems expect increased safety while driving. But if something goes wrong, it will have to be determined whether the case involves product liability, driver liability or driver negligence.

The manufacturer may make claims about increased safety and collision avoidance, and warranties may be available. A company may be held responsible if something goes wrong, either through malpractice (promising a standard of care that was not met) or misrepresentation (promising increased safety).

But the driver may also be held strictly liable. As with dog bites, the owner assumes responsibility for the product and its use.

We will certainly see cases played out in court in the future. More than a year ago in Tempe, an Uber test vehicle with a self-driving system in control struck and killed a pedestrian.

The National Transportation Safety Board investigated the incident. The NTSB found that Uber had disabled the vehicle's factory-installed emergency braking system and was relying on its own self-driving system.

Cameras showed the Uber driver was on a cellphone immediately before the collision. Investigators determined the driver had failed to heed warnings that began six seconds before impact with the pedestrian.

The self-driving system determined, 1.3 seconds before impact, that emergency braking was needed. However, Uber had not enabled the system to perform emergency braking maneuvers on its own, relying instead on the human safety driver to intervene.

The pedestrian’s surviving family members reached a confidential settlement with Uber.

The NTSB has also investigated several cases involving Tesla's Autopilot system. The NTSB found that the manner in which Autopilot monitored and responded to the driver's interaction with the steering wheel was not an effective method of ensuring driver engagement.

It remains to be seen which direction courts and juries will lean, but in the meantime, don't assume that your automated driving system offers full and complete protection, either physically or legally.


— Michael Medina is a partner in the East Valley law firm Davis Miles and focuses his practice on product liability and catastrophic injury. Information: davismiles.com and 480-733-6800
