Who’s liable when a self-driving car collides with another vehicle?

A white Tesla Model 3 sedan pulls away from a stop sign, beginning a wide left turn.

“Oh my gosh,” whispers the driver, whose hands hover above the steering wheel as the car moves on its own. The vehicle is controlled by Tesla’s Full Self Driving software, technology that Tesla says will eventually be able to drive cars without human involvement.

But suddenly, there’s trouble. “Oh. Oh!” a passenger shouts, as it becomes clear that the car is about to drive itself into a parked Ford.

The driver, an FSD “beta tester” named Brendan McGowan, quickly seizes the wheel, narrowly avoiding a collision. “Jeeeeeeezus,” he exclaims.

McGowan’s video of the incident, recorded in Auburn, Calif. on October 26, is just one of many new glimpses of FSD in action since the technology was made available for testing by some Tesla customers in October. Although FSD has had some impressive moments, near-misses like this one highlight a largely unanswered question: When a driverless car slams into you or your property, who do you sue, and who pays up?

Is the person behind the steering wheel responsible, even if they weren’t touching it? What about the developer who built the software? Or is it the car’s manufacturer, or perhaps the supplier that made the car’s navigational cameras, that is liable?

The question has taken on new relevance in recent weeks. In addition to Tesla’s FSD rollout, Alphabet spinoff Waymo has deployed truly driverless vehicles in Arizona. A recent report from Waymo disclosed that its vehicles were involved in 18 accidents in 2019 and 2020, and avoided several others only because a human safety driver intervened.

Of course, autonomous driving technology is still being refined, and eventually it’s expected to drive more safely than humans. But experts agree that no such system can completely eliminate accidents.

The question of liability has been somewhat muddied by marketing hype. Despite its name, Tesla’s “Full Self Driving” is not yet an autonomous driving system. Like similar technology from Cadillac and Volvo, FSD is considered an advanced driver-assistance system, or ADAS. These systems automate some elements of driving, such as lane keeping, but drivers still have ultimate responsibility for what happens when they’re behind the wheel. In fatal accidents involving supervised autonomy systems, U.S. regulators and safety investigators have repeatedly placed blame on human drivers who weren’t watching the road.

When truly driverless cars hit the road, responsibility will shift from drivers to vehicle makers and software designers. But experts don’t expect comprehensive legislation laying out the new order.

Instead, liability for robotaxis or automated tractors will be determined by the courts, which will apply existing law to the new facts of specific incidents.

“The answer to who’s liable is going to be, ‘It depends,’” says Bryant Walker Smith, a University of South Carolina law professor who studies liability and autonomous vehicles.

The same process shaped how we think about liability for human drivers. For instance, Smith says that in the 1930s and ‘40s, some accident victims struck by hired taxis tried to sue the passengers rather than the drivers. That approach has largely disappeared because it was rejected by courts.

Smith says that judging liability in individual accidents involving self-driving vehicles should come down to several well-established legal principles. At the highest level, autonomous vehicles will be subject to ‘vicarious liability,’ the idea that companies are responsible for the actions of their employees and the quality of the products they produce.

“Did a wheel fall off? Was a stop sign miscoded [in the system]? Did the LIDAR fail?” says Smith, referring to the laser-based sensing technology used by many autonomous systems. If an obvious hardware or software failure caused a crash, a vehicle’s manufacturer would probably end up being liable.

But many accidents involving human drivers are caused by subtler failures of judgment, and Smith expects courts to use a handful of formulas to evaluate how the technology responded. The first, he says, will be: “Did this system perform as well as a competent human driver? If not, that’s going to suggest there was a defect.”

That standard may be applied to a system’s overall performance rather than its actions in a specific situation. The U.S. National Highway Traffic Safety Administration laid the groundwork for that criterion in 2017, when it touted the overall safety benefits of Tesla’s Autopilot system while clearing the system of fault in a fatal 2016 crash.

Second, Smith says, courts assessing liability will look at whether a specific system performed as well as, or better than, a comparable system. That’s already a key measure in automotive recall and safety-monitoring programs.

Finally, Smith hopes courts will adopt one novel legal test when evaluating self-driving cars: “Did the system perform better than the last one that caused this harm?”

The ability to constantly learn, after all, is one of the core features that promise to make robots safer drivers than humans. Rather than relying on one person’s experience (or their slow human reflexes), autonomous systems will learn from data gathered by thousands of other vehicles. That technological promise aligns with the legal principle of ‘foreseeability’—the question of whether a civil defendant should have predicted a particular risk.

“Once something has happened, it has been foreseen,” says Smith. The makers of autonomous systems, he argues, shouldn’t “get to make the same mistake twice.”

Auto manufacturers are as concerned with their reputation as with straightforward legal liability, though. Automakers have long competed on safety, and they’re still out to win the battle for autonomy. But they’re also collaborating on safety standards for the systems through the Automated Vehicle Safety Consortium, which includes Ford, GM, Toyota, Uber, and Lyft.

“Underpinning a lot of the work that the consortium has done is the assumption that ultimately the manufacturer is responsible for the behavior of the system,” says Frank Menchaca, an executive at SAE, a professional organization of auto engineers. That concern about responsibility and reputation helps explain the caution of a Ford or Daimler compared to a company like Tesla.

According to Greg Bannon, who oversees autonomous-vehicle policy for AAA, it will take “years” of court decisions involving truly autonomous vehicles to create consensus about liability between industry, law enforcement, courts, and insurers. That consensus will allow more claims to be settled without lengthy legal fights.

The greatest legal clarity, though, may come simply as more truly driverless vehicles hit the road, with clear messaging that no human driver is in control – or responsible for the vehicle’s actions.

“It’s at that point that the company is making a promise to the public that the user does not have that [driver] role,” says Smith, the University of South Carolina law professor. “And that the company is driving through its technology.”



