‘Eyes-off driving’ is coming, and we’re so not ready


Last month, General Motors added its name to the growing list of automakers pursuing a new type of partially automated technology known as “eyes-off driving.” What none of them has provided, however, is a detailed account of how they will take responsibility when something goes wrong.

Not to be confused with the kind of distracted driving many motorists already practice, GM’s system would be a step toward the automaker’s ultimate goal of selling fully autonomous private cars. Some GM vehicles already include the company’s Super Cruise system, which allows drivers to take their hands off the wheel but uses eye-tracking technology to ensure they keep their eyes on the road. The new system, which sits at Level 3 on the six-level scale of driving automation, would allow drivers to take their hands off the wheel and their eyes off the road on some American highways.

GM says it aims to bring its Level 3 system to market by 2028, starting with the Cadillac Escalade IQ. From there, the technology will likely spread to the automaker’s other brands, like Chevrolet, Buick and GMC. Soon, drivers will be able to check their phones without shame or risk of a traffic violation. In some scenarios, drivers could even play video games or watch YouTube while the vehicle does the driving.

But only sometimes. The catch with a Level 3 system is that drivers must always be ready to take back control of the vehicle when asked. If they fail to do so quickly, they could be held liable when something goes wrong. And when it comes to driving in today’s world, something always goes wrong.

“With conditional automation, Level 3 automation, things get more complicated,” said Dr. Alexandra Mueller, senior research scientist at the Insurance Institute for Highway Safety. “And that’s where I think a lot of the concern comes from, because there’s a lot we just don’t know when it comes to Level 3 driving automation.”

The uncertainty is even more concerning when looking through the list of automakers actively pursuing this technology. In addition to GM, Ford, Jeep parent Stellantis, and Honda are all pursuing Level 3. Mercedes-Benz already sells a Level 3 system, called Drive Pilot, but it is legal only on specific highways in California and Nevada.

And therein lies the catch. The industry is actively planning the launch of a technology that remains largely prohibited in most jurisdictions. Germany and Japan have granted limited approvals to BMW and Honda, respectively. But as of today, Level 3 is tightly restricted and will likely remain so until lawmakers catch up.

This is an incredibly tricky issue for many regulators. How do we assign responsibility in a system that can bounce between an automated driving system and a human driver? In the case of Drive Pilot, Mercedes says it will take responsibility for accidents caused by its technology while the system is active. But this is inherently conditional, and the driver is still responsible if they fail to take control when prompted or misuse the system.

Tesla is already using this ambiguity to its advantage with its Level 2 systems, Autopilot and Full Self-Driving. An investigation into dozens of accidents involving Tesla found that Autopilot would disengage “less than a second” before impact. Investigators found no evidence to suggest Tesla was trying to shirk responsibility – but it certainly looks bad for the company.

The sensors that guide these systems, such as cameras, infrared trackers and torque sensors, can also be used by companies to present evidence, in the event of an accident, of who was in control and when. When announcing its new “eyes off” system, GM CEO Mary Barra pointed to the growing number of sensors as potentially exculpatory for the company in these cases. “We’ll have so many more detections that we’ll know pretty much exactly what happened,” she said when asked about liability issues related to Level 3 automation. “And I think you’ve seen General Motors, you know, always take responsibility for what we need.”

The very definition of Level 3 presents a contradiction: drivers are told they can disengage, but they must also remain available for rapid re-engagement. When transitions are planned, such as when a driver enters or exits a mapped area, the transfer should be smooth. But unexpected events, like sudden weather conditions or route changes, could make these systems unreliable. Research has shown that humans generally have difficulty handling this type of “out-of-loop” task retrieval.

When people have not been actively driving for an extended period, they may overreact when suddenly forced to retake control in an emergency. They may overcorrect the steering, brake too hard, or fail to react properly because they weren’t paying attention. And these actions can set off a dangerous, even deadly, chain reaction.

“The mixed fleet scenario, which will likely exist well beyond our lifetimes, provides a highly uncontrolled environment that many highly automated systems and even partially and conditionally automated systems will struggle with,” Mueller said. “And they will have difficulty with [it] indefinitely because frankly, we live in a very chaotic and dynamic environment where things are constantly changing.”

We are already beginning to see case law emerging that places responsibility on the human driver rather than the automated system.

In Arizona, the safety driver of an Uber self-driving test vehicle pleaded guilty over a fatal crash that occurred in 2018 while the autonomous system was active. Before that, a Tesla driver pleaded no contest to vehicular manslaughter for two deaths resulting from a crash that occurred while the company’s Autopilot system was in use. In both cases, prosecutors brought criminal charges against the human behind the wheel, reasoning that despite the presence of an automated system, the driver was the one ultimately responsible for the vehicle.

Automakers are probably pleased with the outcomes of those cases. But there have been others in which the automaker could share responsibility when something goes wrong. Take the recent jury verdict in Florida, where Tesla was held partly responsible for a crash that killed two people. In that case, the driver of the Model S, who had been using Autopilot, was also found partly liable, but it was Elon Musk’s company that was ordered to pay $243 million to the victims’ families.

Mobility attorney Mike Nelson notes that legal precedent for automation-related crashes is still embryonic. Cases involving Level 2 systems will be used to inform decisions on Level 3 and beyond. But judges, lawyers and juries generally lack technical expertise, portending a future shaped largely by unpredictability.

As we enter this chaotic interim period, in which human drivers find themselves sharing the road with more and more robots, automakers would do well to be as transparent as possible, Nelson said. The reason? Juries tend to look favorably on companies that don’t try to cover up their wrongdoing.

“I’m not happy with the chaos, but it’s not unforeseen,” Nelson said. “This has happened every time we’ve had an industrial revolution.”


