Tesla Is Urging Drowsy Drivers to Use ‘Full Self-Driving’. That Could Go Very Wrong


Since Tesla launched its Full Self-Driving (FSD) feature in beta in 2020, the company's owner's manual has been clear: despite the name, cars using the feature cannot drive themselves.

Tesla's driver-assistance system is built to handle many road situations, managing stops at lights, lane changes, steering, braking, and turning. Still, “Full Self-Driving (Supervised) requires you to pay attention to the road and be ready to take over at all times,” the manual states. “Failure to follow these instructions could cause damage, serious injuries or death.”

Now, however, new in-car messaging urges drivers who drift between lanes or feel drowsy to turn on FSD, potentially confusing drivers, who experts say could be encouraged to use the feature in an unsafe way. “Lane drift detected. Let FSD assist so you can stay focused,” reads the first message, which was included in a software update and spotted earlier this month by a hacker who tracks Tesla development.

“Drowsiness detected. Stay focused with FSD,” reads the other message. Drivers have since posted online that they have seen similar messages on their in-car screens. Tesla did not respond to a request for comment about the messaging, and Wired was unable to independently confirm the message appearing on an in-car Tesla screen.

The problem, researchers say, is that moments of driver inattention are exactly when safety-minded driver assistance should demand that drivers become ultra-focused on the road, not suggest that they rely on a developing system to compensate for their distraction or fatigue. At worst, such a prompt could lead to a crash.

“This messaging puts drivers in a very tough position,” says Alexandra Mueller, a senior research scientist at the Insurance Institute for Highway Safety who studies driver-assistance technologies. She believes “Tesla is essentially giving a series of conflicting instructions.”

A large body of research studies how humans interact with computer systems designed to help them accomplish tasks. Generally, it finds the same thing: people are really terrible passive supervisors of systems that are pretty good most of the time, but not perfect. Humans need something to keep them engaged.

In aviation research, this is called the “out-of-the-loop performance problem”: pilots who rely on fully automated systems may fail to properly monitor for malfunctions, out of complacency built up over long periods of operation. This lack of active engagement, also known as vigilance decrement, can lead to a reduced ability to understand and take back control of a malfunctioning automated system.

“When you suspect the driver is becoming sleepy, to remove even more of their physical engagement — that seems extremely counterproductive,” says Mueller.

“As humans, when we get sleepy or tired, taking away more of the things we have to do can really backfire,” says Charlie Klauer, a researcher and engineer who studies drivers and driving performance at the Virginia Tech Transportation Institute. “It’s tricky.”

Over the years, Tesla has made changes to its technology to make it harder for inattentive drivers to use FSD. In 2021, the automaker began using in-car driver-monitoring cameras to determine whether drivers were paying sufficient attention while using FSD; a series of alerts warns drivers if they are not looking at the road. Tesla also uses a “strike system” that can lock a driver out of the driver-assistance feature for a week if they repeatedly fail to respond to its prompts.
