Image credits: Bloomberg / Licensed Contributor.
By the end of this week, thousands of Tesla owners may be testing the automaker’s latest version of its “Full Self-Driving” beta software, version 10.0.1, on public roads, even as regulators and federal officials investigate the safety of the system after a series of high-profile crashes.
A new study from the Massachusetts Institute of Technology lends weight to the idea that the FSD system, which despite its name is not actually an autonomous system but an advanced driver assistance system (ADAS), may not be so safe. Researchers studying glance data from 290 human-initiated Autopilot disengagement epochs found that drivers can become inattentive when using partially automated driving systems.
“Visual behavior patterns change before and after [Autopilot] disengagement,” the study reads. “Before disengagement, drivers looked less on the road and focused more on non-driving-related areas than after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving was not compensated by longer glances ahead.”
Tesla CEO Elon Musk has said that not everyone who has paid for the FSD software will get access to the beta, which promises more automated driving features. First, Tesla will use telemetry data to capture personal driving metrics over a seven-day period in order to ensure drivers remain attentive enough. The data will also feed a new safety-rating page tied to the owner’s vehicle and linked to their insurance.
The MIT study provides evidence that drivers may not be using Tesla’s Autopilot (AP) as recommended. Because AP includes convenience features such as traffic-aware cruise control and automatic steering, drivers become less attentive and take their hands off the wheel more often. The researchers found that this kind of behavior can stem from a misunderstanding of what AP features can do and of their limitations, a misunderstanding that is reinforced when the system works well. Drivers whose tasks are automated for them can naturally become bored after attempting to sustain visual and physical alertness, which the researchers say only breeds further inattention.
The study, titled “A model of naturalistic glance behavior around Tesla Autopilot disengagements,” followed Tesla Model S and Model X owners through their daily routines for periods of a year or more around the greater Boston area. The vehicles were equipped with a real-time intelligent driving environment recording data acquisition system that continuously collects data from the CAN bus, a GPS, and three 720p video cameras. These sensors provide information such as vehicle kinematics, driver interaction with the vehicle controls, mileage, location, and the driver’s posture, face, and the view in front of the vehicle. MIT collected nearly 500,000 miles’ worth of data.
The purpose of the study is not to shame Tesla, but rather to advocate for driver attention management systems that can give drivers real-time feedback or adapt automation functionality to a driver’s level of attention. Currently, Autopilot uses a hands-on-wheel sensing system to monitor driver engagement, but it does not monitor driver attention via eye or head tracking.
The researchers behind the study developed a glance behavior model “based on naturalistic data that can help understand the characteristics of shifts in driver attention under automation and support the development of solutions to ensure that drivers remain sufficiently engaged in the driving tasks.” This would not only help driver monitoring systems address “atypical” glances, but it could also serve as a benchmark for studying the safety effects of automation on driver behavior.
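To illustrate the kind of metric such a glance model builds on, here is a minimal, hypothetical calculation of the off-road glance proportion in a time window around a disengagement. The data format and region labels are invented for the example; the study’s actual model is far richer than this:

```python
def off_road_proportion(glances, t0, t1):
    """Fraction of time in [t0, t1) spent glancing off-road.

    `glances` is a list of (start_s, end_s, region) tuples from a
    hypothetical glance annotation; any region other than "road"
    counts as off-road. Glances are clipped to the window.
    """
    off = total = 0.0
    for start, end, region in glances:
        s, e = max(start, t0), min(end, t1)
        if e <= s:
            continue  # glance falls outside the window
        duration = e - s
        total += duration
        if region != "road":
            off += duration
    return off / total if total else 0.0


# Toy timeline with a disengagement at t = 100 s
glances = [
    (90, 96, "road"),
    (96, 100, "phone"),    # off-road just before disengagement
    (100, 109, "road"),
    (109, 110, "mirror"),
]
before = off_road_proportion(glances, 90, 100)
after = off_road_proportion(glances, 100, 110)
```

Comparing `before` and `after` across many disengagement epochs is, in spirit, how one quantifies the study’s finding that off-road glances were more frequent before the transition to manual driving than after it.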
Companies like Seeing Machines and Smart Eye are already working with automakers like General Motors, Mercedes-Benz, and reportedly Ford to bring camera-based driver monitoring systems to cars with ADAS, in part to address problems caused by intoxicated or impaired driving. The technology is there. The question is, will Tesla use it?