Consumer Reports worries Tesla is using owners to test dangerous autonomous driving software – TechCrunch


A Tesla in Full Self-Driving mode makes a left turn out of the middle lane on a busy San Francisco street. It jumps into a bus lane where it’s not supposed to be. It rounds a corner and nearly plows into parked vehicles, prompting the driver to lunge for the wheel. These scenes were captured by car reviewer AI Addict, and more videos like them are popping up on YouTube. You could argue these are all mistakes any human fiddling with a cell phone might have made. But we expect more from our AI overlords.

Earlier this month, Tesla started sending out over-the-air updates for its Full Self-Driving (FSD) beta 9 software, an advanced driver assistance system that relies solely on cameras, rather than the cameras and radar of Tesla’s previous ADAS systems.

In response to videos showing unsafe driving behavior, such as unprotected left turns, and other reports from Tesla owners, Consumer Reports issued a statement on Tuesday saying the software upgrade does not appear to be safe enough for public roads, and that it will independently test the update on its Model Y SUV once the vehicle receives the required software.

The consumer organization said it is concerned that Tesla is using its current owners and their vehicles as guinea pigs to test new features. Underscoring the point, Tesla CEO Elon Musk has urged drivers not to become complacent while driving because “there will be unknown issues, so please be paranoid.” Many Tesla owners know what they’re getting into because they signed up for Tesla’s early access program, which supplies beta software in exchange for feedback, but other road users have not given their consent to such testing.

Tesla’s updates go out to drivers across the country. The electric vehicle company did not respond to a request for information on whether it takes into account the autonomous driving regulations of specific states – 29 states have enacted autonomous driving laws, but they differ wildly from state to state. Other autonomous driving technology companies, such as Cruise, Waymo and Argo AI, told CR they test their software on private tracks or use trained safety drivers as monitors.

“Car technology is advancing really quickly, and automation has a lot of potential, but policymakers need to step up to put strong, sensible safety rules in place,” said William Wallace, manager of safety policy at CR, in a statement. “Otherwise, some companies will simply treat our public roads as if they were private proving grounds, with little to hold them accountable for safety.”

In June, the National Highway Traffic Safety Administration (NHTSA) issued a standing general order requiring manufacturers and operators of vehicles equipped with SAE Level 2 ADAS or SAE Level 3, 4 or 5 automated driving systems to report crashes.

“NHTSA’s core mission is safety. By mandating crash reporting, the agency will have access to critical data that will help quickly identify safety issues that could emerge in these automated systems,” said Dr. Steven Cliff, NHTSA’s acting administrator, in a statement. “In fact, gathering data will help instill public confidence that the federal government is closely overseeing the safety of automated vehicles.”

The FSD beta 9 software adds features that automate more driving tasks, such as navigating intersections and city streets under the driver’s supervision. But with such detailed graphics showing where the car is in relation to other road users, down to a woman passing by on a scooter, drivers might end up more distracted by the very technology that is supposed to assist them at crucial moments.

“Tesla just asking people to pay attention isn’t enough – the system needs to make sure people are engaged when it is operational,” said Jake Fisher, senior director of CR’s Auto Test Center, in a statement. “We already know that testing developing self-driving systems without adequate driver support can – and will – end in fatalities.”

Fisher said Tesla should implement an in-car driver monitoring system to ensure drivers are watching the road, which could help avoid crashes like the one involving Uber’s self-driving test vehicle, which struck and killed a woman crossing the street in Tempe, Arizona, in 2018.
