Fatal Crash has renewed concern about Tesla’s “Autopilot” Claim

Both the National Highway Traffic Safety Administration, the government agency that oversees car safety, and the National Transportation Safety Board, an independent agency that investigates significant crashes, have sent teams to Texas to investigate the crash. "We are actively working with local law enforcement and Tesla to learn more about the details of the accident, and we will take the appropriate steps when we have more information," NHTSA said in a statement. It will likely take weeks, if not months, for the results of either investigation to be released.

Still, the facts once again highlight the gap between how Tesla markets its technology and the system's actual capabilities, as spelled out in the car's dialog boxes and owner's manual.

On platforms like YouTube and TikTok, a small cottage industry has sprung up around videos of people trying to "trick" Autopilot into driving without an attentive person in the front seat; some clips show people asleep in the back seat or behind the wheel. Tesla owners have demonstrated that when the driver's seat belt is buckled, a car in Autopilot mode can travel for at least a few seconds with no one at the wheel.

Tesla, and Musk in particular, have a mixed history of public statements about Full Self-Driving and Autopilot. Autopilot gives drivers visual and audible warnings if its sensors do not detect pressure from their hands on the wheel for about 30 seconds, and it will eventually bring the car to a stop if it still does not detect them. But during a 60 Minutes appearance in 2018, Musk sat behind the wheel of a moving Model 3 and folded his hands in his lap. "Now you're not driving at all," the anchor said in surprise.

This month, Musk told podcaster Joe Rogan, "I think Autopilot is getting good enough that you won't need to drive most of the time unless you really want to." The CEO has also repeatedly made rosy assessments of his company's progress toward autonomous driving: in 2019 he promised that Tesla would have one million robotaxis on the road by the end of 2020. But in the fall of 2020, company representatives wrote to the California Department of Motor Vehicles to confirm that Full Self-Driving would remain "largely unchanged" for the foreseeable future, and that FSD would be an "advanced driver support" feature rather than an autonomous one.

To date, the FSD beta testing program has been released to only about 1,000 participants. "Still be careful, but it's maturing," Musk tweeted at FSD beta testers last month.

At least three people have died in crashes in which Autopilot was engaged. After investigating a fatal 2018 crash in Mountain View, California, the NTSB asked the federal government and Tesla to ensure that drivers can engage Tesla's automated driving features only in conditions where they can be used safely. It also recommended that Tesla install a more robust monitoring system to ensure drivers are paying attention to the road. General Motors, for example, only allows users of its Super Cruise automation feature to engage it on pre-mapped highways. A driver-facing camera also checks whether the driver's eyes are on the road.

An NHTSA spokesperson said the agency has opened investigations into 28 crashes involving Teslas.

Data released by Tesla suggests its vehicles are safer than the average U.S. car. On Saturday, just hours before the fatal crash in Texas, Musk tweeted that, according to federal data, a Tesla with Autopilot engaged is nearly 10 times less likely to be in an accident than the average vehicle. But experts have pointed out that the comparison is not quite apples to apples: Autopilot is meant to be used only on highways, while the federal data covers all kinds of driving conditions, and Teslas are heavy luxury cars, which makes them safer in a crash.

