Tesla's semi-autonomous driving system finds no peace. It is back in the dock, and after the sensational investigation into sudden braking (the causes of which are still unknown), a gigantic recall could follow: 830,000 cars, basically all of those sold in the United States from 2014 to the present, would have to return to the workshop for a technical update.
The issue reported by the National Highway Traffic Safety Administration involves a series of collisions with emergency vehicles parked on the side of the road, or with trucks with their flashing lights on. To be precise, we are talking about 16 accidents that caused 15 injuries and one death. But the NHTSA investigation also identifies another problem: the agency found that Autopilot is used where it shouldn't be (areas with heavy traffic, or with low grip or visibility due to rain, snow or ice) and that many drivers fail to intervene to avoid accidents despite the vehicle's warnings. This is why the investigation could ultimately be closed without action.
But there is an even more worrying aspect: according to investigators, all of Tesla's semi-autonomous driving systems, not just Autopilot, could push drivers toward a dangerous attitude, instilling false confidence and leading motorists to underestimate the risks. The message is clear: "The road is not a game; you risk your life." And if this part of the investigation were to reach a conclusion, Tesla would be in serious trouble, because Musk's company would then have to overhaul the entire human-car interface. Given how Teslas are built, that would basically mean redesigning the whole car.
When will we know how it will end? Given the gravity of the situation, and of the possible sanction, the NHTSA has taken its time: it said it will decide within a year whether a recall is needed or whether the investigation should be closed.
Certainly, according to the investigation, in most of the 16 accidents Tesla issued collision warnings to drivers shortly before impact. Automatic emergency braking intervened to slow the cars in about half of the cases. And, on average, Autopilot relinquished control of the Tesla to the driver less than a second before the crash. This would support the thesis that the cars lead the driver into a dangerous attitude, because the NHTSA found that in many cases the drivers had their hands on the wheel, as Tesla always requires, but did not intervene in any way to avoid the accident.
Not only that: in the accidents for which video is available, drivers should have been able to see the vehicles they then rear-ended on average eight seconds before impact, and so could have avoided the accident. All this shifts the focus from a malfunction of Autopilot to a malfunction of the whole Tesla system, which in no way ensures that drivers pay attention to driving. A damning theory (in practice it supports the thesis that the cars are badly designed), but one backed by the analyses of Bryant Walker Smith, a University of South Carolina law professor who has long studied automated vehicles: "Monitoring the position of the driver's hands on the steering wheel," explains Smith, "is useless because it only measures a physical position. It says nothing about mental capacity, engagement or ability to react."
Similar systems from other companies, such as General Motors' Super Cruise, instead use infrared cameras to watch the driver's eyes or face and make sure they are looking at the road. Even these systems can still let the driver lose attention, of course, but they are an important step forward. Just search YouTube to find several Tesla users wedging oranges between the rim and the spokes of the steering wheel to trick the system and travel without touching the controls.
The National Transportation Safety Board has also weighed in on the subject: after investigating some Tesla accidents dating back to 2016, it recommended that NHTSA and Tesla limit the use of Autopilot to areas where it can operate safely. The NTSB also recommended that NHTSA require Tesla to adopt a better system for making sure drivers are paying attention.
Amid all this turmoil, Tesla continues to decline comment. Last fall, though, the automaker pushed an over-the-air update to its Autopilot software to improve detection of emergency vehicle lights in low-light conditions. The NHTSA, naturally, went back on the attack, asking why the company had not issued a recall. In short, we are only at the beginning of an investigation that could have catastrophic consequences for Tesla.