
‘Ghost brakes’ have no solution


Matzko paid $5,000 to upgrade to the Enhanced Autopilot feature, yet says his 2018 Model X drove like an “untrained test engineer.”

In a lawsuit filed in federal court in San Francisco, Matzko alleged that Tesla CEO Elon Musk has misrepresented Autopilot’s capabilities in the company’s publicity and made promise after unfulfilled promise, all to convince consumers that Tesla’s autonomous driving technology is more advanced than it actually is.

In fact, lawsuits over Tesla’s allegedly false advertising are not rare. Just last month, Model 3 owner Toledo sued Tesla in a California court, saying his car would brake suddenly (“ghost braking”) for obstacles that did not exist at all, calling the experience a “terrible nightmare.”

Screenshot of NHTSA lawsuit

Toledo also contends that Tesla knows Autopilot may have functional defects, yet irresponsibly continues to market it with claims that it is capable of self-driving.

It is not just ordinary consumers; U.S. government agencies are also running out of patience. The California Department of Motor Vehicles (DMV) and the U.S. National Highway Traffic Safety Administration (NHTSA) have each taken action against Tesla, accusing the company of misrepresenting its technology. The California Senate has also passed a bill banning terms such as “autonomous driving” in advertising for driver-assistance systems.

It is worth noting that the lawsuit-ridden Tesla has not responded; instead, it raised the price of FSD from $12,000 to $15,000 on September 5, its second price increase this year. Over seven years, the price of FSD has climbed from $2,500 to $15,000, a sixfold increase.

Back to the core questions: why does Tesla’s ghost braking occur so frequently, and does the marketing of FSD and Autopilot amount to false advertising? And with full self-driving having slipped several times already, is Musk’s promise of a launch by the end of the year just another bounced check?

70% of driver-assistance accidents involve Tesla

With Tesla’s Autopilot and FSD failing to perform as advertised, consumers and government agencies can no longer sit still.

Factor in the rapid growth in subscriptions to Tesla’s self-driving software, and cases like Matzko’s and Toledo’s implicate far more consumers. According to data Tesla released earlier, the take rate for FSD software in 2020 exceeded 60% on the Model S/X and 40% on the Model Y. In 2021, revenue from autonomous-driving software subscriptions and related businesses reached US$3.802 billion, up 65% year-on-year and accounting for 7.06% of total revenue.

The Washington Post reported in February this year that complaints about Tesla’s “ghost braking” had reached a cumulative 354 over the preceding nine months, and the pace was accelerating: 107 of those complaints came in the last three months alone.


The complaints describe vehicles braking suddenly and without warning while traveling at high speed with Tesla’s driver-assistance system engaged.

NHTSA believes such unnecessary braking while driving greatly increases the risk of an accident.

According to data released by NHTSA, about 70% of reported accidents involving driver-assistance technology involve Tesla products. Tesla vehicles also account for 60% of the accidents causing serious injury and 85% of the fatal ones.

NHTSA has called for a new round of safety investigations into Tesla.

The DMV filed its complaint in July this year, accusing Tesla of making false claims about the capabilities of its self-driving technology: “while promoting vehicles equipped, or potentially equipped, with advanced driver assistance system (ADAS) features, Tesla has disseminated untrue or misleading claims with no factual basis.”

In marketing copy on its website, Tesla claims its driver-assistance technology can drive the car “without requiring action from the person in the driver’s seat.” The DMV argues that although Tesla also states that assisted driving “requires active driver supervision,” the advertising is nonetheless misleading and exaggerates what Autopilot and Full Self-Driving can do.

With ghost braking still unresolved, Tesla also stands accused of false advertising.

Tesla, which has repeatedly skirted the red lines of government oversight, has finally been reined in. On September 2, the California Senate passed a new bill prohibiting Tesla from using terms such as “autonomous driving” in advertising for its driver-assistance features.

Why does ghost braking occur so frequently?

The root of ghost braking lies in the flaws of Tesla’s vision-centric perception system.

Andrej Karpathy, the former head of Tesla’s AI team, has publicly described a typical, “notorious” ghost-braking case. As a vehicle approaches an overpass, the millimeter-wave radar detects a static object ahead, but its resolution is too low to tell whether that object is a bridge or a stopped car.

At that point, visual perception (the cameras) is needed to tell the vehicle what the static object is. But cameras are not very accurate at measuring the distance and speed of the object ahead. If there happens to be a car in front that is decelerating gently (not enough to warrant braking), the system can wrongly associate the “slowing vehicle” reported by the cameras with the “static object” reported by the radar, and slam the brakes for a bridge.
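To make the failure mode concrete, here is a minimal sketch of the kind of naive radar–camera track association Karpathy describes. Everything in it (the data structures, the 15-meter gate, the numbers) is invented for illustration and is not Tesla’s actual code:

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    distance_m: float   # radar measures range well...
    speed_mps: float    # ...and relative speed well,
    # but lacks the angular resolution to classify the object

@dataclass
class CameraTrack:
    label: str          # cameras classify well ("car", "bridge", ...)
    distance_m: float   # but estimate range and speed poorly
    speed_mps: float

def associate(radar: RadarTrack, camera: CameraTrack,
              max_gap_m: float = 15.0) -> bool:
    """Naive fusion: treat the two tracks as the same object if the
    camera's (noisy) range estimate is close to the radar's."""
    return abs(radar.distance_m - camera.distance_m) < max_gap_m

# A stationary overpass seen by the radar...
bridge = RadarTrack(distance_m=80.0, speed_mps=0.0)
# ...and a gently slowing car seen by the camera, with noisy range.
slowing_car = CameraTrack(label="car", distance_m=72.0, speed_mps=-1.5)

if associate(bridge, slowing_car):
    # The fused track now looks like a car stopped dead ahead,
    # so the planner commands hard, unnecessary braking.
    print("BRAKE: stationary 'car' detected at 80 m")
```

With sensors of complementary strengths, a single bad association like this is enough to conjure a phantom obstacle, which is why the fusion logic, rather than either sensor alone, is often the weak link.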


Karpathy on neural networks and ghost braking

Simply put, Karpathy attributes Tesla’s ghost braking to conflicts between what the millimeter-wave radar and the cameras perceive, which lead the system to issue incorrect commands.

Tesla’s response was to pin the blame on the millimeter-wave radar. In May of this year, when it released FSD Beta v9, Tesla simultaneously removed the millimeter-wave radar from all of its North American products and switched to a pure-vision approach, using the “Tesla Vision” system to build 3D perception from cameras mounted at different positions.
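For intuition on how cameras at different positions can recover 3D information at all, the classical stereo-disparity relation is shown below. This is a generic textbook formula offered only as an illustration of the idea; it is not a claim about how Tesla Vision actually estimates depth:

```python
# Classical stereo: depth = focal_length * camera_baseline / disparity,
# where disparity is how far an object's image shifts between two views.
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

# Two cameras 30 cm apart with a 1000 px focal length: an object whose
# image shifts 6 px between the two views is about 50 m away.
print(depth_from_disparity(1000.0, 0.3, 6.0))  # 50.0
```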

Contrary to expectations, ghost braking increased rather than decreased after Tesla switched to pure vision. According to some owners, something as trivial as a plastic bag drifting across the road, or the shadow of the car ahead, can trigger a ghost brake.

This points to the limits of what a camera can handle. Like the human eye, a camera is strongly affected by weather and by bright or dim light, but it is far less adaptable than the eye. According to Yang Yongcheng, a partner at Fengrui Capital, automotive autopilot cameras are generally installed with a fixed focal length, fixed field of view, fixed aperture, and fixed position, and lack the eye’s automatic adjustment and flexibility.

Yang explains that this is why the front of the car carries separate cameras for long-, medium-, and short-range imaging. Beyond cost, replicating the adaptability of the human eye in a single camera would require large numbers of motors and mechanical control components, while a car’s harsh operating conditions (temperature extremes, vibration, constant motion) demand cameras that run for long periods without failure.

This is also why perception systems pair the primary sensor, the camera, with millimeter-wave radar as a complement. Millimeter waves are electromagnetic waves that can directly measure distance and speed; the mainstream bands are 24 GHz (short-to-medium range, roughly 15–30 meters) and 77 GHz (long range, roughly 100–200 meters).
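The “directly measure speed” part comes from the Doppler effect: a moving target shifts the frequency of the returned radar signal in proportion to its relative speed. A minimal illustration of the standard two-way Doppler relation (generic physics, nothing Tesla-specific):

```python
# For a radar, a target at relative speed v shifts the return frequency
# by f_d = 2 * v / wavelength (the factor 2 is the round trip).
C = 3.0e8  # speed of light, m/s

def doppler_shift_hz(relative_speed_mps: float, carrier_hz: float) -> float:
    wavelength_m = C / carrier_hz
    return 2.0 * relative_speed_mps / wavelength_m

# A car closing at 30 m/s (~108 km/h) seen by a 77 GHz radar:
print(f"{doppler_shift_hz(30.0, 77e9):.0f} Hz")  # ~15,400 Hz
```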

However, severing one of its own arms before the pure-vision perception algorithms had matured was clearly hasty. As with Tesla’s split from Mobileye in 2016, Autopilot and its ADAS features drew criticism for a long time afterward, with the experience getting worse rather than better after the change.

When will ghost braking be fixed?

Even the perennially optimistic Musk has become cautious when it comes to ghost braking.


A few days ago, Musk said on Twitter that the latest beta version of FSD had been released, but cautioned users to treat the system with care.

According to the Tesla FSD Beta 10.9 changelog, the update reduced false decelerations for crossing objects by improving velocity estimates for objects at the edge of visibility, and smoothed merge control by better modeling merge points and heavily shadowed objects at the edge of visibility.

As for when the ghost brakes will be fixed, even Musk himself isn’t sure.

Regarding FSD’s iteration, Musk pointed back to his “Master Plan Part II,” in which he estimated that Tesla would need roughly 6 billion miles of autonomous driving to get its self-driving system approved by regulators worldwide.

As of the second quarter of this year, Tesla’s FSD beta fleet had logged 35 million miles on the road. A simple conversion (35 million ÷ 6 billion) shows that only about 0.58% of Musk’s target has been reached.

Musk’s logic is to train Tesla’s deep neural networks on large numbers of edge cases. A raw image is passed through the network, the boundaries of target objects are extracted, and the output is then compared against labeled data and corrected until the machine can recognize the image. The process resembles teaching a child to identify pictures: the machine is told whether each object is a person, a car, or a street sign (the technical term is supervised learning).
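A minimal, generic sketch of one such supervised-learning step, written in PyTorch as a stand-in. This is textbook training code, not Tesla’s pipeline; the toy network, labels, and fake images are all illustrative:

```python
import torch
import torch.nn as nn

CLASSES = ["person", "car", "street sign"]  # illustrative labels

model = nn.Sequential(            # a toy stand-in for a deep network
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 64),
    nn.ReLU(),
    nn.Linear(64, len(CLASSES)),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# One training step: predict, compare against human labels, correct.
images = torch.randn(8, 3, 32, 32)          # a batch of (fake) images
labels = torch.randint(len(CLASSES), (8,))  # human-provided answers

logits = model(images)          # the network's guesses
loss = loss_fn(logits, labels)  # how wrong were they?
loss.backward()                 # compute the corrections
optimizer.step()                # nudge the weights toward the labels
```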

At the same time, to speed up the machine’s “education,” Tesla introduced a “shadow mode.” The autopilot system runs in the background, with the sensors continuously collecting data about the surrounding road, but a human does all of the actual driving; the machine never touches the controls. Instead, it compares its own decisions against what the human driver does and learns from the differences, continuously refining the automated driving system.
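A hypothetical sketch of the shadow-mode idea: the model “drives” only on paper, and frames where it disagrees with the human are logged as training material. All names and thresholds here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    sensor_data: bytes     # camera/radar snapshot
    human_steering: float  # what the driver actually did
    human_braking: float

def model_policy(sensor_data: bytes) -> tuple[float, float]:
    """Stand-in for the neural network's proposed action."""
    return 0.0, 0.0  # (steering, braking) -- placeholder

def shadow_step(frame: Frame, log: list, threshold: float = 0.2) -> None:
    steer, brake = model_policy(frame.sensor_data)  # model decides...
    # ...but its output is never sent to the actuators. Only frames
    # where the model disagrees with the human get recorded, since
    # those are the interesting edge cases to retrain on.
    if (abs(steer - frame.human_steering) > threshold
            or abs(brake - frame.human_braking) > threshold):
        log.append(frame)

log: list[Frame] = []
shadow_step(Frame(b"...", human_steering=0.4, human_braking=0.0), log)
print(len(log))  # 1 -- a disagreement worth learning from
```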

Collecting edge cases depends on capturing vast amounts of road data. In effect, every Tesla on the road can be regarded as a test vehicle: its cameras continuously capture road information and send it back to Tesla’s central servers.

At this stage, however, Tesla faces a more serious problem: China’s restrictions on where its cars can operate. Citing data security, many institutions and sensitive road sections have explicitly barred Tesla vehicles from entering or passing through, which puts a question mark over whether Tesla can effectively learn to recognize scenarios unique to China’s roads, such as ramps, urban food-delivery scooters, and express-delivery tricycles.
