Tesla secret configuration disables ‘nag’ for Autopilot, FSD


A Tesla Model Y is seen in a Tesla parking lot on May 31, 2023 in Austin, Texas.

Brandon Bell | Getty Images

A security researcher who goes by the handle “@GreentheOnly” has found a hidden setting in Tesla vehicles that the company can enable, allowing drivers to use Tesla’s advanced driver assistance systems, marketed as Autopilot and Full Self-Driving, without having to keep their hands on the wheel for extended periods of time.

When this mode is enabled in a Tesla vehicle, it eliminates what owners call the “nag.” The researcher nicknamed the feature “Elon Mode,” though that is not the company’s internal name for it, he said.

Tesla does not currently offer self-driving cars. Chief Executive Elon Musk has promised self-driving cars since at least 2016, and said that by the end of 2017 a Tesla would be able to complete a test drive across the U.S. without human intervention.

Instead, Tesla’s driver assistance systems require a human driver to remain attentive and ready to brake or steer at any time.

Typically, when a Tesla driver is using Autopilot or FSD (or a variant thereof), a visual symbol blinks on the car’s touchscreen to prompt the driver to apply force to the steering wheel at frequent intervals. If the driver does not grip the wheel, the prompt escalates to a beeping noise. If the driver still fails to apply torque to the steering wheel at that point, the vehicle may disable their use of Autopilot for up to several weeks.

Musk said in a December tweet that he would remove the “nag” for at least some Tesla owners in January. That plan never materialized. In April 2023, Musk said in a tweet, “We are gradually reducing it, proportionate to improved safety,” referring to the nag.

The security researcher who uncovered “Elon Mode,” and whose identity is known to Tesla and CNBC, asked to remain anonymous, citing privacy concerns.

He has been probing the capabilities of Tesla vehicles for years and owns a Tesla Model X. He has also regularly reported bugs to the company and has earned tens of thousands of dollars through successful Tesla bug bounty submissions, as previously reported.

The white hat hacker said in a direct-message interview on Tuesday that “unless you work at Tesla, or otherwise have access to the relevant databases at the company,” there is no way to know how many cars have access to “Elon Mode” today.

In February, Tesla voluntarily recalled 362,758 of its vehicles in the U.S., warning that its Full Self-Driving Beta system could cause crashes. (It was the second such recall.) Tesla delivered an over-the-air software update to address the issues.

According to the safety recall report, the FSD Beta system at that time could cause crashes by allowing affected vehicles to “act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”

GreentheOnly said he expects future recalls related to other FSD Beta issues, including how the system automatically stops for “traffic control devices” such as traffic lights and stop signs.

According to the latest data available from the National Highway Traffic Safety Administration, Tesla has reported 19 incidents to the agency that resulted in at least one fatality and in which the company’s driver assistance systems were engaged within 30 seconds of the collision.

In total, Tesla has reported 21 fatal accidents to NHTSA involving cars equipped with its driver assistance systems.

Tesla did not immediately respond to a request for comment.
