Feds Target Tesla Over Autopilot & Full Self Driving Claims


Words have consequences, especially when uttered by someone who has an enormous public following. For almost ten years, regulators have been wrangling with Elon Musk over his claims that Tesla automobiles can drive themselves with little to no input from human drivers. The name “Autopilot” has been controversial from the start, as many contend it lulls drivers into a false sense of security. Musk, in his own inimitable fashion, has refused to consider changing the name to something less controversial.

Full Self Driving implies the system is capable of Level 4 autonomous driving, which it clearly is not. It may be good, it may even be very good, but it is at best a Level 2+ system. Even Tesla admits drivers must “supervise” it, which confuses the situation even more. A year ago, the National Highway Traffic Safety Administration (NHTSA) issued a directive that required Tesla to “recall” all of its cars sold in the US that had the “Autosteer on City Streets” feature installed (“recall” in this case meant that it had to roll out a software update). NHTSA describes the defect as follows:

“FSD Beta is an SAE Level 2 driver support feature that can provide steering and braking/acceleration support to the driver under certain operating limitations. With FSD Beta, as with all SAE Level 2 driver support features, the driver is responsible for operation of the vehicle whenever the feature is engaged and must constantly supervise the feature and intervene (e.g., steer, brake or accelerate) as needed to maintain safe operation of the vehicle.

“In certain rare circumstances and within the operating limitations of FSD Beta, when the feature is engaged, the feature could potentially infringe upon local traffic laws or customs while executing certain driving maneuvers in the following conditions before some drivers may intervene: 1) traveling or turning through certain intersections during a stale yellow traffic light, 2) the perceived duration of the vehicle’s static position at certain intersections with a stop sign, particularly when the intersection is clear of any other road users, 3) adjusting vehicle speed while traveling through certain variable speed zones based on detected speed limit signage and/or the vehicle’s speed offset setting that is adjusted by the driver, and 4) negotiating a lane change out of certain turn-only lanes to continue traveling straight.”

Tesla pushed out a software update it said would satisfy NHTSA. Since then, there has been a fair amount of wrangling between the agency and the company over whether the update addressed the concerns stated in the recall announcement. At the end of 2023, NHTSA required Tesla to roll out another update, this one designed to prevent drivers from engaging the Autosteer feature on certain roads, especially those with cross traffic. Now the government believes the company has been less than forthcoming with new data demonstrating the effectiveness of that update.

According to CNBC, the agency sent Tesla a letter on May 6, 2024, notifying the company that it had until July 1 to provide NHTSA with the information it wants or face fines of up to $135.8 million. Since the recall was completed earlier this year, at least 20 Tesla vehicles have been involved in crashes in which the system was thought to be in use, according to the NHTSA website.

The “recall remedy” probe follows a three-year-long investigation by the agency that found safety issues with Tesla Autopilot contributed to at least 467 collisions and 14 deaths from January 2018 through August 2023. NHTSA concluded that drivers involved in those crashes “were not sufficiently engaged in the driving task and that the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task.”

Driver monitoring systems in Tesla vehicles include torque sensors in the steering wheel that detect whether drivers are keeping their hands on the wheel, and in-cabin cameras that monitor a driver’s eyes. Regulators say those systems should prompt an inattentive driver to pay attention and stay ready to steer or brake at any time.

NHTSA is seeking detailed crash data from Tesla since the agency completed the Autopilot recall, including data and video stored in or streamed from its cars and retained by the company. It is also asking for records about Tesla’s engineering teams and their approach to “safety defect determination decision making,” “issue investigation,” “action design including human factors considerations (initial and modifications),” and “testing.”

The Justice Department May Be Investigating Tesla

Reuters, the news organization Tesla supporters love to hate, reports that prosecutors in the US are looking into whether Tesla committed securities or wire fraud, according to three anonymous sources it spoke with. While Tesla has warned drivers to stay ready to take over driving, the Justice Department is examining statements by Tesla and Elon Musk suggesting its cars can drive themselves.

The investigators are said to be looking into whether Tesla committed wire fraud — which involves deception in interstate communications — by misleading consumers about its driver assistance systems, those sources said. Another focus of the investigation involves whether Tesla committed securities fraud by deceiving investors. One source claimed the Securities and Exchange Commission (SEC) is also investigating Tesla over its representations about driver assistance systems to investors.

Tesla did not respond to a request for comment from Reuters but did acknowledge last October that the Justice Department had asked the company for information about its Autopilot and Full Self Driving systems. Reuters is careful to point out that an investigation is not evidence of wrongdoing and could result in criminal charges, civil sanctions, or no action at all. Prosecutors are far from deciding how to proceed, one of the sources said, in part because they are sifting through voluminous documents Tesla provided in response to subpoenas.

Elon often gets out over his skis when it comes to the semi-autonomous capabilities of Tesla automobiles. For years the company posted videos on its website that purported to show Teslas driving themselves, declaring, “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.” Yet a Tesla engineer testified in 2022, in a lawsuit over a fatal crash involving Autopilot, that one of those videos, posted in October 2016, did not accurately portray the true capabilities of the Autopilot system at the time. Musk nevertheless posted the video on social media, writing: “Tesla drives itself (no human input at all) thru urban streets to highway streets, then finds a parking spot.” During an earnings call in October 2022, Musk touted an upcoming FSD upgrade he said would allow customers to travel “to your work, your friend’s house, to the grocery store without you touching the wheel.”

Musk, Tesla, And The Law

In the US, people selling things are allowed to refer to their products in glowing terms, a practice known as “puffing.” Courts in the US have consistently ruled that “puffery” or “corporate optimism” regarding product claims does not amount to fraud. In 2008, a federal appeals court ruled that statements of corporate optimism alone do not demonstrate that a company official intentionally misled investors. “Mere failure to realize a long term, aspirational goal is not fraud,” Tesla lawyers said in a 2022 court filing.

Prosecutors scrutinizing Tesla’s autonomous driving claims are proceeding with caution, recognizing the legal hurdles they face, the people familiar with the inquiry said. They will need to demonstrate that Tesla’s claims crossed a line from legal salesmanship to material and knowingly false statements that unlawfully harmed consumers or investors, three legal experts uninvolved in the probe told Reuters.

Justice Department officials will likely seek internal Tesla communications as evidence that Musk or others knew they were making false statements, said Daniel Richman, a Columbia Law School professor and former federal prosecutor. That is a challenge, he said, but the safety risk involved in overselling self-driving systems also “speaks to the seriousness with which prosecutors, a judge and jury would take the statements.”

Just how serious those risks are was made clear last month, when police in the state of Washington arrested a man on suspicion of vehicular homicide after his Tesla struck and killed a motorcyclist while the driver looked at his phone with Autopilot engaged. In a probable cause statement, a trooper cited the driver’s “admitted inattention to driving, while on autopilot mode … putting trust in the machine to drive for him.” In Washington state, a driver remains “responsible for the safe and legal operation of that vehicle” regardless of its technological capabilities, a state patrol spokesperson told Reuters.

The same month, NHTSA launched an investigation into whether a Tesla recall of more than 2 million vehicles in December adequately addressed safety issues with Autopilot. That investigation found “a critical safety gap between drivers’ expectations” of Tesla’s technology “and the system’s true capabilities,” according to agency records. “This gap led to foreseeable misuse and avoidable crashes.”

The Takeaway

In the midst of all this backing and forthing over autonomous driving features, Elon Musk has been resolute in his support for Autopilot, Autosteer, and Full Self Driving. To some, he has been playing a dangerous game for years by using cars equipped with those systems to conduct beta testing on public roads without seeking permission from state or federal regulators. Musk always seems to take an “it’s better to beg for forgiveness than to ask for permission” approach.

The motorcyclist killed recently in Washington did not consent to be part of Musk’s beta test. We can be fairly certain lawyers for his estate will file a claim against Tesla for contributing to his demise. Tesla recently settled a lawsuit brought by the estate of Walter Huang, a former Apple engineer who was killed when his Tesla veered into a highway barrier.

In that case, there were allegations that the deceased was using his cell phone at the time of the crash. In addition, the State of California may have been negligent in failing to repair the highway barrier in a timely fashion after a prior crash at the same location. Nevertheless, Tesla chose not to let the case proceed to trial. The inference is that, had the litigation continued, damaging information about its semi-autonomous systems might have become public, information that other litigants could have used in their own claims against Tesla.

There may be nothing to the Justice Department investigation, but with all the upheaval at Tesla in the past few weeks, suggestions of wire fraud and securities law violations are the last thing the company needs. Musk is doing a great job of undermining confidence in the company all by himself. He doesn’t need any help.

