October 23, 2024

TechNewsInsight


US investigating Tesla's 'full self-driving' system after pedestrian killed: NPR


A Tesla car logo is photographed at the Paris Motor Show in Paris on October 14, 2024.

Michel Euler/AP

DETROIT – The U.S. government's road safety agency is investigating Tesla's “full self-driving” system after receiving reports of crashes in low visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration (NHTSA) said in documents that it opened the investigation on Thursday after the company reported four crashes in which Tesla vehicles encountered sun glare, fog and airborne dust.

In addition to the pedestrian's death, another of the crashes injured a pedestrian, the agency said.

Investigators will look at whether “full self-driving” can “detect and respond appropriately to low-visibility conditions on the road, and if so, what conditions contributed to these crashes.”

The investigation covers nearly 2.4 million Tesla vehicles from 2016 through 2024.

A message was left on Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

Last week, Tesla held an event at a Hollywood studio to unveil a fully self-driving robotaxi with no steering wheel or pedals. Tesla CEO Elon Musk, who has promised self-driving vehicles before, said the company plans to have self-driving Models Y and 3 operating without human drivers next year. He said robotaxis without steering wheels will be available in 2026, starting in California and Texas.


The impact of the investigation on Tesla's self-driving ambitions is not clear. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and that is unlikely to happen while the investigation is in progress. But if the company tries to deploy its self-driving software in existing models, that effort would likely fall under state regulations. There are no federal regulations focused specifically on autonomous vehicles, although such vehicles must meet broader safety rules.

NHTSA also said it would look into whether any other similar incidents involving “full self-driving” occurred in low-visibility conditions, and would request information from the company about whether any updates affected the system's performance in those conditions.

“In particular, this review will evaluate the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their impact on safety,” the documents said.

Tesla reported the four incidents to NHTSA under an order from the agency covering all automakers. The agency's database says the pedestrian was killed in Rimrock, Arizona, in November of 2023 after being struck by a 2021 Tesla Model Y. Rimrock is about 100 miles (161 kilometers) north of Phoenix.

The Arizona Department of Public Safety said in a statement that the crash happened after 5 p.m. on November 27 on Interstate 17. Two vehicles had collided on the highway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help direct traffic. A red Tesla Model Y then hit the 4Runner and one of the people who had gotten out. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.


Raul Garcia, the department's public information officer, said sun glare was in the Tesla driver's eyes at the time of the collision, so the driver was not charged. He added that sun glare was also a contributing factor in the first crash.

Tesla has twice recalled its “full self-driving” system under pressure from NHTSA, which in July requested information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because it did not comply with other traffic laws. Both issues were to be fixed with over-the-air software updates.

Critics said Tesla's system, which only uses cameras to detect hazards, does not have adequate sensors for fully autonomous driving. Almost all other companies working on self-driving vehicles use radar and laser sensors as well as cameras to see better in darkness or poor visibility conditions.

Musk has said that because humans drive by sight alone, cars should be able to drive with cameras alone. He has described lidar (light detection and ranging), which uses laser beams to detect objects, as a “fool's errand.”

The “full self-driving” recalls came after a three-year investigation into crashes in which Teslas using the less-sophisticated Autopilot system struck emergency vehicles and other vehicles parked on highways, many with warning lights flashing.

That investigation was closed last April after the agency pressured Tesla into recalling its cars to strengthen a weak system that makes sure drivers are paying attention. A few weeks after the recall, NHTSA began investigating whether the fix had worked.


NHTSA began its Autopilot investigation in 2021, after receiving 11 reports that Tesla vehicles using Autopilot had crashed into parked emergency vehicles. In documents explaining why that investigation was ended, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths. Autopilot is an advanced version of cruise control, while Musk has billed “full self-driving” as capable of driving without human intervention.

The investigation opened Thursday enters new territory for NHTSA, which previously viewed Tesla's systems as assisting drivers rather than driving the cars themselves. With the new probe, the agency is focusing on the capabilities of “full self-driving” rather than just on making sure drivers are paying attention.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the previous investigation into Autopilot did not look into why Tesla cars did not see and stop for emergency vehicles.

“Before, they put the burden on the driver instead of the car,” he said. “Here they say these systems are unable to adequately detect safety risks whether drivers are paying attention or not.”