U.S. to Probe Tesla’s ‘Full Self-Driving’ System After Death

DETROIT — The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.

A message was left early Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

Last week Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk said the company plans to have fully autonomous vehicles running without human drivers next year, and robotaxis available in 2026.

The agency also said it would look into whether any other similar crashes involving “Full Self-Driving” have happened in low-visibility conditions, and it will seek information from the company on whether any software updates affected the system’s performance in those conditions.

“In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the documents said.

Tesla has twice recalled “Full Self-Driving” under pressure from the agency, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because it disobeyed other traffic laws.

Critics have said that Tesla’s system, which relies only on cameras to spot hazards, lacks the sensors needed to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or in poor visibility.
