## Tesla’s Full Self-Driving Still Requires Human Intervention, Study Finds

**Los Angeles, CA, September 26, 2024** – A new study by AMCI Testing has revealed that Tesla’s Full Self-Driving (FSD) system still requires significant human intervention to prevent potentially dangerous situations. Over more than 1,000 miles of driving in Southern California, drivers had to intervene more than 75 times to correct the system’s actions, averaging one intervention every 13 miles.

The study tested Tesla models equipped with FSD builds 12.5.1 and 12.5.3 on a variety of road types, including city streets, highways, and mountain roads. Among the most alarming behaviors observed were running red lights and veering into oncoming traffic on curves.

“These failures are the most insidious,” said Guy Mangiamele, director of AMCI Testing, in an interview with Ars Technica. “There are also continuous failures of simple programming inadequacy, such as only starting lane changes toward a freeway exit a scant tenth of a mile before the exit itself.”

While the study highlighted these concerning flaws, it also acknowledged FSD’s impressive abilities in certain scenarios, such as navigating tight spaces and handling blind curves.

Despite repeated attempts to reach Tesla for comment on the study’s findings, the company has not yet responded.

The study raises further questions about the safety and reliability of Tesla’s FSD system, especially in light of the company’s ongoing claims of fully autonomous driving capability. As the technology continues to evolve, it remains crucial to ensure that its real-world performance keeps all road users safe.
