Tesla Full Self-Driving requires human intervention every 13 miles

An independent automotive testing company has evaluated Tesla FSD and found some concerning results.

PonyWang/Getty Images

Tesla’s controversial “Full Self-Driving” is now capable of some quite advanced driving. But that capability can breed undeserved complacency, according to independent testing. The partially automated driving system exhibited dangerous behavior that required human intervention more than 75 times over the course of more than 1,000 miles (1,600 km) of driving in Southern California, averaging one intervention every 13 miles (21 km).

AMCI Testing evaluated FSD builds 12.5.1 and then 12.5.3 across four different environments: city streets, rural two-lane highways, mountain roads, and interstate highways. And as its videos show, at times FSD was capable of quite sophisticated driving behavior, like pulling into a gap between parked cars to let an oncoming vehicle through, or moving left to give space to pedestrians waiting at a crosswalk for the light to change. AMCI also praised FSD for how it handled blind curves out in the countryside.

“It’s undeniable that FSD 12.5.1 is impressive, for the vast array of human-like responses it does achieve, especially for a camera-based system,” said Guy Mangiamele, director of AMCI Testing.

“But its seeming infallibility in anyone’s first five minutes of FSD operation breeds a sense of awe that unavoidably leads to dangerous complacency. When drivers are operating with FSD engaged, driving with their hands in their laps or away from the steering wheel is incredibly dangerous. As you will see in the videos, the most critical moments of FSD miscalculation are split-second events that even professional drivers, operating with a test mindset, must focus on catching,” Mangiamele said.

The dangerous behavior encountered by AMCI included driving through a red light and crossing over into the oncoming lane on a curvy road while another car was headed toward the Tesla. Making matters worse, FSD’s behavior proved unpredictable—perhaps a consequence of Tesla’s reliance on the probabilistic black box that is machine learning?

“Whether it’s a lack of computing power, an issue with buffering as the car gets ‘behind’ on calculations, or some small detail of surrounding assessment, it’s impossible to know. These failures are the most insidious. But there are also continuous failures of simple programming inadequacy, such as only starting lane changes toward a freeway exit a scant tenth of a mile before the exit itself, that handicaps the system and casts doubt on the overall quality of its base programming,” Mangiamele said.


