Elon Musk's Autopilot system is 'fundamentally flawed': Former Tesla employees

Former employees have told The New York Times that Tesla may have undermined safety by designing its Autopilot driver-assistance system to fit the vision of its chief executive, Elon Musk.

"Unlike technologists at almost every other company working on self-driving vehicles, Mr. Musk insisted that autonomy could be achieved solely with cameras tracking their surroundings. But many Tesla engineers questioned whether it was safe enough to rely on cameras without the benefit of other sensing devices — and whether Mr. Musk was promising drivers too much about Autopilot’s capabilities," writers Cade Metz and Neal E. Boudette mused in the story.

The National Highway Traffic Safety Administration is investigating Tesla "after at least 12 accidents in which Teslas using Autopilot drove into parked fire trucks, police cars and other emergency vehicles, killing one person and injuring 17 others...Families are suing Tesla over fatal crashes, and Tesla customers are suing the company for misrepresenting Autopilot and a set of sister services called Full Self Driving, or F.S.D.," the article read.

“Where I get concerned is the language that’s used to describe the capabilities of the vehicle,” said Jennifer Homendy, chairwoman of the National Transportation Safety Board. “It can be very dangerous.”

The hardware has also come under scrutiny, with former employees raising safety concerns about the sensors the cars rely on.

"Within Tesla, some argued for pairing cameras with radar and other sensors that worked better in heavy rain and snow, bright sunshine and other difficult conditions," the writers reported. "For several years, Autopilot incorporated radar, and for a time Tesla worked on developing its own radar technology. But three people who worked on the project said Mr. Musk had repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone."

In early November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of new F.S.D. features, after deploying a software update that the company said might cause crashes because of unexpected activation of the cars’ emergency braking system, the Times reported.

Schuyler Cullen, who oversaw the team exploring autonomous-driving possibilities at the South Korean tech giant Samsung, said Musk's cameras-only approach was fundamentally flawed.

“Cameras are not eyes! Pixels are not retinal ganglia! The F.S.D. computer is nothing like the visual cortex!” Cullen said.

Amnon Shashua, chief executive of Mobileye, said Musk’s idea of using only cameras in a self-driving system could ultimately work, but that the technology is not there yet.

“One should not be hung up on what Tesla says,” Shashua said. “Truth is not necessarily their end goal. The end goal is to build a business.”