
Company insiders rip Tesla’s stance on safety in hard-hitting Elon Musk doc

If you own a Tesla, or a loved one does, or you’re thinking about buying one, or you share public roads with Tesla cars, you might want to watch the new documentary “Elon Musk’s Crash Course.”

Premiering Friday on FX and Hulu, the 75-minute fright show spotlights the persistent dangers of Tesla’s automated driving technologies, the company’s lax safety culture, Musk’s P.T. Barnum-style marketing hype and the weak-kneed safety regulators who seem not to care.

Solidly reported and dead-accurate (I’ve covered the company since 2016 and can attest to its veracity), the project, part of the ongoing “New York Times Presents” series, may well become a historic artifact of the what-the-hell-were-they-thinking variety.


The central through line is the story of Joshua Brown, a rabid Tesla fan and derring-do techno-geek beheaded when his Autopilot-engaged Tesla drove itself at full speed on a Florida highway underneath the trailer of a semi-truck in 2016.

Whatever lessons were learned at Tesla did not prevent an almost identical fatal crash, also in Florida, three years later. An unknown number of Autopilot-related crashes have occurred since — unknown to anyone but Tesla, which has the ability to track its cars through wireless connections — because the government’s decades-old process for collecting crash statistics is unfit for the digital age. The company is currently under investigation by federal safety regulators for an apparent tendency to crash into emergency vehicles parked by the side of the highway.

Here are four more key takeaways from “Elon Musk’s Crash Course.”

The New York Times Presents “Elon Musk’s Crash Course” Episode 1 airs Friday. (FX)

1. Tesla’s Autopilot feature did not receive adequate testing, ex-employees allege

The pressure to push Autopilot features out to customers fast, ready or not, was relentless, according to several former members of the Autopilot development team featured in the documentary. “There was no deep research phase” like at other driverless car companies, says one engineering program manager, with “customers essentially standing in for professional test drivers.”

The testimony of these developers is a standout feature of “Crash Course.” It’s rare to hear from insiders at Tesla because “free-speech absolutist” Musk makes employees sign strict nondisclosure agreements and enforces them with a vast army of well-paid lawyers.

When Brown’s car ran under the truck, the company said, the system mistook the side of the trailer for the bright sky, and blamed its camera provider. But inside Tesla, the Autopilot team was still struggling with how its software could distinguish a truck crossing a highway from an overhead bridge, says software engineer Raven Jiang: “The rate of learning wasn’t great. It was personally hard for me to believe the promise was going to be lived up to.”

The media were reporting on the Theranos fraud at the time, which provoked some “soul searching” for Jiang, who quit for another job. Akshat Patel, an Autopilot engineering program manager, echoes Jiang’s concern. If anyone considers Tesla “an example of scientific integrity, public responsibility, and reasoned and methodical engineering,” Patel says, “it is not.”

2. Fully autonomous Teslas are more science-fiction than reality

Tesla currently sells a feature for $12,000 called Full Self-Driving, which is not full self-driving at all. Fully autonomous cars available to individual buyers do not exist.

But that hasn’t stopped Musk from claiming, year after year after year, that autonomous Teslas are right around the corner.

Clip after clip in “Crash Course” puts his false claims on display.

2014: “No hands, no feet, nothing,” he says from the wheel of a Tesla. “The car can do almost anything.”

2015: Musk tells a crowd he is “quite confident” autonomy will be achieved within three years, to the point where “you could be asleep the whole time.”

2016: “I think we’re basically less than two years away from full autonomy,” he tells journalist Kara Swisher on a conference stage.

2017: “We’re still on track to be able to go cross country from Los Angeles to New York by the end of the year, fully automated.”

2018: “By the end of next year self-driving will be at least 100% to 200% safer than a person.”

2019: Buying a car that does not include Full Self-Driving is “like buying a horse.”

2022: In a black cowboy hat and sunglasses: “The car will take you anywhere you want ultimately 10 times safer than if you were driving it yourself. It’s just going to completely revolutionize the world.”

The doc also explains how Tesla manipulated a widely shared video of a Tesla driving itself through the streets of Palo Alto a few months after Brown’s death.

The revenue from Autopilot and Full Self-Driving, it should be noted, is largely responsible for Tesla hitting the compensation targets that have made Musk so rich.

Engineer Raven Jiang in “Elon Musk’s Crash Course.” (FX)

3. Musk’s fans don’t hold back. Even on camera

Among his 94 million Twitter followers, Musk has attracted a particularly rabid fan base, to which “Crash Course” gestures on occasion. (One tweet reads, “Elon is the Lord.”) But the documentary doesn’t go deep into why so many people seem so enthralled by him. That’s the realm of speculation, or perhaps psychology.

Still, there are some choice quotes from the Tesla fans and Musk supporters who sit for interviews, apparently unaware of the irony:

“I think Elon wants to make a dent in the world.”

“Any company would kill to have that level of fandom and devotion.”

“He has the resources that allow him to do things that would be irresponsible or insane to anybody else.”

4. Regulatory failures are part of the problem

Partway through “Crash Course,” viewers might begin wondering: Where are the safety regulators?

Great question. The National Highway Traffic Safety Administration investigated the Brown crash, found that Autopilot had somehow missed the broad side of a truck directly in front of the car, and yet concluded there was no defect, giving Tesla a pass.

“I was a little bit dumbfounded,” New York Times reporter Neal Boudette tells the camera. “The system couldn’t see the tractor trailer? And that’s not a defect?”

A communications official for NHTSA at the time attempts to explain: “It’s a little complicated and almost counterintuitive, right? Autopilot didn’t even engage to try to stop the crash. But the fact of the matter is, Autopilot wasn’t designed to stop every crash in every instance.”

The fact of the matter, too, is that several top NHTSA officials from both the Obama and Trump administrations went on to take jobs in the driverless car industry.

NHTSA under Biden and Transportation Secretary Pete Buttigieg is getting tougher with Tesla on access to data, and its investigations remain ongoing.

Meanwhile, a high-speed Tesla crashed into a Newport Beach highway construction site on May 12. Three people were killed. Were Autopilot or Full Self-Driving involved? Police are looking into the incident, and NHTSA has opened an investigation.

‘The New York Times Presents: Elon Musk’s Crash Course’

Where: FX

When: Saturday, 10 p.m.

Streaming: Hulu, any time, starting Saturday

Rating: TV-MA (may be unsuitable for children under the age of 17)
