
It was all predictable. In the wake of the fatal crash of a Tesla running in Autopilot mode, the media asked an obvious question: Would Tesla disable Autopilot? The answer from CEO Elon Musk was equally obvious: Autopilot stays active. But Tesla says it will be more aggressive in telling drivers about the limits of Autopilot.

The Feds, meanwhile, have gotten involved and have asked Tesla for an avalanche of paperwork and information about Autopilot usage, close calls, and driver engagement. That alone may be punishment enough for Tesla getting out in front on self-driving.


A system that "would save lives"

This week, Elon Musk told the Wall Street Journal, "A lot of people don't understand what it [Autopilot] is and how you turn it on," adding that Tesla brought Autopilot to market, and plans to keep it available, because "we knew we had a system that on balance would save lives."

Tesla plans a blog post (one of the company's ways of keeping in touch with owners) further explaining the nuances of Autopilot and what it can and cannot do.


Insights on the Tesla blog

Meanwhile, on the Tesla blog, Tesla provided some more insight into Autopilot. The Tesla post was a swipe at a July 5 Fortune article noting that Tesla (the company and its CEO, Musk) sold $2 billion of Tesla stock after the May 7 fatal crash that killed Joshua Brown while Autopilot was switched on in his Model S, but weeks before publicly announcing the fatal accident. Tesla says:

Here's what we did know at the time of the accident and subsequent [stock] filing:

1. That Tesla Autopilot had been safely used in over 100 million miles of driving by tens of thousands of customers worldwide, with zero confirmed fatalities and a wealth of internal data demonstrating safer, more predictable vehicle control performance when the system is properly used.

2. That contrasted against worldwide accident data, customers using Autopilot are statistically safer than those not using it at all.

3. That given its nature as a driver assistance system, a collision on Autopilot was a statistical inevitability, though by this point, not one that would change the conclusion already borne out over millions of miles that the system provided a net safety benefit to society.

Tesla goes on to say that Fortune made "false assumptions," including:

that this accident was caused by an Autopilot failure. To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle's position in lane and adjusts the vehicle's speed to match surrounding traffic.

All this is in the context of Tesla saying the first Autopilot-on fatal crash (in 130 million driving miles) was not material to investors planning to buy Tesla stock, and Fortune suggesting it probably was. Fortune also dinged NHTSA, which "sat on the news — of possible interest to the driving public, wouldn't you say? — until announcing it June 30 … almost eight weeks after the accident."
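For a rough sense of the rate comparison behind those numbers, here is a minimal back-of-envelope sketch. The 130-million-mile figure comes from this article; the 94-million-mile U.S. baseline is the average Tesla cited in its own June 30 statement and is not given here, so treat it as an outside assumption used only for illustration.

```python
# Back-of-envelope fatality-rate comparison (per 100 million miles driven).
# The 130-million-mile Autopilot figure is from the article. The 94-million-mile
# U.S. baseline is the average Tesla cited in its June 30 statement; it is an
# outside figure used here only as an illustrative assumption.

AUTOPILOT_MILES = 130e6          # miles driven on Autopilot before the first known fatality
AUTOPILOT_FATALITIES = 1
US_MILES_PER_FATALITY = 94e6     # Tesla-cited U.S. average (assumption, not from this article)

autopilot_rate = AUTOPILOT_FATALITIES / AUTOPILOT_MILES * 100e6
us_rate = 1 / US_MILES_PER_FATALITY * 100e6

print(f"Autopilot: {autopilot_rate:.2f} fatalities per 100 million miles")
print(f"US average (Tesla-cited): {us_rate:.2f} fatalities per 100 million miles")
```

Counted that way, the single Autopilot fatality works out to roughly 0.8 per 100 million miles against roughly 1.1 for the cited U.S. average; that is arithmetic on two headline numbers, not a controlled comparison.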

NHTSA's demand for information

The National Highway Traffic Safety Administration last week sent Tesla a nine-page request for documents and data about 2015 Tesla Model S vehicles to learn more about the cars' automatic emergency braking, Autosteer, and crash avoidance systems. It's the kind of request that may make the public feel better that someone is checking up, and it may make businessmen weep over the extreme level of detail NHTSA is seeking.

NHTSA wants to know about design changes and updates made to Autopilot since it was first made available (as an over-the-air update) in 2015. It also wants to know about automated emergency braking events with adaptive cruise control activated but not Autosteer, then with Autosteer activated, and then with neither cruise control nor Autosteer enabled, as well as the total number of put-your-hands-on-the-wheel Autosteer warnings and the warnings "that escalated to a reduction in power." It also wants reports of crashes, lawsuits filed, and results of any arbitration proceedings, along with the results of Tesla crash reconstructions and Tesla's assessment of systems that did not activate in crashes.

NHTSA calls this a request for information, although the letter ends with a warning of civil penalties if Tesla doesn't provide the data.