Tesla: No Plans to Disable Autopilot Feature in Its Cars
Mr. Musk, in an interview with Mike Ramsey and Jonathan Bach of The Wall Street Journal, said the company is planning an explanatory blog post that highlights how Autopilot works and what drivers are expected to do after they activate it. “A lot of people don’t understand what it is and how you turn it on,” Mr. Musk said.
Tesla’s co-founder pushed hard to launch the Autopilot feature as soon as possible because “we knew we had a system that on balance would save lives.” While many auto makers offer automatic braking, steering assist or adaptive cruise control to aid drivers, Tesla’s system steers the car more actively, and the company has marketed it more aggressively.
The safety of Tesla’s Autopilot is under scrutiny in the wake of a May 7 crash in Florida that killed 40-year-old Joshua Brown, a Model S owner who was using the self-driving system at the time of the accident. The National Highway Traffic Safety Administration in June said it had opened a preliminary investigation into the crash.
NHTSA on Tuesday disclosed a nine-page letter requesting documents and details of additional crashes involving Tesla’s Autopilot as part of its ongoing probe. Regulators are homing in on emergency braking and forward-collision warning functions that allegedly didn’t respond as expected before the May 7 crash.
A spokesman for the auto-safety regulator characterized the July 8 request as a standard step and said “NHTSA has not made any determination about the presence or absence of a defect in the subject vehicles.”
Tesla confirmed it received the letter and said it is cooperating.
In its letter to Tesla, the agency included a questionnaire seeking details on Autopilot’s design and engineering, and reports of crashes, deaths, injuries or other claims related to the technology.
Mr. Brown’s crash “calls for an examination of the design and performance of any driving aids in use,” regulators said in an earlier document opening their probe. Tesla’s responses to some questions from the more recent information request are due July 29; others are due Aug. 26.
Tesla called Autopilot a beta feature when it launched last year and designed the system to be off by default until a driver activates it. “It says beta specifically so people do not become complacent,” Mr. Musk said. He said disclaimers provided to drivers are “written in super plain language.” Some customers have said Tesla’s warnings should be clearer and more prominent, and that Autopilot didn’t perform as they expected before crashes.
The company released some data related to Autopilot when NHTSA’s preliminary investigation was disclosed nearly two weeks ago. The Palo Alto, Calif., auto maker said the May 7 accident was the first fatal crash in more than 130 million miles driven with Autopilot since the system made its debut in October. The car’s system failed to distinguish the white side of a crossing truck’s trailer from a bright sky, so the vehicle’s automatic emergency braking didn’t activate, Tesla said. Autopilot allows cars to drive themselves under certain circumstances, though Tesla warns motorists that the technology doesn’t make vehicles autonomous and that they should remain engaged behind the wheel.
NHTSA’s July 8 letter requests details of Tesla’s own investigations and reconstruction of the May 7 incident.
Since NHTSA disclosed the investigation in late June, there have been at least two crashes in which Tesla drivers say Autopilot was engaged.
In the most recent, the driver of a Tesla Model X SUV told local authorities the feature was active when the vehicle crashed into railing wires along the side of Montana State Highway 2 near Whitehall on Saturday.
The driver was en route from Seattle to Yellowstone National Park, according to Montana Highway Patrolman Jade Shope. The Model X hit the railing and traveled for 200 feet before moving back onto the roadway, Trooper Shope said. It is unclear if the driver had his hands on the wheel when the accident occurred, he said.
A Tesla spokeswoman said the car had its autosteer feature enabled and that data suggests the driver’s hands weren’t on the steering wheel. Failing to periodically place hands on the wheel violates terms drivers agree to when enabling the feature, she said. The technology reminded the driver to put his hands on the wheel shortly before the crash, and Tesla advises against using autosteer at high speeds or on undivided roads such as the one in the Montana crash, she said.
An earlier incident involved Albert Scaglione, of Farmington Hills, Mich., who crashed his Model X into a guardrail on the Pennsylvania Turnpike while driving outside Pittsburgh on July 1.
“We were on Autopilot,” Mr. Scaglione told The Wall Street Journal on Tuesday. He said he was hospitalized for a few days and treated at two different trauma centers. Tesla said last week it had “no reason to believe that Autopilot had anything to do with this accident,” based on information it had at the time. The auto maker said it hadn’t received data about the vehicle’s controls, possibly because of a damaged antenna.
Mr. Scaglione said he is waiting for the black box results from Tesla and NHTSA before speaking further on the matter.
Pennsylvania State Police cited Mr. Scaglione for careless driving and failing to safely stay in his traffic lane, said Cpl. Adam Reed.
Comment:
As a young "senior" and retired airline pilot, I have always embraced technology, from all my Apple "I" things to my desire to fly the newest technology in aircraft design in the '90's. I volunteered to check out as Captain on the new generation Airbus 300 series narrow-body jets my company acquired as the first American carrier to fly them. I didn't get my chance until a few years later, but it was still a steep learning curve when I did. The Airbus system was the first highly computerized flight guidance and management system in non-military aircraft. And for "old heads" totally new and sometimes confusing. But the training, while also imperfect, kept the public safe as we all rapidly got comfortable with what ultimately was a very reliable third pilot on the side stick controller (no old style wheel or yoke on a fly by wire/computer jet). But when we did get comfortable, another problem arose. Technology Complacency.
So not only does an operator have to learn the limits of auto "drive" technology, how to operate it, when to operate it, and its potential dangers; he or she also needs the discipline and training never to depend on it and to constantly monitor it for reliability and accurate operation. Frankly, in flight there are very few conflicts: no other aircraft just feet away, no buildings, no stray dogs. Even the most critical autopilot operation, autoland, does not rise to the complexity of maneuvering a car on an urban highway.
Because of my flying experience, I am very skeptical that car autopilots are ready for the road. I think it's a dangerous mistake to introduce them so rapidly for marketing purposes, without a regulatory framework that addresses not only the safety of the systems but also the fact that the human element is an integral and vital component, and that multiple fail-safes and training are required.