Tesla CEO Elon Musk on Wednesday accused Fortune magazine of inaccurately reporting Tesla Motors' handling of a fatal accident that involved a Tesla Model S. The incident took place in May. Musk's online post is the latest salvo in a war of words between Fortune and Tesla.
Musk accused Fortune, in essence, of jumping the gun in its reporting of Tesla's response to the accident. The magazine reported that Tesla Motors -- and Musk -- made a combined US$2 billion by selling stock on May 18 in a public offering without disclosing the crash, which had occurred on May 7.
Musk and his company knew about the accident, Fortune reported, yet essentially withheld the information prior to the stock transaction.
Tesla reportedly did contact the National Highway Traffic Safety Administration about the accident but did not disclose the information prior to the sale of the stock.
Tesla contacted the NHTSA on May 16, the company said. Tesla's own investigator was unable to get to Florida, where the accident occurred, until two days later -- the same day as the transaction. The company had not been able to access the vehicle's logs remotely due to damage from the accident, making it necessary for the investigator to travel to Florida to pull the complete vehicle logs in person.
It wasn't until the end of May that Tesla completed its investigation, the company said.
The accident reportedly involved a semi-tractor trailer that crossed both lanes of a divided highway, a maneuver that Tesla said would have been very difficult even for an attentive driver to respond to.
To date, Tesla has not received a product liability claim in relation to the crash, and the company further noted that its stock closed up, not down, on July 1. Its stock on Wednesday closed 0.2 percent higher at $214.44 per share.
Tesla late last month posted a response to the NHTSA's preliminary evaluation of the automaker's Autopilot, noting that the May accident was the first known fatality involving the technology.
It happened after Tesla cars had driven more than 130 million miles using the Autopilot technology, the company pointed out.
Further, the Autopilot driver-assist software is new technology in a public beta test, a status drivers are required to acknowledge before they can activate Autopilot behind the wheel, Tesla said.
Autopilot isn't a fully autonomous technology, the company explained. It allows a driver to relinquish some control to the Tesla computer -- maintaining vehicle speed, as traditional cruise control does, as well as performing smart lane changes and steering to stay within a lane.
An Autopilot-equipped car is still a long way from a self-driving car, in other words.
"'Autopilot' is an unfortunate name for a driver-assist feature," said Paul Teich, principal analyst at Tirias Research.
"It may have sounded great to Tesla's marketing department, but it is misleading," he told the E-Commerce Times. "It is more like a 'super cruise control.'"
More importantly, "the technology Tesla has deployed isn't what we currently are working on for self-driving cars," said Rob Enderle, principal analyst at the Enderle Group.
"It is enhanced cruise control, linked with accident avoidance and lane keeping; it has neither the rich set of sensors, which can see both line of sight and through objects -- Lidar/infrared -- nor the advanced computational capability that allows the system to read signs and identify objects," he told the E-Commerce Times.
"Used properly, it is safer than many people drive for short periods because it doesn't get distracted, but it also doesn't see particularly well and can't tell the difference between a person and a tree," Enderle added.
The NHTSA this spring held a pair of meetings to gather input as it developed guidelines for the safe development of automated safety technology. The meetings touched on a number of issues that will need to be resolved before any vehicle can truly be trusted to do the driving.
"There are a lot of technical concerns, even as Tesla, Google, Honda, Apple and others work to develop an autonomous vehicle," said Kyle Landry, research associate with the autonomous systems 2.0 team at Lux Research.
"These companies have suggested that autonomous vehicles will be on the road by 2020, but then we saw that Google's vehicle crashed into a bus, and now Tesla has had this fatal crash, so those incidents raise concerns," he told the E-Commerce Times.
"Caution in developing this technology too quickly is absolutely necessary," he added.
One of the factors that could have come into play in the accident involving the Model S and the tractor trailer is that Tesla's systems don't rely on Lidar, suggested Landry.
"The vehicle's other sensors may have been a factor," he added.
"The camera on the road points down, so it didn't 'see' the trailer, while the white color of the trailer and the bright sunny sky could have meant that the algorithms missed it as well," explained Landry. "This was a failure of the sensors; yet there isn't one enabling sensor outside the vehicle. You will still need cameras, radar, ultrasound, Lidar and even GPS with mapping so the car knows where it is on the road."
It isn't just the external sensors that will need refinement -- expectations regarding the technology's capabilities need to be managed as well.
"Completely diverting attention to another activity is not what Autopilot was designed for," said Tirias Research's Teich.
That is why the internal sensors may need to be refined as well, to ensure that a human driver is capable of taking over.
"With Tesla's system, the human is part of the redundancy, and this is where the systems failed," said Lux Research's Landry. "An autopilot system doesn't know if the driver is paying attention, so there really need to be inward-facing cameras, eye-tracking sensors, and other systems that can ensure that the driver is capable of taking control."
Drivers need to understand these facts. Otherwise, the potential consequences could be reminiscent of "the story where the guy who rented a motor home, thought cruise control was like an autopilot, and discovered it wasn't when he went into the back of the motor home to fix a snack," said Enderle.
"Some Tesla drivers seem to think Autopilot is more than it is," he remarked, "and for that reason, this will always end badly."