Tesla Autopilot Crash Under NHTSA Investigation


The National Highway Traffic Safety Administration has opened an inquiry into the autopilot system in Tesla's Model S, following the death of a driver who was using the system.

In a statement posted on the Tesla Motors website on June 30, the company acknowledged the inquiry and characterized the incident as "the first known fatality in just over 130 million miles where Autopilot was activated."

The NHTSA said in a statement Tesla had alerted the agency to the crash, which occurred on May 7 in Williston, Fla.

The Levy Journal Online, which covers Levy County, Fla., where the crash occurred, described the accident based on an account provided by the Florida Highway Patrol. A tractor-trailer was traveling west on US 27A and made a left turn onto NE 140 Court as the Tesla driver was heading in the opposite direction. The Tesla passed underneath the 18-wheeler and its roof collided with the truck. It then continued along the road before striking two fences and a utility pole.

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla said in its statement. "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

The failure of Tesla's computer vision system to distinguish the truck from the similarly colored sky appears to have been compounded by radar code designed to reduce false positives during automated braking. Asked on Twitter why the Tesla's radar didn't detect what its cameras missed, CEO Elon Musk responded, "Radar tunes out what looks like an overhead road sign to avoid false braking events."
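Musk's explanation can be illustrated with a toy heuristic. The sketch below is purely hypothetical and is not Tesla's code; the function name, parameters, and thresholds are all assumptions made for illustration. It shows how a filter tuned to ignore radar returns that look like stationary overhead structures could also discard a high-riding trailer.

```python
def should_brake(return_height_m, relative_speed_mps, own_speed_mps):
    """Toy decision: should a radar return trigger automatic braking?

    Hypothetical parameters:
      return_height_m    -- estimated height of the object's lower edge above the road
      relative_speed_mps -- closing speed toward the object
      own_speed_mps      -- the vehicle's own speed
    """
    OVERHEAD_CLEARANCE_M = 4.0  # assumed clearance typical of overhead signs

    # A return closing at roughly the vehicle's own speed is a stationary
    # object. If it also sits well above the roadway, a false-positive
    # filter would treat it as a sign or bridge and ignore it -- the same
    # rule could misclassify the high side of a trailer.
    is_stationary = abs(relative_speed_mps - own_speed_mps) < 0.5
    if is_stationary and return_height_m >= OVERHEAD_CLEARANCE_M:
        return False  # filtered out as a probable overhead structure
    return True
```

In this sketch, a stationary return at sign height is discarded while a low obstacle or a moving vehicle still triggers braking, which captures the trade-off Musk described between avoiding false braking events and detecting unusual obstacles.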

The driver of the Model S, identified in media reports as 40-year-old Joshua D. Brown of Canton, Ohio, died at the scene.

The driver of the truck, 62-year-old Frank Baressi, told the Associated Press that Brown was "playing Harry Potter on the TV screen" at the time of the crash.

A spokesperson for the Florida Highway Patrol did not immediately respond to a request to confirm details about the accident.

In its June 30 statement, Tesla said drivers who engage Autopilot are warned to keep both hands on the wheel at all times. Autopilot, despite its name, is intended as an assistive feature rather than an alternative to manual control.

The incident has stoked doubts about the viability of self-driving cars and the maturity of Tesla's technology. Clearly, a computer vision system that cannot distinguish a truck from the sky in certain lighting conditions needs further improvement. It was unclear at press time whether Tesla will face liability claims related to its software or sensing hardware.

However, Tesla insisted in its statement that, when Autopilot is used under human supervision, "the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

In April, at an event in Norway, Musk said, "The probability of having an accident is 50% lower if you have Autopilot on," according to Electrek.

That may be, but data isn't the only consideration. When human lives are at stake, perception and emotion come into play. Automated driving systems will have to be demonstrably better than human drivers before people trust them with their lives.

Yet perfection is too much to expect from autopilot systems. Machines fail, and fallible people are likely to remain in the loop. In aviation, automation is common and has prompted concerns that it degrades the skills pilots need when intervention is called for. If the same holds true for cars with autopilot systems, we can expect to become worse drivers, less able to respond to emergencies, even as our autopilot systems reduce fatalities overall.

There may be no getting around the fact that, given current vehicle designs, driving down a highway at high speed entails some degree of risk, whether a person or a computer is at the wheel.
