The role of Tesla Autopilot in the fatal crash of 2018 will be determined this week

Tesla’s Autopilot is once again in the government spotlight. On February 25th in Washington, DC, investigators from the National Transportation Safety Board (NTSB) will present the findings of their nearly two-year investigation into the fatal crash of Wei “Walter” Huang. Huang died on March 23rd, 2018, when his Tesla Model X hit the barrier between a left exit and the HOV lane on US-101, just outside of Mountain View, California. He was using Tesla’s advanced driver assistance feature, Autopilot, at the time of the crash.

It will be the second time that the NTSB has held a public meeting on an investigation into an Autopilot-related crash; in 2017, the NTSB found that a lack of “safeguards” contributed to the death of 40-year-old Joshua Brown in Florida. Tomorrow’s event also comes just a few months after the NTSB held a similar meeting in which it found Uber partially at fault for the death of Elaine Herzberg, the pedestrian who was killed in 2018 after being struck by one of the company’s self-driving test vehicles.

After investigators lay out their findings, the members of the board will vote on proposed recommendations and issue a final statement on the probable cause of the crash. Although the NTSB does not have the legal authority to implement or enforce those recommendations, they can be adopted by regulators. The entire meeting will be streamed live on the NTSB website starting at 9:30AM ET.

In the days prior to the meeting, the NTSB opened the public docket for the investigation, which contains the factual information collected by the agency’s investigators. (Among the findings: Huang had encountered problems with Autopilot in the same spot where he crashed, and he may have been playing a mobile game before the crash.) The NTSB also issued a preliminary report in June 2018 describing some of the earliest findings, including that Huang’s car steered itself toward the barrier and accelerated before hitting it.

Tesla admitted shortly after Huang’s death that Autopilot was engaged during the crash, but pointed out that he had “received several visual and one audible hands-on warning earlier in the drive” and claimed that Huang’s hands were not detected on the wheel for six seconds prior to the collision, which is why “no action was taken” to avoid the barrier.

That announcement led the NTSB to remove Tesla from the investigation for releasing information “before it was vetted and confirmed by” the board — a process that all parties must agree to when they sign on to NTSB investigations.

The newly released documents show that a number of factors likely contributed to Huang’s death, and Autopilot was just one of them. Determining the role that Tesla’s advanced driver assistance feature played is likely to be only part of what the investigators and the board will discuss. But since this is only the second time the NTSB has completed an investigation into a crash involving Autopilot, the conclusions of this investigation could be significant.

That said, this is what we expect from tomorrow’s meeting.

Image: NTSB

The crash

One of the first things that will happen after NTSB chairman Robert Sumwalt opens tomorrow’s meeting (and introduces those present) is that the lead investigator will walk through an overview of the crash.

Most details of the crash are well established, especially after the preliminary report was released in 2018. But the documents released last week provide a more complete picture.

At 8:53AM PT on March 23rd, 2018, Huang dropped off his son at kindergarten in Foster City, California, as he did most days. Huang then drove his 2017 Tesla Model X to US-101 and began the 40-minute drive south to his job at Apple in Mountain View, California.

On the way, he engaged Tesla’s driver assistance system, Autopilot. Huang had used Autopilot extensively since he bought the Model X at the end of 2017, investigators discovered. His wife told them he was “very familiar” with the feature, and he even watched YouTube videos about it. He also talked to colleagues about Autopilot, and his supervisor said that Huang — who was a software engineer — was fascinated by the software behind it.

Huang turned on Autopilot four times during that drive to work. The last time he activated it, he left it on for the 18 minutes leading up to the crash.

Tesla instructs owners (in the cars’ manuals and via the infotainment screens) to keep their hands on the wheel when using Autopilot. When Autopilot is active, the car also constantly checks whether the driver is applying torque to the steering wheel as a way of verifying that their hands are on it. (Tesla CEO Elon Musk has rejected more sophisticated driver monitoring systems, saying they are “ineffective.”)

If the car does not measure enough torque input on the wheel, it flashes an escalating series of visual and then audible warnings to the driver.

During that final 18-minute Autopilot session, Huang received a number of these warnings, according to data collected by investigators. Less than two minutes after activating Autopilot for the last time, the system gave him a visual and then an audible warning to put his hands on the wheel, which he did. A minute later, he received another visual warning. He received no further warnings during the final 13 to 14 minutes before the crash. But the data shows that the car did not measure steering input for about 34.4 percent of that final 18-minute Autopilot session.

Autopilot was still engaged as Huang approached the section of southbound US-101 where a left exit lane allows cars to merge onto State Route 85. As that lane splits off to the left, a “gore area” develops between it and the HOV lane. Eventually, a concrete median rises and acts as a barrier between the two lanes.

Huang was driving in the HOV lane, thanks to the clean air sticker granted to electric vehicle owners. Five seconds before the crash, as the exit lane split off to the left, Huang’s Model X began “following the lines” of the gore area between the HOV lane and the exit lane. Investigators found that Autopilot initially lost track of the HOV lane’s lines and then quickly picked up the lines of the gore area as if it were its own highway lane.

Huang had set his cruise control at 75 miles per hour, but he was following cars traveling closer to 62 miles per hour. As his Model X steered toward the barrier, it no longer detected cars ahead and began accelerating back toward 75 miles per hour.

A few seconds later, Huang crashed into the barrier. The large metal crash attenuator in front of the barrier, which is supposed to absorb some of a moving car’s kinetic energy, had been completely crushed 11 days earlier in another crash. Investigators found that the California transportation department had not repaired the attenuator, despite an estimated repair time of “15 to 30 minutes” and an average cost of “less than $100.” This meant that Huang’s Model X essentially collided with the concrete median behind the attenuator with most of its kinetic energy intact.

Image: NTSB

Despite the violent impact, Huang initially survived the crash. (He was also hit by another car from behind.) A number of cars stopped on the highway, and several people called 911. A couple of drivers and a motorcyclist approached Huang’s car and helped pull him out as they noticed the Model X’s battery hissing and popping. After struggling with his seatbelt, they were able to get Huang to relative safety before the Model X’s battery caught fire.

Paramedics performed CPR on Huang and gave him a blood transfusion as they brought him to a nearby hospital. He was treated for cardiac arrest and blunt pelvic trauma, but he died a few hours later.

Possible contributing factors

One of the new details that emerged in the documents released last week is that Huang may have been playing the mobile game Three Kingdoms while driving to work that day.

Investigators obtained Huang’s phone records from AT&T, and because he was using an iPhone development model, they were able to retrieve diagnostic data from his phone with the help of Apple. Looking at this data, investigators say they could identify a “pattern of active gameplay” every morning around the same time in the week before Huang’s death, although they noted that the data did not contain enough information to determine “whether [Huang] held the phone or how interactive he was with the game at the time of the crash.”

Another possible contributing factor in Huang’s death is the crash attenuator itself and the fact that it sat damaged for 11 days without being repaired. The NTSB even issued an early recommendation to officials in California back in September 2019, urging them to move faster when it comes to repairing attenuators.

The design of the left exit and the gore area in front of the attenuator is also something the board is likely to cite as contributing to Huang’s death. Huang had even struggled with Autopilot on that same section of highway a few times before his death, as the documents released last week revealed.

Huang’s family has said since his death that he had previously complained about how Autopilot would pull his car to the left near the spot where he eventually crashed. Investigators found two examples of this in the data from the month before his death.

On February 27th, 2018, data shows that Autopilot turned Huang’s steering wheel 6 degrees to the left, directing the car into the gore area between the HOV lane and the left exit. Huang’s hands were on the wheel, however, and two seconds later he turned the wheel back and stayed in the HOV lane. On March 19th, 2018 — the Monday before he died — Autopilot turned Huang’s steering wheel 5.1 degrees and steered the car into the same gore area. Huang turned the car back into the HOV lane a second later.

Huang had complained about this problem to one of his friends, a fellow Tesla owner. The two were discussing a new Tesla software update five days before the crash when Huang told the friend that Autopilot “almost brought me back to the median this morning.”

The role of the autopilot

Whether Autopilot played a role in Huang’s death (and if so, to what extent) will likely receive a lot of attention during Tuesday’s meeting.

The NTSB has already completed one investigation into a fatal crash in which Autopilot was in use, and it made recommendations based on that probe. But the circumstances of that crash were very different from Huang’s. In 2016, Joshua Brown was using Autopilot on a divided highway in Florida when a tractor trailer crossed his path. Autopilot failed to recognize the broad side of the trailer, and Brown did not take evasive action before colliding with it.

Image: NTSB

Autopilot’s design “allowed the driver’s excessive dependence on automation,” the NTSB wrote in its 2017 findings. (Tesla has said that overconfidence in Autopilot is the cause of many crashes that occur while the feature is on, though it continues to claim that driving with Autopilot reduces the risk of an accident.) The board wrote that Autopilot made it possible to disengage from the driving task for extended periods and “enabled the driver to use it in ways that are inconsistent with the manufacturer’s instructions and warnings.”

The NTSB, in turn, advised that Tesla (and any other automaker that offers similar advanced driver assistance systems) should add new safeguards that limit the misuse of features like Autopilot. The NTSB also recommended that companies like Tesla develop better ways to sense a driver’s engagement while features like Autopilot are in use.

Since Huang’s death, Tesla has increased the frequency and reduced the delay of warnings for drivers whose hands do not appear to be on the steering wheel while Autopilot is active. Whether the company has gone far enough will likely be discussed on Tuesday.

Why Tesla was kicked off the probe

On March 30th, 2018, a week after Huang died, Tesla announced that Autopilot was engaged during the crash. The company also claimed that Huang had not taken control of the car and that he had received several warnings in the minutes before the crash.

The NTSB was not happy that Tesla shared this information while the investigation was still ongoing. Sumwalt called Musk on April 6th, 2018, to tell him that this was a violation of the agreement Tesla had signed to be a party to the investigation. Tesla then issued another statement to the press on April 10th, which the NTSB considered “incomplete, analytical in nature,” and speculative about the cause of the crash. So Sumwalt called Musk again and told him that Tesla was being removed from the investigation.

Tesla claimed it withdrew from the probe itself because, as it told The Wall Street Journal, the company felt that “restrictions on disclosures could endanger public safety.” Whether this comes up in tomorrow’s meeting will be something else to watch for.

What is next?

One thing that will not be resolved on Tuesday is the lawsuit that Huang’s family filed against Tesla in 2019. The family’s lawyer argued last year that Huang died because “Tesla is beta testing its Autopilot software on live drivers.” That case is still ongoing.

The NTSB is likely to make a series of recommendations at the end of tomorrow’s hearing based on the probe’s findings. If it feels the need, it can also label a recommendation as “urgent.” The board may also comment on whether it believes Tesla has made progress on the recommendations it set out at the 2017 meeting on Brown’s fatal crash. Those recommendations include:

  • Crash data must be “recorded and available in standard formats on new vehicles equipped with automated vehicle control systems”
  • Manufacturers must “build in system safeguards to limit the use of automated control systems to the conditions for which they are designed” (and there must be a standard method to verify those safeguards)
  • Automakers need to develop ways to “sense the driver’s engagement more effectively and to warn when engagement is lacking”
  • Automakers must “report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems”

While Tesla helped the NTSB recover and process the data from Huang’s car, the company remains far more protective of crash data than other manufacturers. In fact, some owners have sued to gain access to that data. And while Tesla increased the frequency of Autopilot warnings after Huang’s crash, it has largely not changed how it monitors drivers using the feature. (Other companies, such as Cadillac, use methods like eye-tracking technology to ensure that drivers watch the road while using driver assistance features.)

The NTSB’s recommendations on Tuesday could build on that original guidance or even go beyond it. While nothing will change the fact that Walter Huang died in 2018, the agency’s actions on Tuesday could help shape the future of Autopilot. The NTSB also recently opened another investigation into an Autopilot-related death, and Autopilot is starting to face scrutiny from legislators. So whatever comes out of Tuesday’s meeting, it seems the spotlight on Autopilot is only getting brighter from here.