Feds reviewing autonomous truck mishap
Was the incident in which a TuSimple truck struck a concrete median divider simply human error, or does it point to a larger issue with the company, or with self-driving trucks in general? That’s the question surrounding an April incident that has attracted the attention of the Federal Motor Carrier Safety Administration (FMCSA).
According to published reports, FMCSA has launched what it described in a May 26 letter to TuSimple as a “safety compliance investigation.” The letter referenced the accident.
The April 6 incident in Tucson, Ariz., happened as the truck was being operated on the highway, within its mapped “operational design domain,” according to TuSimple’s self-report to the National Highway Traffic Safety Administration (NHTSA).
“The driver and test engineer attempted to engage the automated driving system. However, the ADS was not functional at that moment due to the computer unit not having been initialized and should not have been attempted to be activated,” TuSimple said.
The truck abruptly veered left because a person in the cab hadn’t properly rebooted the autonomous driving system before engaging it, causing it to execute an outdated command, according to industry reports. The left-turn command was two and a half minutes old, “an eternity in autonomous driving,” and should have been erased from the system but wasn’t, according to an internal account.
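To illustrate the kind of stale-command check that account describes, here is a minimal sketch in Python. The names and the 100-millisecond freshness window are illustrative assumptions, not TuSimple’s actual design; the point is simply that a command issued minutes earlier should be dropped rather than executed.

```python
from dataclasses import dataclass
import time

# Assumed freshness window: commands older than this are treated as stale.
MAX_COMMAND_AGE_S = 0.1

@dataclass
class SteeringCommand:
    angle_deg: float   # requested steering angle
    issued_at: float   # monotonic timestamp when the planner issued the command

def is_fresh(cmd: SteeringCommand) -> bool:
    """Return True only if the command is recent enough to act on."""
    return (time.monotonic() - cmd.issued_at) <= MAX_COMMAND_AGE_S

def apply_steering(cmd: SteeringCommand) -> None:
    """Forward a command to the actuator only if it passes the freshness check."""
    if not is_fresh(cmd):
        # A minutes-old command, like the one in the April incident,
        # would be rejected here instead of being executed.
        raise ValueError("Stale steering command rejected")
    # ... forward the command to the steering actuator ...
```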
TuSimple’s report explained that the safety driver took control of the steering and was able to prevent a crash, but not before the truck’s left front tire and left front quarter panel came into contact with the concrete barrier. The contact resulted in a scuff to the left tire and damage to the radar unit extending from the left quarter panel, the report said.
“In short, this was a failed attempt to engage the system as a result of human error,” TuSimple said.
However, researchers at Carnegie Mellon University said blaming the entire accident on human error is misleading.
“Common safeguards would have prevented the crash had they been in place,” according to the researchers. “For example, a safety driver — a person who sits in the truck to backstop the artificial intelligence — should never be able to engage a self-driving system that isn’t properly functioning. The truck also shouldn’t respond to commands that are even a couple hundredths of a second old. And the system should never permit an autonomously-driven truck to turn so sharply while traveling at 65 mph.”
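The safeguards the researchers list could be sketched in code as well. The following is a hypothetical illustration, with all function names and limits assumed for the example: an engagement interlock that refuses to activate an uninitialized system, and a steering clamp that limits how sharply the truck can turn at highway speed.

```python
# Assumed limits for illustration only.
MAX_STEER_DEG_AT_SPEED = 3.0
HIGHWAY_SPEED_MPH = 65.0

class EngagementRefused(Exception):
    """Raised when the self-driving system may not be engaged."""

def engage_ads(computer_initialized: bool, self_check_passed: bool) -> None:
    """Refuse engagement unless the ADS is fully initialized and healthy."""
    if not (computer_initialized and self_check_passed):
        raise EngagementRefused("ADS not initialized; engagement blocked")
    # ... proceed with a normal, supervised engagement ...

def clamp_steering(angle_deg: float, speed_mph: float) -> float:
    """Limit steering authority at highway speed so the truck cannot veer sharply."""
    if speed_mph >= HIGHWAY_SPEED_MPH:
        return max(-MAX_STEER_DEG_AT_SPEED, min(MAX_STEER_DEG_AT_SPEED, angle_deg))
    return angle_deg
```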
In a recent blog post on its website, TuSimple said that in response to the incident, “we immediately grounded our entire autonomous fleet and launched an independent review to determine the cause of the incident. With learnings from this review in hand, we upgraded all of our systems with new automated system checks to prevent this kind of human error from ever happening again and we reported the incident to NHTSA and the Arizona Department of Transportation….
“Because of our self-reporting of the incident, FMCSA formally requested information, and we welcomed their team along with staff from NHTSA to visit our site in Tucson to discuss what occurred and the solutions we put in place to safeguard against human errors,” TuSimple officials said. “Currently, we are helping FMCSA and NHTSA with the review process.”
TuSimple noted in its blog that there are more than 500,000 crashes involving large trucks each year, according to NHTSA data.
“In comparison, in seven years and 7.2 million miles of autonomous vehicle testing, this is the first on-the-road incident for which we’ve been responsible. While our safety record is many times better than traditional manually-driven trucks, we take our responsibility to find and resolve all safety issues very seriously,” company officials said.