In the early morning hours of Saturday, February 18th, a Tesla Model S hit a parked fire truck on I-680 in California. Today, reports suggest that the NHTSA not only believes some form of autonomous driving software was active at the time of the crash, but that it has sent a special team to investigate further.

The crash killed the driver of the Tesla and injured the sole passenger. At the time, the fire truck was parked diagonally across the lane to protect other emergency workers. Four firefighters were buckled up inside the truck at the moment of impact and were uninjured.

At the time of the accident, authorities made it clear that the cause was unknown. They ruled out neither the possibility that Tesla's Autopilot or Full Self-Driving software was active nor the possibility that drugs or alcohol played a role. Today, numerous reports say that autonomous driving software was likely engaged.

More: Tesla Driver Dies After Crashing Into Stationary Firetruck On California Freeway

According to the Associated Press, investigators believe that the Tesla was using an automated driving system and that the NHTSA “dispatched a special crash investigation team to look into the Feb. 18 crash.” That action is in line with the NHTSA's broader investigation into Tesla's autonomous driving features as a whole. To date, at least 15 Teslas have crashed into emergency vehicles while autonomous driving software was engaged.


Tesla requires drivers to accept responsibility for safety whenever an autonomous feature is active. The problem is that not every driver does so, and humans in general are poor at passively monitoring automation. In addition, it remains possible that drugs or alcohol played a role here.

Notably, the NHTSA required Tesla to recall more than 350,000 vehicles in February over concerns about its Full Self-Driving feature. Specifically, it said that “the FSD Beta system may allow the vehicle to act unsafe.” Those concerns were centered on how the technology handles intersections, but the recall speaks to the broader scrutiny of Tesla's software.

At the same time, Tesla's self-reported statistics on Autopilot and Full Self-Driving suggest that the software is already safer than a human driver. In any case, we'll keep an eye on the developing investigation and provide an update as we learn more.
