Tesla has quietly unredacted all 17 of its autonomous driving crash narratives filed with NHTSA, revealing for the first time what actually happened in each incident. The automaker had been the only ADS operator to fully redact its crash reports, marking every single narrative as "confidential business information." The data shows what we always suspected: most of Tesla's crashes were not the fault of the autonomous system. But there are some genuinely concerning incidents buried in there.

Tesla was alone in hiding crash data

For the better part of a year, Tesla was the only autonomous vehicle operator filing crash reports with NHTSA under the agency's Standing General Order 2021-01 that completely redacted its crash narratives. Every single one of Tesla's reports contained the same boilerplate text: "[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]." Meanwhile, Waymo, Zoox, Avride, May Mobility, and every other ADS operator provided detailed, multi-paragraph accounts of what happened in each crash.

Tesla even admitted it would "suffer financial harm" if its crash data became public, arguing that competitors could use the information to assess Tesla's progress with autonomous systems.

We criticized that approach repeatedly. The lack of transparency made it impossible to determine whether Tesla's crashes were minor fender-benders caused by other drivers or serious failures of the autonomous system. It purposefully muddied the waters, letting critics and supporters alike project whatever narrative they wanted onto the data.

Now, in the latest NHTSA data dump, Tesla has gone back and filed updated versions of all its older reports, specifically removing the confidential business information designation and making the narratives publicly available.

What the crash data actually shows

The 17 unique Tesla ADS incidents span from July 2025 through March 2026, the entire period of the company's "Robotaxi" testing in Austin.
Every single one involves a 2026 Model Y with the ADS engaged and a safety monitor present. The injury breakdown is relatively mild: 13 resulted in property damage only, two had no injuries reported, one involved a minor injury without hospitalization, and one, the most serious, involved a minor injury requiring hospitalization.

Here's where it gets interesting. As suspected based on reports from other companies, a significant portion of the crashes were clearly not Tesla's fault:

- Several incidents involved the Tesla ADS being rear-ended while completely stopped, at red lights, at stop signs, and in traffic.
- In one case, a passenger vehicle simply drove into the back of the stopped Tesla at an intersection. In another, a truck rear-ended it at a stop sign.
- A pedicab clipped the Tesla's mirror while it was stopped at a red light.
- A city bus sideswiped it while making a turn.
- An SUV crept forward into the Tesla's rear at a red light.

This pattern mirrors what we see in Waymo's crash data: autonomous vehicles tend to get rear-ended by human drivers who aren't paying attention or don't expect the AV to stop when it does.

The concerning incidents

Not all crashes were the fault of other drivers. Several reveal genuine limitations of Tesla's autonomous system, and of its teleoperator backup.

Two crashes happened when a remote teleoperator took over the vehicle. In July 2025, a safety monitor requested support because the ADS wasn't proceeding forward. The teleoperator took control and drove the car up a curb and into a metal fence at 8 mph, resulting in the only non-hospitalization injury reported. In January 2026, a similar scenario played out: the safety monitor requested navigational help, a teleoperator took over, and the teleoperator drove the car into a construction barricade at 9 mph.

The ADS itself also had issues with spatial awareness. In one September 2025 incident, the system drove into a metal chain while entering a parking lot after making an unprotected left turn.
In October, the ADS clipped a dump trailer's gooseneck hitch that was sticking into the street. In January, it backed into a wooden electrical pole, and in another January incident, it hit a curb while reversing into a parking space.

The most serious crash by injury involved the Tesla creeping forward at 2 mph in a right-turn slip lane when it was rear-ended by an SUV. The safety monitor later reported pain and sought medical evaluation. While technically the other driver's fault, the incident raises a question we've asked before: does the ADS's overly cautious behavior contribute to getting hit?

Electrek's Take

We always said the context would matter. When Tesla was redacting everything, the crash count alone painted an incomplete picture. Despite being painted as "haters" by Tesla fans, we were always clear that we suspected many of the crashes were likely not Tesla's fault, and that turned out to be correct. A good number of these incidents are rear-endings and sideswipes by inattentive human drivers, which is exactly what Waymo reports in the majority of its incidents, too.

But Tesla's decision to hide that context for nearly a year was a self-inflicted wound. Had this information been public from the start, the narrative around the "Robotaxi" program's safety record would have been significantly different. Instead, Tesla let the crash count speak for itself while every competitor provided full transparency, and then argued that disclosing the data would cause "financial harm."

That said, the teleoperator incidents are genuinely concerning. In two separate cases, a human operator took over because the ADS was stuck and then crashed the vehicle into an object. If Tesla can't trust its ADS to navigate tricky situations, and its teleoperators are crashing the cars when they intervene, that's a real problem, especially as the company pushes to remove safety monitors entirely.
The spatial awareness issues, hitting chains, hitches, poles, and curbs, also suggest the system still struggles to detect smaller or unusual obstacles, particularly while reversing. These are the kinds of edge cases that matter when you're trying to scale an unsupervised robotaxi fleet. I think it's fair to suspect that ultrasonic sensors and radar could have helped in those situations.