Tesla reveals details of 17 ‘Robotaxi’ crashes: New safety concerns emerge

For months, Tesla stood out among autonomous vehicle companies for one reason: it kept the details of its self-driving crash reports hidden from public view. While other operators submitted detailed incident summaries to the US regulator, Tesla blacked out every narrative entirely, labeling them as confidential business information.

That has now changed, and the newly available reports suggest a more nuanced picture than many expected. Most of the 17 incidents involving Tesla’s autonomous driving system were caused by other road users, rather than failures of the software itself. Still, several reports raise new questions, including a handful of incidents where the system’s decisions or reactions appear less straightforward.

Companies like Waymo, Zoox, Avride, and May Mobility routinely submitted multi-paragraph descriptions to the National Highway Traffic Safety Administration. Tesla alone replaced every account with the same redacted placeholder, making it the only operator to withhold its narratives under the agency’s crash reporting order.

Data shows most Tesla Robotaxi incidents were caused by other drivers

The release provides the first detailed look at all 17 incidents recorded during Tesla’s Robotaxi testing program in Austin between July 2025 and March 2026. All incidents involved 2026 Tesla Model Y vehicles operating with the autonomous driving system engaged and a human safety monitor onboard. Most were relatively minor: 13 caused property damage only, two involved no injuries, one resulted in a minor injury without hospitalization, and one involved a minor injury requiring hospital treatment.

The reports show a pattern already seen across the industry, Electrek writes. Many crashes happened while Tesla vehicles were stationary, waiting at traffic lights, stop signs, or in slow traffic.
In multiple cases, other road users struck the autonomous vehicle from behind or clipped it while passing. Incidents included a truck rear-ending a stopped Tesla, a city bus sideswiping it during a turn, and even a pedicab hitting one of its mirrors.

The trend closely matches data previously disclosed by Waymo, whose self-driving vehicles are often hit by inattentive human drivers who misjudge or fail to anticipate the cars’ cautious stopping behavior.

The newly released reports also point to incidents where Tesla’s system itself appears to have contributed to the crash. A small number highlight limitations not only in the autonomous software but also in the remote teleoperation system used as a backup during testing.

Two incidents occurred after a human safety monitor requested remote assistance. In one case from July 2025, a teleoperator took control after the vehicle failed to move forward and then drove it onto a curb and into a metal fence. Another, in January 2026, ended similarly: remote control was activated, and the vehicle struck a construction barricade. Both raise questions about how smoothly Tesla’s remote intervention system handles edge cases during real-world testing.

Robotaxi struggles with parking and obstacle detection

The records also show several cases where Tesla’s autonomous system struggled with nearby obstacles during low-speed maneuvers. Rather than collisions caused by surrounding traffic, these incidents involved the vehicle misjudging objects directly in its path.

In one September 2025 event, the system struck a metal chain while entering a parking lot after completing an unprotected left turn. Another report, from October 2025, shows the vehicle clipping part of a dump trailer that extended into the roadway.
Two incidents in January 2026 involved reversing errors, including backing into a wooden utility pole and hitting a curb while parking.

https://www.youtube.com/watch?v=Y3aTCisvxYs&t=1s

While minor, the crashes point to challenges in object recognition and spatial awareness, particularly in complex parking lots or tight urban environments. The incidents involving contact with chains, trailer hitches, poles, and curbs indicate recurring limitations in detecting smaller or irregular obstacles, especially during reversing maneuvers. These events also highlight ongoing challenges in edge-case handling that are relevant to the safe scaling of an unsupervised robotaxi fleet.