Generally speaking, there’s no field of pop-culture technological advancement I would more like to see perfected than that of the commercially available driverless vehicle. Call it an abdication of my stereotypically gendered duty as a red-blooded American male, but I don’t particularly care for the task of driving places, particularly on long hauls, when I could be doing, y’know, literally anything else. Calling a driverless robotaxi to transport me to happy hour, without dealing with the myriad potential annoyances and outright hazards of interacting with a strange driver? That’s the dream right there, and I’m sure I’m not the only one. But also: I’d prefer not to be swept away in a flood, or become an accessory to running over a kid leaving a school bus. The question then becomes: What risk of those things happening is acceptably small enough that I’d actually call the robotaxi? And sadly, we’re definitely not there yet, and it’s hard to say when we ever will be.

Not that I’ve had the opportunity, living on the East Coast. The domain of the commercially active robotaxi, as operated by companies like Waymo (Alphabet Inc.), Zoox (Amazon), Cruise (General Motors) and Tesla’s Cybercab, is mostly in the Southwest and on the West Coast, in cities like San Francisco, Phoenix, Las Vegas, Austin, Seattle and Houston, where robotaxis are increasingly in regular use. Almost all are EVs, which only makes sense for reasons of both climate and practicality, particularly in the region of the country where gasoline is most expensive. In those cities, we’ve seen both the good and the bad of what this technology can offer, but it will always be difficult to look past the most scandalizing of the headlines.
What it boils down to is this: You can teach an autonomous vehicle to operate near perfectly within a normal range of driving scenarios, but it will continue to struggle when presented with more specialized scenarios or extenuating circumstances.

One of those circumstances recently turned out to be the ability of Waymo-operated robotaxis to successfully navigate roads when said roads were underwater. In two separate incidents in April, Waymo driverless cars reportedly encountered flooded areas of San Antonio, Texas, and rather than, say, coming to a stop and finding an alternate route, the cars instead decided to attempt to ford the river like they were playing Oregon Trail. This worked about as well for the vehicles as it does for your typical team of oxen. One of the Waymo vehicles became stuck in the floodwaters, while the San Antonio Express-News reported that another was “swept away” – to where, we cannot say. Thankfully, neither of those vehicles was occupied by human beings at the time, but they quite obviously could have been, which is a problem when your robotaxi’s sensor system can’t decide what to do about a large amount of swiftly moving water blocking what is supposed to be a road. Waymo subsequently shut down service in San Antonio entirely, saying, “As a result of the flooding in San Antonio, we temporarily paused our local operations and are continuing to monitor road conditions.” A similar event last fall in Phoenix likewise resulted in cars with actual passengers getting stuck in floodwaters.
Not great, Bob!

Now, Waymo has taken the next step of conducting a recall applying to 3,791 vehicles in its fleet, updating their software in an attempt to address this particular issue, which the National Highway Traffic Safety Administration (NHTSA) described as “slowing, but not stopping, when encountering flooded roads they could not traverse.” The federal agency said Waymo was still engaged in “developing the final remedy for this recall.” Previous recalls, meanwhile, have been triggered by other incidents such as low-speed crashes into stationary objects like parking gates or telephone poles, or, in a particularly unnerving case, by Waymo robotaxis passing stopped school buses and endangering children despite the deployed “stop” signs on the side of each bus.

Nor can promises of software updates and fixes simply be blindly trusted: In the case of the school bus incidents, the city of Austin’s independent school district has claimed that even after the problem was supposedly addressed and fixed, school buses have continued to record Waymo vehicles illegally passing them. The same thing has likewise been reported in Atlanta.

This all matters because, in agreeing to make use of a robotaxi or driverless car in general, you are obviously surrendering your agency and choosing to trust a machine and a piece of software not only to keep you safe, but to keep others around you safe. You are handing over trust to a thing that does not, on a fundamental level, understand what a human life is, because it’s not conscious. You can teach that car to “preserve itself,” and the people inside by extension, so that it hopefully won’t behave in an outwardly suicidal way. And yes, there is certainly data to suggest that robotaxis, per mile driven, end up in fewer serious crashes than human drivers do.
But you can’t teach that car to perceive and understand the kind of extenuating circumstances that would radically alter how a human being in the same situation behaves. Worse, you don’t realize that these problems exist until some incident comes along to enlighten you. How do we discover vulnerabilities in robotaxis? We find them when catastrophic failure happens, in true venture capitalist style.

Meanwhile, the companies operating such services do their best to handwave their responsibility for discovering these things before they’re in scenarios where actual human lives are on the line. Alphabet Inc. can pat itself on the back for initiating a recall of nearly 4,000 vehicles in order to teach them not to drive into floodwaters, but that doesn’t change the fact that these vehicles were already on the road, accepting rides for the last few years, with those same vulnerabilities built right into them, waiting for the circumstances to arrive that could have been deadly. We have no choice but to ask ourselves how many other, undiscovered vulnerabilities of this sort may exist in the same robotaxis.

How is the consumer meant to approach this kind of news about the Waymo recall? Should they consider whether driving conditions are optimal before they call one instead of a rideshare car with a human driver? Should they have to consider whether they’ll be driving near any school loading zones? Are any of these reasonable burdens to put on the consumer? I rather doubt it.

So for now, my dream of being ferried to the local cocktail bar by a silent, spotless future chariot will have to wait. Try as I might, even the thought of being able to avoid idle chitchat with a driver can’t outweigh the comforting certainty that my human driver probably won’t attempt to drive us through a flash flood. Maybe that could be Lyft’s new slogan?