
Two Self-Driving Waymo Taxis Get Confused By A Pickup Being Towed Backwards, Crash Into It


The idea of a robotaxi is quite appealing. It’s a car that takes you where you want to go, and neither you nor anybody else has to worry about driving. The reality of robotaxis is altogether different. Many of us are concerned about systems that are incapable of dealing with the whole gamut of often-chaotic road conditions. Waymo’s recent escapades certainly don’t help in that regard.

In a new entry on the company’s blog titled “Voluntary recall of our previous software,” Waymo Chief Safety Officer Mauricio Peña explains a recall report the company filed with the National Highway Traffic Safety Administration (NHTSA). The filing was made in response to a hilarious and embarrassing incident on December 11, 2023, involving two of Waymo’s self-driving robotaxis.


According to Waymo, one of its robotaxis was operating in the city of Phoenix when it came across a pickup truck facing backwards on the road. The company alleges the vehicle was being “improperly towed” and that “the pickup truck was persistently angled across a center turn lane and a traffic lane.” After the Waymo robotaxi hit the pickup under tow, the tow truck driver didn’t stop, and continued down the road. Mere minutes later, a second Waymo vehicle hit the same pickup truck under tow, at which point the tow truck driver elected to stop. Here’s Waymo’s full description of events:

On December 11, 2023 in Phoenix, a Waymo vehicle made contact with a backwards-facing pickup truck being improperly towed ahead of the Waymo vehicle such that the pickup truck was persistently angled across a center turn lane and a traffic lane. Following contact, the tow truck and towed pickup truck did not pull over or stop traveling, and a few minutes later another Waymo vehicle made contact with the same pickup truck while it was being towed in the same manner. Neither Waymo vehicle was transporting riders at the time, and this unusual scenario resulted in no injuries and minor vehicle damage.

Just imagine, you’re driving your truck with a pickup in tow behind you, and you feel a little something from behind. You look in the mirror and spot a Waymo vehicle, but assume you maybe just imagined the jolt. You get back to driving down the road, only for another Waymo to show up and again hit your consist from behind. You’d start to think these robot taxis were out to get you or something.

As covered by TechCrunch, both crashes caused only minor damage to bumpers and a sensor. The crashes were reported to police the same day, and to the NHTSA on December 15. There were no reported injuries as a result of the crashes, and neither Waymo vehicle was carrying passengers at the time. Waymo put the problem down to the strange towing configuration, which confused its autonomous vehicle software. It apparently could not accurately understand or predict the motion of the tow truck or the pickup behind it, which led to the crashes.


Here’s the company’s explanation of why these Waymos crashed into the truck, per the aforementioned blog entry:

Given our commitment to safety, our team went to work immediately to understand what happened. We determined that due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle. After developing, rigorously testing, and validating a fix, on December 20, 2023 we began deploying a software update to our fleet to address this issue (more here on how we rapidly and regularly enhance the Waymo Driver’s capabilities through software updates).
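To make that “orientation mismatch” a little more concrete, here’s a minimal, hypothetical sketch (purely illustrative, and not Waymo’s actual code) of how a motion predictor that assumes a vehicle travels in the direction its body points can get a backwards-towed pickup badly wrong:

```python
import math

def predict_position(x, y, heading_rad, speed_mps, dt):
    # Naive constant-velocity prediction that assumes the vehicle
    # moves in the direction its body is pointing (its heading).
    return (x + speed_mps * dt * math.cos(heading_rad),
            y + speed_mps * dt * math.sin(heading_rad))

# A pickup hitched nose-out and towed east at 10 m/s: it actually
# travels east (heading 0), but its body points west (heading pi).
actual = (10 * 3, 0.0)                              # roughly (30 m, 0 m) after 3 s
predicted = predict_position(0, 0, math.pi, 10, 3)  # roughly (-30 m, 0 m)

print(f"predicted: {predicted}")
print(f"actual:    {actual}")
# A roughly 60 m gap between prediction and reality after three seconds;
# a planner leaning on this prediction would badly misjudge whether the
# lane ahead is about to be clear.
```

Real prediction stacks are far more sophisticated than this, of course, but the failure mode is the same in spirit: a strong prior that vehicles travel nose-first breaks down when the nose is pointing the wrong way.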

[Editor’s Note: The fundamental problem here is one I’ve discussed before: automated vehicles have no idea what they’re actually doing. They’re computers, following sets of instructions and reacting to sensor/camera inputs, but they, of course, lack any sort of consciousness or understanding of what they’re doing. And that sort of general understanding of the world around you is actually quite important to the task of driving, and it’s something that we humans do without even thinking about it. 

Looking at the description of events, it seems that it’s just a case of a truck being towed backwards. No human driver would have been confused by this; anyone capable of operating a car would understand that they were looking at a car being towed, and would understand how that affected the motion of the car. This isn’t because humans were calculating the physics of towed vehicle masses or whatever, it’s because we’ve all seen towed cars before and get how that works. 

We can call them “edge cases,” but really they’re just cases. Things like this happen every single day on the roads, and humans deal with them wonderfully, because we have an understanding of the world and what we’re doing in it. Can AVs be programmed to have some sort of conceptual model of the world that will allow them to make sense of potentially confusing situations like this one, with the backwards-towed truck? I’m not sure. But it’s not a concept we can ignore. – JT]

Waymo implemented a software fix for the problem, rolling it out on December 20 last year. The full fleet had received the update by January 12.  “Our ride-hailing service is not and has not been interrupted by this update,” reads Waymo’s blog entry.


Keeping the service running is an interesting decision in the context of the accident. On the one hand, nobody was hurt in the twin incidents, and damage was minor. Plus, if Waymo’s account is accurate, it was an oddball situation that the company might not reasonably expect to see again any time soon. At the same time, when two cars crash in the same way just minutes apart, you might consider shutting things down until a fix is out.

Based on conversations with the NHTSA, Waymo decided to file a voluntary recall report over the matter. However, this terminology is somewhat confusing, as Waymo didn’t really recall anything. It simply updated the software on its own vehicles over a period of a few weeks. Instead, the report really serves as a public notification that Waymo made a software change in response to the incident.

Waymo’s autonomous vehicles use cameras, radars, and lidars to understand the world. Even so, when dealing with something unfamiliar or unusual, they can struggle to react appropriately.

It’s true that Waymo hasn’t seen quite as much bad press as Cruise. The latter GM-backed company has had to contend with one of its autonomous vehicles dragging a stricken pedestrian along the road for 20 feet. But Waymo has faced its own woes. One of its vehicles recently hit a cyclist, prompting an investigation by California regulators. Worse, the company saw a mob go wild and destroy one of its vehicles in San Francisco’s Chinatown just a few days ago.

But ultimately, all these robotaxi operations will need to sharpen up their act. Crashes like this one are the sort of thing that even a poor human driver can avoid. An inexperienced driver is generally smart enough not to drive into a pickup truck dangling from a tow hook, nor would they drag a pedestrian along the road after running them over.

For these companies, the hard part isn’t mastering the regular driving task. That’s challenging, sure. The real problem is the strange edge cases that humans deal with every day. You can’t expect every road hazard to come with a big flashing orange light or a stop sign, but you have to be able to deal with it anyway.


Waymo has actually published papers on this very topic, including one entitled “Measuring Surprise In The Wild.” It discusses methods for using machine learning models to “detect surprising human behavior in complex, dynamic environments like road traffic.” An ability to appropriately handle novel situations in traffic would be a major boon to any autonomous driving system. The only alternative is for companies like Waymo to imagine every conceivable situation in advance and program in appropriate countermeasures for each. Obviously, a more general ability to handle surprise is more desirable.
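For a rough sense of what measuring surprise can mean in practice, here’s a toy sketch (my own illustration, not the metric from Waymo’s paper): score how unlikely the observed behavior was under the system’s own forecast, and flag high scores for extra caution or later review.

```python
import numpy as np

def surprise_score(predicted_mean, predicted_cov, observed):
    # Negative log-likelihood of the observed position under a Gaussian
    # forecast: low means the world did what we expected, high means surprise.
    d = np.asarray(observed, dtype=float) - np.asarray(predicted_mean, dtype=float)
    cov = np.asarray(predicted_cov, dtype=float)
    return 0.5 * (d @ np.linalg.inv(cov) @ d
                  + np.log(np.linalg.det(cov))
                  + d.size * np.log(2 * np.pi))

# Forecast: the towed pickup should be about 30 m ahead, give or take a
# couple of meters. Compare an expected observation with a surprising one.
cov = [[4.0, 0.0], [0.0, 4.0]]
print(surprise_score([30, 0], cov, [31, 1]))   # ~3.5, expected behavior
print(surprise_score([30, 0], cov, [-5, 2]))   # ~157, very surprising
```

The hard part isn’t scoring surprise after the fact; it’s building forecasts good enough for the score to mean something, and deciding what the car should do in the second or two after the score spikes.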

It’s clear from this incident that Waymo isn’t quite there yet; it’s just learned from one more strange situation and stuck that in the training files. Here’s hoping the robotaxis don’t start ganging up on broken BMWs or rusty Jeeps, lest The Autopian staff shortly end up in the firing line.

Image credits: Waymo

 

99 Comments
Dest
9 months ago

All of this self driving crap is a colossal waste of money.

Tsorel
9 months ago

So I probably have close to a half million miles of accident-free driving/motorcycle riding. Waymo has 20,000,000 miles of real world experience and over 20,000,000,000 simulated miles of experience. How many miles does it take a computer to learn not to run into something you have seen for the first time? How about just don’t run into it? It doesn’t need to “anticipate” anything. There is something there, don’t run into it. This is why I’m not in programming.

Amy Andersen
9 months ago
Reply to Tsorel

“Just don’t run into it” is pretty easy for a stationary object, not so much for a moving object. To avoid hitting a moving object you need to predict where it’s going to be in the next few seconds; you do that all the time without even thinking about it but an AV needs to be programmed to do so. The direction a vehicle is pointing is an important cue for making that sort of prediction, which is probably why the car got confused; it wasn’t programmed to ignore that cue in this unusual situation the way a human could.

PL71 Enthusiast
9 months ago

Would love to see the video. It sounds like they’re trying to say the truck was not behaving stably and ended up in their lane. I can totally see a human driver getting in an accident like that, but not 2 in a short span of time.

Eva
9 months ago

Glad I don’t live in a city that forces these beta tests on the public, but I fear it’s only a matter of time.

Studdley
9 months ago

Mein Gott, this country will do anything except build good public transportation infrastructure. Auto taxis exist, they’re called buses and subways. If you want to drive yourself, ride a bike.

Knowonelse
9 months ago

When my ’64 F100 crewcab needed to be towed, it either had to be on a flatbed or towed backwards due to having an auto trans. And when towed backwards, sometimes the steering wheel is tied off, sometimes not. In either case, the front end doesn’t track straight by default, so it wanders side-to-side randomly. Something a human driver can figure out after a minute or so. Not so for automated systems.

Zelda Bumperthumper
9 months ago

Waymo Developer: All right. While we’re still in beta, I want you to go back out on that road and hit the tow truck.

Robotaxi: Hit the tow truck?

Waymo Developer: Hit the tow truck.

Robotaxi: What for?

Waymo Developer: Because you’re gonna hit every other goddamned thing out there, I want you to be perfect.

HowintheNameofZeus
9 months ago

Sorry, you can’t bring the autonomous car in to be fixed right now. We’re eating ice cream.

Stef Schrader
9 months ago

{ slow clap }

StillNotATony
9 months ago

This is two movie references in one day. Did DT get either of them?
