In December 2024, a video started making the rounds that looked like a deleted scene from a low-budget sci-fi movie. It happened in Los Angeles, at the corner of La Cienega and Melrose. A white Waymo Jaguar, one of those self-driving SUVs with the spinning sensors on top, made a right turn and T-boned a four-wheeled delivery robot.
People online immediately called it "robot-on-robot violence." Honestly, it’s one of those clips that’s both funny and deeply unsettling.
You’ve probably seen headlines along the lines of "Waymo hits Postmates car," or about various other delivery mishaps. But this specific LA incident wasn't a car-on-car crash. It was a Waymo versus a Serve Robotics bot. Serve is a spin-off of Uber (which acquired Postmates), and its bots are those cute little coolers on wheels that deliver burritos to your door. The collision raises a massive question: how does a multi-billion-dollar AI "driver" fail to see a giant white box right in its path?
The Glitch in the Matrix
Basically, the delivery robot was trying to cross the street. It reportedly ran a red light—yep, even robots break the law sometimes—and was struggling to hop the curb on the other side. It hit the concrete lip, failed to climb it, and then backed up slightly into the crosswalk.
That's when the Waymo turned.
It didn't "see" a human, and Waymo actually admitted as much to TechCrunch. The car’s software identified the delivery bot as an "inanimate object." Because it didn't register as a "Vulnerable Road User" (like a pedestrian or a cyclist), the Waymo's safety buffers were different. It assumed the object would keep moving out of the way. When the delivery bot backed up instead of climbing the curb, the Waymo’s "prediction stack" basically choked. It slammed on the brakes, but too late to avoid a 4-mph bump.
It’s kinda fascinating. If that had been a toddler or a dog, the Waymo would have likely given it a ten-foot berth. But because it saw a "thing," it tried to squeeze past.
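To make that concrete, here’s a deliberately oversimplified sketch of the idea, not Waymo's actual code or API. The class names, buffer distances, and logic are all hypothetical; the point is just how a class label can change the clearance a planner allows and how a bad motion prediction turns a "squeeze past" into a collision.

```python
# Hypothetical illustration -- NOT Waymo's real software.
# Shows how an object's perceived class can change the planner's buffer.
from dataclasses import dataclass

# Minimum lateral clearance in meters, keyed by perceived class (made-up values).
# A "Vulnerable Road User" gets a wide berth; a generic object gets a tight squeeze.
CLEARANCE_M = {
    "pedestrian": 3.0,        # roughly the "ten-foot berth" mentioned above
    "cyclist": 3.0,
    "vehicle": 1.5,
    "inanimate_object": 0.5,  # the bucket the delivery bot landed in
}

@dataclass
class TrackedObject:
    label: str
    distance_m: float   # lateral distance from the car's planned path
    moving_away: bool   # prediction: is it clearing the path?

def can_proceed(obj: TrackedObject) -> bool:
    """Proceed only if the object is outside its class's buffer,
    or is predicted to keep moving out of the way."""
    buffer_m = CLEARANCE_M.get(obj.label, 3.0)  # unknown class? be cautious
    return obj.distance_m > buffer_m or obj.moving_away

# A delivery bot misread as a static object that will clear the path:
bot = TrackedObject("inanimate_object", distance_m=0.8, moving_away=True)
print(can_proceed(bot))  # True -- the planner squeezes past

# The same bot, had the prediction caught the reversal in time:
stuck_bot = TrackedObject("inanimate_object", distance_m=0.4, moving_away=False)
print(can_proceed(stuck_bot))  # False -- brake
```

Notice that the exact same physical object gets two different answers depending on the label and the motion prediction, which is the heart of what went wrong at La Cienega and Melrose.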
Why the "Waymo Hits Postmates Car" Narrative Persists
There is a lot of confusion between this LA "robot-on-robot" incident and older accidents. Back in 2019, there were rumors of a Waymo hitting a Postmates Chevy Spark in Phoenix. The truth is usually more boring: most Waymo "accidents" are actually humans hitting the Waymos.
For instance, in Chandler, Arizona, a Honda swerved into a Waymo’s lane to avoid another car. The Waymo got smashed. The headlines still read "Self-Driving Car Crash," even though the AI was just sitting there minding its own business.
But the LA delivery bot incident is different. That one was on Waymo. It was a failure of "prediction," not just "perception." The car saw the bot. It just didn't think the bot would do something as "human" as getting stuck and reversing.
The Stats Nobody Tells You
Waymo likes to brag about its safety record, and to be fair, it has a point. According to its 2025 safety reports, its vehicles are involved in about 80% fewer injury-causing crashes than human drivers.
- Total Miles: Over 127 million miles driven.
- Fatalities: Only two involving their vehicles (neither found to be the Waymo's fault).
- The Catch: Experts like Henry Liu from the University of Michigan point out that most AV accidents happen in "routine" situations. Humans rarely crash in an empty intersection at 2:00 AM.
When a Waymo hits a delivery robot, it’s a "long-tail" event. It’s something developers didn't specifically code for. Is a robot a pedestrian? A vehicle? A piece of trash? If the AI gets the answer wrong, you get a bumper-to-bumper standoff in West Hollywood.
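One defensive answer to that "what even is this thing?" problem is to err on the side of caution when a classifier is unsure. The sketch below is purely illustrative (the class names, probabilities, and threshold are invented, and this is not how any particular AV stack works): instead of acting on the single most likely label, it inherits the most cautious behavior among all plausible labels.

```python
# Hypothetical sketch of a conservative fallback for ambiguous objects.
# Idea: when perception can't decide what something is, treat it like the
# most vulnerable thing it could plausibly be, not the most likely thing.

# Caution rank: higher means treat with more care (made-up scale).
CAUTION = {"debris": 0, "vehicle": 1, "delivery_robot": 2, "pedestrian": 3}

def effective_class(class_probs: dict, threshold: float = 0.2) -> str:
    """Among classes with non-trivial probability, pick the most cautious one."""
    plausible = [c for c, p in class_probs.items() if p >= threshold]
    return max(plausible, key=lambda c: CAUTION[c])

# A Serve-style bot that looks 60% like debris, 30% like a pedestrian:
print(effective_class({"debris": 0.6, "pedestrian": 0.3, "vehicle": 0.1}))
# -> "pedestrian": err on the side of the wide berth
```

The trade-off is real: a fleet that treats every ambiguous trash bag like a pedestrian brakes constantly, which is exactly the kind of tuning knob that makes long-tail cases so hard.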
What This Means for Your Commute
If you're living in Phoenix, SF, or LA, you're sharing the road with these things every day. The biggest takeaway from the LA crash is that these cars are programmed to be "polite" to humans but "efficient" around objects.
If you’re walking, you’re safe. If you’re a delivery robot... well, maybe keep an eye out.
The real-world implication is that we’re entering a "liability gray zone." If a robot hits a robot, and there’s no human inside either one, who pays the insurance claim? In the LA case, the Serve robot was actually being remotely supervised by a human in a different country. The Waymo was fully autonomous. It’s a legal mess that hasn't been fully untangled yet.
How to Stay Safe Around Robotaxis
You don't need to be afraid, but you should be predictable.
The Waymo "brain" loves patterns. It hates it when you do something weird, like a "California stop" or a sudden U-turn.
- Don't assume they see "everything." They see the object, but they might misjudge your intention if you're doing something non-standard (like the delivery bot struggling with a curb).
- Watch the sensors. If the "domes" on top aren't spinning, the car is likely offline or being manually towed.
- Give them space. If you see a Waymo hesitant at an intersection, just wait. It’s processing a million data points. Honking won't make the AI move faster; it'll just make the remote operator's headset ring.
At the end of the day, the "Waymo hits Postmates car" story is a lesson in edge cases. As more automated delivery services hit the streets, these machines are going to have to learn how to talk to each other. Until then, expect a few more awkward bumps in the night.
If you're interested in the tech behind this, keep a close eye on NHTSA's "Standing General Order" reports. They’re the only place where you can see the raw, unpolished data on how often these "minor" bot-on-bot scrapes actually happen. Most of them never make the news because no one gets hurt, but they're the best indicator of where the software still needs work.