They’re super conservative. I rode in one just once. There was an ambulance parked about 30 feet down a side street with its lights on while paramedics helped someone. The car wouldn’t drive forward through the intersection; it just detected the lights and froze. I had to get out and walk. If we all drove that conservatively we’d also have fewer accidents, and we’d congest the city to undrivability.
Back in February, I took a Waymo for the first time and was at first amazed. But then, in the middle of an empty four-lane road, it abruptly slammed the brakes, twice. There was literally nothing in the road: no cars, and because it was raining, no pedestrians in sight.
If I had been holding a drink, it would have spelled disaster.
After the second abrupt stop, I was bracing for more for the remainder of the ride, even though the car generally goes quite slow most of the time. It also had a strange habit of drifting between lanes through intersections and using the turn indicators like it had no idea what it was doing; it kept alternating from left to right.
Honestly it felt like being in the car with a first time driver.
Maybe the reason they crash less is that everyone around them has to be extremely careful with these cars. Just like in my country, where we put a big L on the rear of the car for first-year drivers.
How long ago was that? Last year I took a couple near Phoenix and they did great, lights or no. The hardest part was dropping me off at the front of a hotel, as people were in and out and cars were everywhere. Still didn’t have issues, just slowed down to 3mph when it had 15 years left or so
just slowed down to 3mph when it had 15 years left or so
Damn, spending 15 years in a car going 3mph sounds terrible.
Haha, yeah I didn’t check that, was eating. 15 yards. I’m actually still sitting there.
Because they are driving under near-ideal conditions, in areas that are completely mapped out, guided away from roadworks, and kept clear of “confusing” crossings and other traffic situations, like unmarked roads, that humans deal with routinely without problem.
And in a situation they can’t handle, they just stop, call, and wait for a human driver to get them going again, regardless of whether they are blocking traffic. I’m not blaming Waymo for doing it as safely as they can, that’s great IMO.
But don’t make it sound like they drive better than humans yet. There is still some way to go. What’s really obnoxious is that Elon Musk claimed this would be 100% ready by 2017: full self driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
You’re not wrong, but arguably that doesn’t invalidate the point: they do drive better than humans, because they’re so much better at judging their own limitations.
If human drivers refused to enter dangerous intersections, stopped every time things started to look dangerous, and handed off to a specialist to handle problems, driving might not produce the mountain of corpses it does today.
That said, you’re of course correct that they still have a long way to go in technical driving ability and handling of adverse conditions, but it’s interesting to consider that simple policy effectively enforced is enough to cancel out all the advantages that human drivers currently still have.
You are completely ignoring the “under ideal circumstances” part.
They can’t drive at night AFAIK, they can’t drive outside the area that is meticulously mapped out.
And even then, they often require human intervention. If you asked a professional driver to do the exact same thing, I’m pretty sure that driver would have a way better accident record than average humans too.
Seems to me you are missing the point I tried to make, and drawing a false conclusion by comparing apples to oranges.
Waymo can absolutely drive at night, I’ve seen them do it. They rely heavily on LIDAR, so the time of day makes no difference to them.
And apparently they only disengage and need human assistance every 17,000 miles, on average. Contrast that to something like Tesla’s “Full Self Driving” (ignoring the controversy over whether it counts or not), where the most generous numbers I could find for it are a disengagement every 71 city miles, on average, or every 245 city miles for a “critical disengagement.”
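To put those figures side by side, here’s the rough arithmetic, using only the numbers quoted above (keeping in mind that “disengagement” isn’t defined identically across companies, so this is an order-of-magnitude comparison, not a rigorous one):

```python
# Order-of-magnitude comparison of the disengagement figures quoted above.
# The definitions of "disengagement" differ between companies, so treat
# these ratios as rough, not rigorous.
waymo_miles_per_disengagement = 17_000
tesla_city_miles_per_disengagement = 71      # most generous figure found
tesla_city_miles_per_critical = 245          # "critical disengagement"

# Waymo goes roughly 240x farther between disengagements...
print(waymo_miles_per_disengagement / tesla_city_miles_per_disengagement)

# ...and roughly 70x farther than Tesla's critical-disengagement interval.
print(waymo_miles_per_disengagement / tesla_city_miles_per_critical)
```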
You are correct in that Waymo is heavily geofenced, and that’s pretty annoying sometimes. I tried to ride one in Phoenix last year, but couldn’t get it to pick me up from the park I was visiting because I was just on the edge of their area. I suspect they would likely do fine if they went outside of their zones, but they really want to make sure they’re going to be successful so they’re deliberately slow-rolling where the service is available.
Waymo can absolutely drive at night
True, I just looked it up; my information was outdated.
I specifically didn’t ignore that. My entire point was that a driver who refuses to drive under anything except “ideal circumstances” is still a safer driver.
I am aware that if we banned driving at night to get the same benefit for everyone, it wouldn’t go very well, but that doesn’t really change the safety, only the practicality.
driving might not produce the mountain of corpses it does today.
And people wouldn’t be able to drive anywhere. Which could very well be a good thing, but still
True enough, it would not be a wise economic or political move
I think “near ideal conditions” is a huge exaggeration. The situations Waymo avoids are a small fraction of the total mileage driven by Waymo vehicles or the humans they’re being compared with. It’s like you’re saying a football team’s stats are grossly wrong if they don’t include punt returns.
I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
RoboTaxis will also have to “navigate” the Fashla hate. Not many will be eager to risk their lives in them.
Considering the sort of driving issues and code violations I see on a daily basis, the standards for human drivers need raising. The issue is more lax humans than it is amazing robots.
it’s hard to change humans. It’s easy to roll out a firmware update.
Raising the standards would result in 20-50% of the worst drivers being forced to do something else. If our infrastructure wasn’t so car-centric, that would be perfectly fine.
:Looks at entire midwest and southern usa:
The bar is so low in these regions you need diamond drilling bits to go lower.
What’s a zipper merge?
Screams in Midwestern
I have spent many years in both the midwest and the south.
In some areas of the south, people drive extremely aggressively and there are lots of issues with compliance to various traffic laws but it is usually not difficult to get over if you need to. People will let you in. The zipper merge is a well-honed machine and almost everyone uses it and obeys it.
In the midwest, drivers tend to be more docile, cautious, and lawful overall, but have an extreme sense of entitlement over their place in line. “How dare that person use that completely empty lane to get ahead of me! Can they not see there is a line!” They will absolutely not let you in. It does not matter if the zipper merge would improve traffic flow. It just is not going to happen.
“You don’t have to be faster than the bear, you just have to be faster than the other guy”
We always knew good quality self-driving tech would vastly outperform human skill. It’s nice to see some decent metrics!
My drive to work is 8 minutes. This morning I almost had a crash because a guy ran a stop sign. I don’t think the bar is very high at this point.
That’s the beauty of it - we’ve only just begun to improve the situation. It’s going to get better and better until eventually traffic accidents are a rarity.
deleted by creator
Indeed
“After 6 miles, Teslas crash a lot more than human drivers.”
So only drive 5 miles. I guess that’s good advice in general
I hate felon musk but I honestly believe their self driving tech is safer than humans.
Have you seen the average human? They’re beyond dumb. If they’re in cars, it’s like the majority of them are just staring at their cell phones.
I don’t think self driving tech works in all circumstances, but I bet it is already much better than humans at most driving, especially highway driving.
I think the fair comparison would be humans that drive legally.
Idiots that drive high or drunk or without prescription glasses or whatever, shouldn’t count as “normal” human driving.
In the same way, a self driving car can have issues that will make it illegal. The problem is that a legal self driving Tesla is not as safe as a legal person. It sees poorly at night, and it gets confused in situations people handle routinely. And Tesla is infamous for not stopping when the road is blocked from 1m and up, and for breaking without reason. I’ve seen videos where they demonstrated an unnecessary break every half hour!! And a large part of that was on the German Autobahn, which is probably some of the easiest driving in the world!!
I think the fair comparison would be humans that drive legally.
Humans don’t drive legally. I don’t believe for a second there is a human on this planet who has never violated a rule of the road. The easy default is that we all speed.
Who hasn’t done a rolling stop at a stop sign? Taken a turn they legally shouldn’t have? (No U turns? lol) Taken a right on red when it says not to but there’s literally nobody around?
Cell phones are mostly illegal everywhere while driving and if you look around almost everyone is staring at them.
This mythical person who never, ever does anything against the rules is impossible.
And Tesla is infamous … for breaking without reason.
No notes!
We discussed the test here on Lemmy a few days ago. It was a German 6 hour test.
I can’t find the video that was debated, but you can’t be serious about not knowing about this issue?!?!?
It’s a years-old issue that is still not fixed!!! It’s commonly known as Phantom breaking, AKA breaking without reason. https://www.carscoops.com/2025/02/german-court-finds-teslas-autopilot-defective-after-lawsuit/
The way I edited the quote, it was just a joke about braking vs breaking.
Like, I could make a pedantic reply about spelling, but no: Teslas in fact brake unexpectedly AND break unexpectedly. So, no notes!
Ah, OK, I didn’t notice that. English is my 2nd language.
No worries. I’m glad I explained it then!
The first thing that comes to mind for popular media using “no notes” the way I did is probably John Oliver. I spent 10 seconds searching for a clip or a montage of him saying it but came up empty.
Bro I saw a video of their car drive through a wall and hand the controls back to the driver. No, it absolutely is not.
When was the last time you saw a “wall” erected on a freeway that was perfectly painted to mimic the current time of day, road, weather, etc.? I’m not talking about that staged example; I’m talking about the real world.
The answer is never.
Yes, the optical sensors are fooled by an elaborate ruse that doesn’t exist in real world operating conditions on a highway.
I still argue that for most normal driving circumstances, it is massively safer than humans who malfunction constantly.
I will never, ever buy a tesla so long as felon musk has any ownership in it whatsoever. The guy is irredeemable. Still have way more faith in self driving tech overall (industry wide) than human drivers though. That’s the work of engineers, not an asshole.
Human drivers have an extremely long tail of idiocy. Most people are good (or at least appropriately cautious) drivers, but there is a very small percentage of people who are extremely aggressive and reckless. The fact that self driving tech is never emotional, reckless or impaired pretty much guarantees that it will always statistically beat humans, even in somewhat basic forms.
It’s all about the whole Dunning-Kruger effect, where most people just know nothing despite thinking otherwise, right?
I honestly believe their self driving tech is safer than humans.
That’s how it should be. Unfortunately, one of the main decision makers on Tesla’s self driving software is doing their best to make it perform worse and worse every time it gets an update.
Your username is a lie huh?
Removed by mod
I used to hate them for being slow and annoying. Now they drive like us and I hate them for being dicks. This morning, one of them made an insane move that only the worst Audi drivers in my area do: a massive left over a solid yellow, with no stop sign, with me coming right at it before it even began accelerating into the intersection.
As a techno-optimist, I always expected self-driving to quickly become safer than humans, at least in relatively controlled situations. However, I’m at least as much a pessimist about human nature and the legal system.
Given self-driving vehicles that are demonstrably safer than humans, but not perfect, how can we get past humans taking advantage of them, and the massive liability for the remaining accidents?
How are those robot food delivery boxes by Starship doing?
Evolution took a billion years too, so it’s kinda fair to say “well, vehicles need some training”.
Thing is, the end goal after sorting out all the bugs in the AI is no human-driven cars, since having both will only lead to crashes due to the AI being unable to predict a human. All the AI cars would be linked to a central system to communicate with each other and always know where each other are. Then all we have to do is make sure people only use the crosswalks, and traffic accidents will be solely due to idiots.
I doubt a central system would ever be viable, but they would certainly communicate to other nearby cars with more than just blinky lights
I live in Phoenix, Arizona, and these are all around. Honestly, I feel like in the future everyone will have Waymo-type services and no one will own cars or even need to learn how to drive one. Who needs to worry about car repairs, insurance, etc.?
I’ve ridden in them a few times, even fallen asleep. I trust a Waymo more than most human drivers. The best test of its capabilities I saw was when school let out and the side road was covered in kids, parents, and cars waiting in random spots. It stayed in the “lane” (no lane lines) and calmly navigated forward as people gave it space. I was in the car the whole time. There are still some issues to be ironed out, but ultimately I don’t think I have ever had a bad riding experience.
Makes sense. There’s less automated cars than human drivers. Human drivers have also been around way longer.
They accounted for that in this report. I believe you are a troll.
I believe you are a troll.
Then you don’t know what trolling actually is.
Okay, I’m sorry. Let me clarify how it’s easy to account for the kind of bias you’re talking about: simply divide by the population count. So they divided the Waymo crash count by the number of Waymos, and the human crash count by the number of humans. This gives the Waymo crash rate and the human crash rate. (In reality it’s a bit more complicated, since the human crash rate is calculated independently each year.)
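A minimal sketch of that normalization (every number below is made up purely for illustration; they are not Waymo’s or anyone’s actual figures):

```python
# Hypothetical counts, invented only to illustrate normalizing by
# population size. These are NOT real statistics.
waymo_crashes, waymo_fleet_size = 10, 700
human_crashes, human_driver_count = 6_000_000, 230_000_000

# Dividing each crash count by its population gives comparable rates,
# which removes the "there are fewer Waymos than humans" bias.
waymo_rate = waymo_crashes / waymo_fleet_size          # crashes per vehicle
human_rate = human_crashes / human_driver_count        # crashes per driver

print(f"Waymo: {waymo_rate:.4f} crashes/vehicle")
print(f"Human: {human_rate:.4f} crashes/driver")
```

(A per-mile rate would be fairer still, since fleets and drivers log very different mileages, but the per-population version matches the explanation above.)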
Let me clarify further: It was an attempt at humor, and not meant to be taken seriously as you are doing.
Ah. Sorry. There are some truly braindead takes on autonomous vehicles so I couldn’t tell that apart from what some people have said earnestly. My bad. 👍
I do think it would be much safer with zero human drivers and only autonomous vehicles on the road, for sure. But I also think it would be impractical to replace everything all at once. Even the best programmed thing would eventually encounter a human driver that defies all previously known data and freaks out the computer.
I don’t know anything about how autonomous vehicles work. As far as humans doing unusual things: well, assuming the human driver only steers the wheel and controls the gas and brakes, it should be possible with existing technology to avoid crashing into them at least as well as any human can. So that leaves really unusual things, like the human hopping out of their car in the middle of an intersection, as the high-hanging fruit to model. I would imagine for most of these really strange cases, even if the autonomous vehicle can’t understand what’s happening, it can at least realize that something strange is happening and then pull over.
Obviously there will be truly unusual situations that cause fatal collisions. So long as that is at a lower rate, then what’s the safety concern?
Safety is a red herring IMO, as better code can fix it. There are much worse potential problems that autonomous vehicles will cause than rare collisions. NotJustBikes has a lot of points I’d never considered before in the second half of this video. (The first half, though, I found aggravating; it’s just about solvable safety risks.)
*human drivers remotely controlling cars crash less than humans directly controlling cars
But it’s not like that. There’s some kind of ML involved, but they also had to map out their entire service area, etc. If something goes wrong, a human has to come out and drive your driverless car lmao
Most trips require remote intervention by one of their employees at least at some point.
That’s what happens when you have a reasonable sensor suite with LIDAR, instead of trying to rely entirely on cameras like Tesla does.
And are limited to highly trained routes. There’s a reason you only see them in specific neighborhoods of specific cities.
deleted by creator
people … can drive
Citation needed
People have a brain. Well most people. AI is no replacement for brains.
Tesla go durrrrr
Tesla go 🔥🔥🔥🔥
At least the repair for a camera-only front is cheaper after the car crashes into a parked white bus
/s