Human drivers keep rear-ending Waymos
Comments
dmurray
If you brake more aggressively, you will be involved in more accidents, but in fewer accidents that are considered your fault (courts and insurance companies will almost always find against the rear-ender).
You might also be involved in less serious accidents: a low- or moderate-speed rear-end collision is safer overall than hitting a pedestrian, T-boning someone, or colliding at the higher pre-braking speed.
That's a trade-off that Waymo presumably knows about and is making intentionally. It would be interesting to know what they prioritise: fewer overall deaths and injuries, or fewer accidents where they were determined to be at fault?
Suppafly
>If you brake more aggressively, you will be involved in more accidents, but in fewer accidents that are considered your fault (courts and insurance companies will almost always find against the rear-ender).
I was talking to a kid once who was bragging about how his dad always got rear-ended, due to aggressive braking, but it was 'never his fault'. I was like, "your dad is an asshole; that isn't something to brag about."
potato3732842
That kid's dad is exactly why insurers push the OBD2 dongles and cell apps so hard.
The insurers want to know who those people are so they can jack up their rates preemptively, because there's a non-trivial chance that (a) both parties in a collision have the same insurer and (b) the rear-ended party's conduct is both indefensible and caught on camera, so their insurer will wind up paying out anyway.
bayouborne
It's a variation on a tired joke:
If you've been rear-ended, chances are the person behind you wasn't a very good driver. If you've been rear-ended 7 times, chances are that you are the person who isn't a very good driver.
AtlasBarfed
Yes, and what do people do when they're behind somebody who's chronically slow and incompetent?
They tailgate them
Doxin
I'd file tailgating under "incompetent". Being impatient is no excuse.
autoexec
Insurers want those things to jack up all of our rates no matter how we drive. Give them enough data and they'll find something to justify screwing you over no matter how you drive.
mint2
That's flat-out wrong.
That's not how state regulators permit insurers to set rates, and it's also a good way to go out of business due to what's known as adverse selection.
autoexec
It's already been happening. https://www.nytimes.com/2024/06/09/technology/driver-scores-...
There's only one state in the US that prevents data collected from cars from being used to increase your rates. Drive at night? Your rates go up. Drive "too often"? Your rates go up. Hit the brakes and successfully avoid hitting an animal or person suddenly running into the road? Your rates go up. Good luck counting on captured regulators to protect you. They did nothing to stop insurance companies from using records purchased from data brokers for ages. The EFF is calling on them to do more, but don't expect much. https://www.eff.org/deeplinks/2023/06/steering-mobility-data-...
mint2
Look, once again that’s not how rates work.
What you’re describing is segmentation, which is another way of saying geico charges the people without speeding tickets less than those with speeding tickets.
If ACME co decides to charge everyone the same, then the drivers without speeding tickets will go switch to geico leaving ACME holding the bag of speeders… and losing a ton of money since prices weren’t set for that.
Swap speeding tickets for hard braking or whatever other behavior is correlated with accidents.
Also, as far as fairness goes, using actual driving behavior sounds far better to me than using credit scores. But strangely, I don't see nearly the same level of complaints about that.
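A toy numeric sketch of the adverse-selection point (every number below is hypothetical): if a flat-rate insurer competes with one that segments, the low-risk drivers defect and the flat premium no longer covers the pool that remains.

```python
# Hypothetical market: two driver types with different expected claim costs.
safe_cost, risky_cost = 400, 1200   # expected annual claim cost per driver
n_safe, n_risky = 80, 20            # drivers of each type

# ACME charges everyone the blended average cost:
flat_premium = (n_safe * safe_cost + n_risky * risky_cost) / (n_safe + n_risky)
print(flat_premium)  # 560.0

# A segmenting competitor offers safe drivers ~400, so they all switch.
# ACME keeps only risky drivers, each costing 1200 but paying 560.
acme_shortfall = n_risky * (risky_cost - flat_premium)
print(acme_shortfall)  # 12800.0 per year, and growing as word spreads
```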
kube-system
You really can't control the behavior of the driver behind you; you just have to trust that they're going to do their job of not hitting things in front of them, just as you are. If humans are hitting things in front of them more often because they're concerned that the person behind them might not stop, that's probably indicative of a flaw in human driving behavior. The rules surrounding accident fault exist because of the individual responsibility assigned to driving behavior. If someone would have rear-ended a hesitant Waymo, they would have rear-ended an equally hesitant human.
dmurray
It obviously wouldn't be ethical for Waymo to brake aggressively just to catch out humans tailgating them too closely (at no risk to Waymo, after all, since the insurance of the car behind them will pay for damage and let's say they do this only when there are no human occupants). Even though it's a "flaw" in the tailgater's behaviour and even though he will have the responsibility assigned to him.
So is it ethical to brake aggressively to avoid a 1% chance of an accident up ahead? 5%? 0.1%?
This is a real life trolley problem, and the answer can't be as simple as "always brake, feel no remorse because it's the other driver's problem" even if that might be a good heuristic for a human to teach in driver's education.
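For what it's worth, the heuristic being debated can be written down as a crude expected-harm comparison. Every probability and severity weight below is invented purely to illustrate the shape of the trade-off, not to describe how Waymo actually decides:

```python
# Crude expected-harm comparison for the braking dilemma.
# Severities are arbitrary units; probabilities are hypothetical.

def expected_harm(probability: float, severity: float) -> float:
    """Expected harm of an outcome = its probability times its severity."""
    return probability * severity

# Braking hard: say a 30% chance of a low-speed rear-end crash (severity 1).
harm_if_brake = expected_harm(0.30, 1)

# Not braking: say a 1% chance of hitting a pedestrian (severity 100).
harm_if_continue = expected_harm(0.01, 100)

print(harm_if_brake, harm_if_continue)  # 0.3 vs 1.0 -> braking still "wins"
```

Under these made-up numbers, hard braking is justified even at a 1% chance of the forward accident; the answer only flips when the forward risk gets very small or the rear-end risk gets very severe.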
kube-system
That isn't just unethical, it's illegal. You can't intentionally cause an accident. The Waymo would have the responsibility assigned to it if it were determined that it was programmed that way intentionally.
crote
It doesn't need to be intentionally programmed to cause an accident. The far more likely scenario is that it simply doesn't take into account what's behind them. It causes accidents by unintentional negligence, simply because nobody bothered to add a check.
Human drivers are taught to avoid scenarios like that. During drivers' lessons you will be taught to check your mirror before braking, and to activate your hazard lights when there's a sudden stop on a highway to alert the drivers behind you. Basic self-preservation makes it pretty clear that you don't want a car crashing into your rear.
Waymo doesn't need to care. If the other vehicle is considered at fault, their insurance is going to pay for your damages. It's at worst a minor inconvenience, so why bother spending engineering time on writing code to avoid it?
gruez
>Human drivers are taught to avoid scenarios like that. During drivers' lessons you will be taught to check your mirror before braking
If I'm going to crash into something I'm going to be slamming the brakes without checking my mirrors first.
toast0
Priorities. You should brake immediately to avoid an impending collision, of course.
OTOH, if you're driving and the light changes to yellow and you have room to stop, you should check your mirrors to see if there's a vehicle behind you that's unlikely to stop.
Additionally, in the impending collision decision, once you've taken initial action, you'll want to check on traffic to see if you have an option to evade the obstacle. And if you don't have an out, if there's a vehicle behind you and try to maintain space ahead and behind if possible.
gruez
>OTOH, if you're driving and the light changes to yellow and you have room to stop, you should check your mirrors to see if there's a vehicle behind you that's unlikely to stop.
If there's room to stop, then I'll still be braking immediately, because braking immediately gives me the most time to brake, which makes the deceleration as gradual as possible and gives the person behind me the most time to react. I'm not sure what the alternative is supposed to be. Delay braking but brake harder later? Gun it and try to run the yellow?
happymellon
No, you should brake.
It sounds like people defending their terrible driving.
mlazos
I had to retake a driver's test recently, and I've never heard of checking your mirrors before a yellow. What does "a vehicle unlikely to stop" even mean? Like I can ascertain that in a second? The driver behind you is supposed to be traveling a minimum safe distance behind you for their reaction time. If they aren't it's 100% their problem.
toast0
> If they aren’t it’s 100% their problem.
It's your problem too if you stop for a yellow and they don't. They may be 100% at fault, but you're delayed at best, at worst may be injured and may not have a driveable car, and they may not carry insurance, so you may have a large, uncompensated loss.
By 'unlikely to stop', I really mean 'unlikely to stop without striking your vehicle if you were to stop': things like following too closely, clearly not paying attention (which is hard to tell in a quick glance, but maybe you noticed their lack of attention before and haven't had time or space to make space), driving a large vehicle that needs more room to stop, etc. I likely wouldn't stop on a fresh yellow if there was a trash truck following me closely, for example. Even at the extreme where I end up running a fresh red, it's likely a better result than getting rear-ended by a vehicle that's around 10x the weight of mine.
gruez
>By 'unlikely to stop', I really mean 'unlikely to stop without striking your vehicle if you were to stop': things like following too closely, clearly not paying attention (which is hard to tell in a quick glance, but maybe you noticed their lack of attention before and haven't had time or space to make space), driving a large vehicle that needs more room to stop, etc. I likely wouldn't stop on a fresh yellow if there was a trash truck following me closely, for example. Even at the extreme where I end up running a fresh red, it's likely a better result than getting rear-ended by a vehicle that's around 10x the weight of mine.
At best that's an argument for not slamming on the brakes on a yellow if you already know someone is following you, not for wasting precious seconds diverting your attention to your rear-view mirror and trying to make a judgment call. In critical moments like that you really can't afford to divert your attention to whatever is happening behind you, because there could be important stuff happening in front of you as well (e.g. the car in front decides to slam on the brakes). If you're really in a situation where you think you can't safely come to a stop because there's a truck tailgating you, you really should get yourself out of that situation rather than trying to run the next yellow.
JohnFen
> The far more likely scenario is that it simply doesn't take into account what's behind them.
Which would make it a bad driver.
hoseja
Good luck extracting intention from the driving AI.
tstrimple
> (at no risk to Waymo, after all, since the insurance of the car behind them will pay for damage and let's say they do this only when there are no human occupants)
That vehicle will still be taken out of rotation for repairs. In situations like that, does the insurance typically cover lost revenue as well? Otherwise there's still a ton of risk for Waymo if it's a typical deductible + repair costs. I genuinely know nothing about insuring a business like Waymo.
kube-system
Depends on state law:
https://www.mwl-law.com/wp-content/uploads/2019/01/LOSS-OF-U...
But generally, yes, loss of use is something that courts recognize as a valid claim.
clort
> you just have to trust that they're going to do their job of not hitting things in front of them, just as you are.
That's a very human-centric viewpoint, understandably. A human driver can only see in one direction, and should be looking in the direction where things are happening the vast majority of the time. A self-driving car can see in all directions, and in fact is looking in all directions at all times. Perhaps the rules will be updated at some point.
kube-system
No, it's not a human-centric viewpoint. It's a physics-centric viewpoint. When one vehicle follows another, the rear vehicle is on a collision course.
Human drivers are required to be aware of vehicles in all directions at all times.
potato3732842
You only need to train one person to drive to realize there's a huge gulf between the bare minimum you're legally mandated to do when driving and what you should do to make driving less dangerous and less frequently unpleasant.
crote
> A human driver can only see in one direction
Cars have mirrors. Any halfway-decent drivers' education will teach you to regularly check them, especially when you're about to brake.
Gigachad
It doesn’t really matter what’s behind you. If there is a danger in front of you and you don’t brake, it’s your fault, if you hit the brakes and someone rear ends you, it’s their fault for not keeping a safe braking distance.
You’d never design your system to intentionally be at fault.
d13
Safe braking distance is 2 seconds of travel between you and the car in front of you. In my experience almost no one ever follows this guidance, and therefore, yes, the rear-ender is always at fault.
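For concreteness, the two-second rule is just speed multiplied by two seconds; a quick sketch (the mph values are arbitrary examples):

```python
# Following distance under the two-second rule: distance = speed * 2 s.

def following_distance_m(speed_mph: float, gap_seconds: float = 2.0) -> float:
    speed_ms = speed_mph * 0.44704   # mph -> metres per second
    return speed_ms * gap_seconds

for mph in (25, 45, 65):
    print(f"{mph} mph -> {following_distance_m(mph):.1f} m")
# 25 mph -> 22.4 m
# 45 mph -> 40.2 m
# 65 mph -> 58.1 m
```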
Gigachad
I’m imagining some kind of future system where self driving cars are constantly scanning for tailgating, and when it happens, the number plate is logged and the data sold to insurance companies so they can adjust for high risk drivers.
mikhailfranco
Modern traffic radars have enough spatial and velocity resolution to apply a tailgating measurement. There are existing speed camera systems that can detect and ticket vehicles for tailgating, after a manual review of video and telemetry data (often just a quick glance and mouse click from a police officer).
There are relatively few jurisdictions that have a concrete quantitative definition of tailgating. So it's not yet widespread, but law enforcement certainly has the option available now.
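A minimal sketch of the measurement such a system could apply, assuming the radar reports the gap to the lead vehicle and the follower's speed; the 2-second cutoff is an illustrative choice, not any jurisdiction's actual definition:

```python
# Time headway: seconds until the follower covers the current gap.

def time_headway_s(gap_m: float, follower_speed_ms: float) -> float:
    if follower_speed_ms <= 0:
        return float("inf")  # a stopped vehicle isn't tailgating
    return gap_m / follower_speed_ms

def is_tailgating(gap_m: float, follower_speed_ms: float,
                  min_headway_s: float = 2.0) -> bool:
    return time_headway_s(gap_m, follower_speed_ms) < min_headway_s

# A car 12 m behind another at 25 m/s (~56 mph) has ~0.5 s of headway:
print(is_tailgating(12, 25))  # True
```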
hnburnsy
The future will be that California passes a law mandating that newly manufactured cars cannot tailgate at all, just like the one pending that beeps at you when speeding.
In all seriousness, eventually, driving will be taken away from humans.
bitnasty
How can you check all the mirrors at the same time while also not looking away from the front?
throwup238
> A human driver can only see in one direction
And it's usually in the direction of their phone. Even while driving.
83
You might not be able to control the behavior of the driver behind you, but you can certainly influence it. When there's someone close behind me and I know I'm approaching a situation that might require me to make a sudden stop, I often hold the brakes just enough to trigger the brake lights. This lets the person behind know to back off, or at least be ready.
I'm curious if Waymo does any sort of warning brake lights like that if chance of panic braking > 15% or something.
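A sketch of what that warning logic might look like, assuming the planner exposes an estimated hard-brake probability and a rear-headway measurement; none of these names correspond to anything Waymo has published:

```python
# Hypothetical early-warning brake lights: light the lamps (without
# actually braking) when a hard stop looks likely and someone is close behind.

HARD_BRAKE_PROB_THRESHOLD = 0.15   # the 15% figure from the comment above
MIN_SAFE_HEADWAY_S = 2.0           # assumed "too close" cutoff

def should_preflash(p_hard_brake: float, rear_headway_s: float) -> bool:
    return (p_hard_brake > HARD_BRAKE_PROB_THRESHOLD
            and rear_headway_s < MIN_SAFE_HEADWAY_S)

print(should_preflash(p_hard_brake=0.20, rear_headway_s=1.1))  # True
print(should_preflash(p_hard_brake=0.20, rear_headway_s=3.0))  # False
```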
isthatafact
> "If you brake more aggressively, you will be involved in more accidents"
Is that statement based on evidence?
While that seems plausible, it also seems counter-intuitive to me. I suspect that many crashes happen because someone was afraid to apply maximum brakes, either out of fear that someone might hit them from behind, or simply due to being unfamiliar with braking hard.
dmurray
It's true at some points in the parameter space, not at others. If you never brake at all, starting to brake more aggressively would be great!
Once you've tuned your parameters to be driving pretty safely and minimizing accidents, which is what Waymo is surely doing for some definition of "minimizing accidents", it must be true that every further parameter change involves a tradeoff like this.
happymellon
If you brake aggressively and someone rear-ends you, could it not also be that the person behind was too close in the first place?
Not to be rude, but American drivers are not normally considered great, and San Francisco is number 7 on the list of US cities where you are most likely to get into an accident.
https://www.forbes.com/advisor/legal/auto-accident/cities-mo...
maxerickson
The degree to which people tailgate is incredible. I live in a small town and people still act like it's going to take forever to get to where they are going.
One of 4 cars on a city street? Better try to push the car in front of you along, even if the other 2 in front of them are going the same speed.
Driving 2 miles on a 45 mph road? Better crowd the car in front of you that is already going 50 mph.
And then you get to the freeway where people think it's a great idea to try to push the car in front of you out of the way so you can weave through a little bit of traffic.
CalRobert
When I was taking motorcycle safety, they made it clear you needed to ride only at a speed where you could stop in the space you knew to be clear. That means if the car in front of you turned into a stationary boulder, you needed to be able to stop.
Hopefully human drivers learn to apply the same wisdom to their own driving.
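That rule is easy to quantify: total stopping distance is reaction distance plus braking distance, d = v*t_reaction + v^2/(2a). A sketch with assumed, not authoritative, values for reaction time and deceleration:

```python
# Stopping distance = reaction distance + braking distance.
# Assumptions: 1.5 s rider reaction time, 7 m/s^2 hard deceleration.

def stopping_distance_m(speed_ms: float, reaction_s: float = 1.5,
                        decel_ms2: float = 7.0) -> float:
    return speed_ms * reaction_s + speed_ms**2 / (2 * decel_ms2)

for kmh in (30, 50, 100):
    v = kmh / 3.6  # km/h -> m/s
    print(f"{kmh} km/h -> {stopping_distance_m(v):.0f} m to stop")
# 30 km/h -> 17 m
# 50 km/h -> 35 m
# 100 km/h -> 97 m
```

If the space you know to be clear is shorter than that figure, the rule says you're going too fast for what you can see.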
nh2
Indeed.
Most countries' driving laws, for good reason, demand that you maintain enough distance to the vehicle in front of you so that if it hard-brakes unexpectedly, for _whatever reason_, you will not touch it.
Lots of people ignore that. They are bad drivers. This normalises bad driving.
Robot cars being very good at braking quickly just makes that more obvious, and will hopefully educate more people to drive in practice the way they should have been driving all along.
watwut
I tried that. What happens is that other cars will put themselves into the space in front of you. If you slow down to make more space, it will happen more and more.
Basically, it is impossible to keep that distance on a half-busy road.
ghaff
I was driving someone in for an appointment in a particularly insane area of a city where I don't often drive. They were somewhat gently chiding me for not adjusting my driving to be more aggressive. And this is someone who I would consider a fairly careful driver.
CalRobert
Sadly true, and it drove me insane when I was riding in San Diego. It helps when places actually ticket people who tailgate.
nytesky
Absolutely true. The worst is thru lanes and backed up turn lanes, where if you aren’t Gandalf on the stone bridge, you will be cut off by hundreds of cars from the “thru” lane.
whaleofatw2022
Tbh it's gotten worse post covid. Following distances seem shorter than ever and on top of that people are doing 10-15 over when in 2019 it was only 5 over in most of my state.
thirdtruck
Only 10-15 over? Here in Connecticut it's common to have someone pass you on the right at 20-25 over the limit (putting them at an absolute speed of 85-90 MPH). Tailgating is constant. Xe can't wait to move back to NYC where the drivers are much, much saner. Seriously.
nytesky
I hate when I happen to end up in left lane (construction merge from left etc) and I put my blinker on but people keep passing on right rather than let me get out of the way.
ghaff
Things seem to have gotten somewhat back to the somewhat crazy "normal" around where I live but there did seem to be a period when people were getting back on the road more when a lot seemed to have forgotten how to drive.
freeopinion
As a human driver I have to be prepared to drive behind first-year drivers, teenage racers, great-grandmas, people late for work, people on a Sunday tour, people putting on their makeup, people talking on their phone.
I don't really see autonomous vehicles as adding some vastly different curve at me. Wait, what?! They obey the law? How will I ever adjust to that?
HarryHirsch
Experienced drivers have a mental model of other drivers - when you see someone driving slowly and erratically you know to keep at a distance, that fellow is going to hit the brakes suddenly or make an unexpected turn.
Waymos on the other hand, they are so new that other traffic participants don't know what to make of them. Maybe it improves in the future because people get more experienced, maybe it doesn't because the alien intelligence is not compatible with human intelligence and those things are inherently unpredictable. We'll see.
moffkalast
That is a problem isn't it? For human drivers seeing a line of cars drive 20 over the speed limit almost always makes us follow herd instinct and "keep up with traffic" even if it's technically illegal. Something a self driving car (I presume) won't do. The problem is entirely on the human side though and needs to be solved there.
nytesky
So self driving cars do not speed. Ever?
mnahkies
I was going to make a similar comment. One important aspect of being a safe driver is being predictable. Cars (or cyclists/pedestrians for that matter) that suddenly change speed or direction without warning create a lot of risk.
Edit: I've been downvoted, so as clarification: I mean a sudden change without any obvious/predictable reason. Think of the person who realized too late that they wanted to turn here and, instead of continuing and turning around, slams the brakes. Yes, you need to be ready for this, but it's still more risky.
CalRobert
One big issue is that pedestrians include kids. And kids change speed or direction very suddenly. In that scenario, the risk comes from the car, though, not the kids. Unless getting bumped into by a 3-year-old is fatal.
Having a child who's a runner made it clear just how much the world is literally asphalt ribbons of death right out everyone's front door.
specialist
> One important aspect of being a safe driver is being PREDICTABLE.
Bingo. (Emphasis mine.)
Design knowing full well that humans are error prone, and account for that. Wisdom popularized by Donald Norman's book Design of Everyday Things [1988].
We added a 3rd brake light, distinct from left & right turn signal lights, to reduce rear-ending. It worked, right?
Changes will be needed to accommodate robotaxis. For instance, one suggestion I read, repeated here without any judgement, is adding blue running lights so that humans know a vehicle is autonomous.
--
From the hip, because I don't know anything about robotaxis or road safety rules: Waymo could use those blue lights to signal when someone is tailgating too closely. As in too close for the robotaxi's own stopping distance.
Who knows? Someone will figure something out.
yadaeno
Safety is the thing we care about.
Let's not get caught up handwringing about predictability, since it's very possible that being less predictable is safer overall.
We have the safety data to prove that Waymo is safer than human drivers, so I don't think this predictability thing is that important after all.
specialist
Safer for occupants.
Ylpertnodi
>We added a 3rd brake light, distinct from left & right turn signal lights, to reduce rear-ending.
...no orange turn signals, then?
kjksf
Consider the famous autonomous Uber crash that killed a person illegally crossing the road at night.
The right thing to do for the autonomous software would be to brake hard to avoid killing that person.
If there was a car behind that Uber, it would likely rear-end it. That would be preferable to killing the human. The software would be right. Unpredictable, but right.
If the rear-ending happens because Waymo erroneously phantom-braked (i.e. it braked because the software was confused), then the software would be wrong.
Without analyzing rear-ending accidents to decide whether the software was right or wrong, we can't say if Waymo did the right thing or not.
In human-to-human crashes, the rear-ender is presumed guilty.
bryanlarsen
The rear-ender is always guilty. You have to be able to stop safely if the car in front of you stops abruptly for something you cannot see.
The person in front may also be guilty, if they stop for no good reason, or a malicious reason (aka brake checking).
potato3732842
The rear-ender is guilty by default in the absence of other information. You can always prove otherwise if you have the evidence. Preventing other people from gaming this default to your detriment is arguably the main point of dashcams.
In ye olden days, before 1080p everywhere, it was considered somewhat common courtesy to hang around and make a statement to the police if you witnessed a rear-ending that was undeniably the fault of the party who got rear-ended.
kube-system
It is hard to paint with a broad brush here; rules for determining fault vary by locale (sometimes significantly), but there are absolutely situations where your vehicle might strike the rear of another vehicle and you are found completely free of liability.
Often, situations like:
* You are stopped in traffic, the driver in the front reverses into you.
* Traffic comes to a quick stop, you stop, and the driver behind you rear ends you and pushes you into the vehicle in front of you.
Brake checking and unsafe lane changes might result in a split of fault in places that assign comparative negligence, which might be more of what you're thinking of. Often the idea here is that the driver in the rear could have done something more to avoid the accident. This is less likely to be determined to be the case when a vehicle is completely stopped in traffic and the driver has no options.
bryanlarsen
I wouldn't call a situation where somebody backs into you "rear-ending".
ghaff
You can definitely cause accidents without technically committing a traffic violation. We all assume that other people behave "normally" in traffic while providing some margin.
falcolas
> You can definitely cause accidents without technically committing a traffic violation.
Yup. Just drive the exact speed limit on every highway you're on. And on wider highways, you can do that in the left lane, since you're at the maximum legal speed.
Driving safely and driving perfectly legally are not always compatible.
ghaff
There are plenty of stay right except to pass signs but I expect that getting ticketed for driving the speed limit in the left lane is pretty rare.
falcolas
Ticketed, no. Hit by traffic which is all going 30 over, yes.
spacecadet
I tend to agree with you. I have decades of driving experience across a huge range of model years and vehicle types. I'm very cautious and maintain distances, as I'm often driving something with subpar stopping power or lacking ABS. My observations have been similar, first when braking assist appeared and now with self-driving. Those vehicles are quicker to react, more prone to over-react, slower to return to normal, and apply too much braking power. I've watched cars in front of me hard-brake in a series of what appear to me as erratic decisions, because of what I assume is noise created by traffic patterns, pedestrians, and latency. I also wonder if humans will adjust their behavior... people seem so distracted while driving today.
JohnFen
> It's possible to confuse other drivers without technically making a mistake
Exactly this. One of the most important things that enhances safety is to avoid doing anything that surprises others.
yadaeno
No, the most important thing is not causing a collision.
We don't need to guess here: Waymo is way safer than human drivers, both for car occupants and for pedestrians. So clearly optimizing for "not confusing human drivers" is not correct.
Human drivers frequently run into stationary objects and plow over bikers/pedestrians, so maybe there's a huge category of situations where they *should* be surprised.
JohnFen
> No, the most important thing is not causing a collision.
That's why I said "one of the most important", not "the most important". Driving in the least surprising way is one of the methods of avoiding accidents.
If Waymo's driving style means that they get rear-ended more than normal, that means that Waymo is behaving in a way that surprises other drivers. While it's true that the fault is the driver that did the rear-ending, it can also be true that the accident may have been avoided if Waymo drove in a manner that didn't surprise the tailgater.
Modified3019
>It's possible to confuse other drivers without technically making a mistake
Yeah, I experience that in my own truck (a 2022 Dodge Ram provided by and for work), which has this awful delay in acceleration. It's almost like it needs to "spin up" before there's power to move, and it's generally slow when it comes to moving up gears.
This results in people misjudging my vehicular body language, thinking that I'm waiting/indecisive when I'm not. So by the time I've started moving from a stop, the other guy is convinced I was letting them go first and is also moving.
Activating "Tow Haul" mode every time I start up mostly, but not entirely, mitigates the lethargy.
Also, the wheelbase is too damn long, which turns reasonable turns in parking lots into curb-scraping or 3-7 point affairs to get out of.
I've driven boats with better handling than this stupid vehicle.
Vrondi
What model truck do you have?
Modified3019
A 1500, with a 153.5 inch wheelbase
psychlops
Or if they have aggressive regenerative braking on their cars.
martindbp
I wonder if the NHTSA-stop is partially to blame
Log_out_
Or just vote them off the roads forever.
Animats
The classic rear-ending Waymo collision, from DMV reports, looks like this:
- AV approaches intersection where some obstacle blocks cross-street visibility.
- AV slowly enters intersection to get better visibility.
- AV detects cross traffic and stops quickly.
- Trailing human driver following too closely rear-ends AV.
Back when Google was testing in Mountain View, this was the most common accident. One particular intersection had a tree in the median strip, at a height that blocked the vehicle-top LIDAR, forcing a slow, careful entry to the intersection. At least two Google AVs were rear-ended there.
As the article says, there are no recorded incidents where a Waymo AV entered an intersection with legit cross-traffic and was hit. There's one where a human driver ran a red light, clearly not the fault of the AV.
One partial solution would be to have AVs flash their brake lights rapidly when they're in this may-brake-suddenly situation. That would warn humans to back off. AVs know when they're being tailgated.
xnx
Same source information as "Human drivers are to blame for most serious Waymo collisions" 3 days ago | 94 comments https://news.ycombinator.com/item?id=41516934
harmmonica
Since Waymo is very heavily California-biased at this point, a possible explanation for this, and one I think is responsible for a lot of rear-end collisions all across the world, is the "California roll." For those unfamiliar, in California (and, again, almost everywhere) it's super typical for people to approach a stop sign and not actually come to a complete stop. Instead, the driver slows down and then continues to "roll" through the stop. Many drivers here in CA expect the driver in front of them to do the California roll and so their calculation of when they will reach the actual stop sign/line will be wrong if they assume the car in front of them will roll.
I've said this on here many times before, but one of the reasons I love riding in Waymos is because they (in my experience) obey all traffic laws to the letter of the law. So if there's a stop sign, they all stop.
Would love to know the specifics of these rear-end collisions because I'd bet that they're either California rolls at stop signs or doing the same roll behavior when turning right on red traffic lights.
fallingfrog
Funny, in New England we call this a “New Jersey stop”.
baggy_trough
Super typical behavior like this should be legalized. It is super typical because it is safe and efficient.
ghaff
Well, and a lot of the time you have to creep out to peek around various obstructions so, in practice, there really isn't a hard stop line because you need to go a bit further to see around that tree that hasn't been trimmed.
hilux
You believe that not stopping at STOP signs is safe?
baggy_trough
Yes, that is obviously the case, which is why almost everyone does it.
thirdtruck
Oh stars no. Xyr heart skips a beat every time a driver rolls right past the stop sign and the white line, especially when they practically put their nose in the traffic lane.
We put far, far too many entrances and exits directly into traffic here in the US. The behavior to normalize should be directing folks onto back roads with much saner speed limits.
edward28
Or just replace useless stop signs with give-way (yield) signs.
ghaff
And now you're normalizing not really stopping and taking a good look before proceeding, which is probably OK when merging but probably not at a lot of cross-traffic stops.
potato3732842
Putting stops in places they needn't be trains people to ignore them too.
hilux
I mostly see STOP signs at intersections. Those serve a purpose.
Where are you finding these unnecessary STOP signs?
AStonesThrow
They obey laws on the open road, but their parking decisions are ATROCIOUS around here. Taxi drivers always annoyed me by deliberately parking badly or even illegally, and Waymo follows suit. Red curbs, unsafe egress, blocking off three spaces outside my home, too far to walk, business that hates me. Sometimes I contact Support because there's no other way to move the thing when your trip's ended.
yial
I haven’t driven behind a waymo. So this is just curiosity.
But I have used and been behind other assisted-driving vehicles. (One thing I find exhausting with auto cruise is that people pass, usually on the right, and then constantly cut in if you have the distance set safely.)
As another commenter said about mental models of drivers, I wonder if part of it is that they brake sooner, and perhaps more completely.
Example: someone turning left. Most human drivers will slow and go around if safe. Does Waymo brake far away, and abruptly?
Again, I can't comment on Waymo, but I know my own vehicle will sometimes brake late and harder if set to auto, because of sensor distance. (I try not to let this happen and override and brake sooner, of course, gradually slowing vs. flying up on a detected stop or slowdown and then braking aggressively.)
robrenaud
I just moved to SF a couple weeks ago. In city driving, I really like driving near Waymos as opposed to human cars. You quickly learn that the Waymos drive well and passively, which enables safe, assertive driving near them. Being able to condition on them not being aggressive and dumb (like some fraction of humans drivers, mostly young/male) makes driving easier.
nkingsy
Strange headline for a very positive article about Waymo safety
cut3
Waymos are notorious for signaling for half a second and then cutting you off to merge. I've almost hit a couple that cut me off only to then immediately brake in front of me.
paulddraper
Makes you wonder about the decision to allow human drivers.
CalRobert
You're not alone! When the car showed up there were a lot of people very upset that their cities were now much more dangerous. The car industry pushed a lot of changes to push the blame for getting killed on the dead pedestrians. It helped that cars were much liked by the wealthy and disliked by the poor.
duxup
It was the best option at the time.
paulddraper
Fair
Fezzik
As a tangent: if you have not ridden in a Waymo you owe it to yourself to take a ride. I was in LA last weekend and took a 60 minute ride from my cousin’s house to a concert and, after flying, or maybe even before, it is the most futuristic thing I have ever experienced. And it feels far safer than any human driver I have been in a car with, including myself.
hindsightbias
I’ve observed Waymos pulling over in neighborhood streets when aggressively tailgated. They then continue on.
I’d submit aggressive drivers aren’t used to that.
AStonesThrow
I've logged 762 Waymo miles and over 2,500 ride minutes, as a passenger, so here goes:
Waymos indeed make odd and non-human maneuvers. A quite frequent one is inching forward jerkily as it pulls over (<5mph) at which point there's usually no-one moving nearby either.
Waymo has veered into oncoming lanes a few times, though traffic was stopped at that intersection. I've really never witnessed any risky maneuver that would endanger property, humans, or even warrant a ticket.
BUT... motorists and pedestrians HATE Waymo with a passion. I was literally spit upon when riding a scooter-share and it may get worse in those comfy white Jaguars.
No overt incidents yet, but several passers-by have vocally expressed their ire, and there is no shortage of folks who deliberately block our path just because I have no control. But I suspect violence will arise as the pressure rises.
hacoo
Yep, some incidents may be caused by human drivers being extra aggressive around the Waymo. I work in this industry and this definitely happens. That said, weird/unexpected braking is a frequent problem for self-driving cars too.
Kalanos
Probably stopping at yellow lights and for pedestrians at crosswalks. It's a good way to get rear ended in Boston.
wrp
This sounds eerily like the difference between men and women drivers, as in the traditional explanations for why male drivers are more often judged to be at fault.
carapace
I'm a fan of self-driving vehicles (can we please call them auto-autos?) because they are, as this article points out, already safer than human drivers.
Still, it is and always has been completely and totally insane to test these heavy fast robots on public streets. I never in a million years would have thought we would do that.
The obvious thing to do is:
A) Start with light slow robots, like a golf cart covered in foam, and work your way up to Knight Industries Two Thousand. Going directly for KITT is a fetish.
B) Test it in one of those artificial cities with, you know, people who have at least signed waivers to be human test dummies for your two-ton high-speed killer robots?
E.g.:
> Mcity is a 32-acre simulated urban and suburban environment that includes a network of roads with intersections, traffic signs and signals, streetlights, building facades, sidewalks and construction obstacles. It is designed to support rigorous, repeatable testing of new technologies before they are tried out on public streets and highways.
https://news.umich.edu/u-m-opens-mcity-test-environment-for-...
abound
You have described more or less exactly how Waymo did their rollout.
A. They used to have these cute little cars (e.g. golf cart sized) that couldn't go over 30 mph, long before they rolled out the Chryslers and Jaguars
B. My memory is fuzzy on the details, but I'm sure there's public info on this: Waymo owned or leased some massive plot of land where they did exactly that, testing all sorts of chaotic scenarios (weird construction, skateboards coming out of nowhere, etc)
carapace
That's great! It's probably why they haven't killed anybody yet.
dopylitty
The other thing that should be done is to create a uniform federal test that computer driven cars must pass before being allowed on the streets. The test would have a large variety of environments (fast, slow, construction zones, farmers markets) as well as conditions (wet, dry, light, dark). The test would also measure how the computer driven car reacts when one or all of its sensors are damaged. Only after passing the test should a car be allowed on the road. Any software or hardware changes would necessitate a full re-test.
None of this "well the manufacturer says it works so let's just let 'er rip" that has already gotten people killed.
And yes human drivers should have to pass the same test in the vehicle they intend to drive.
bastawhiz
> a uniform federal test
Just like the existing uniform state run test that shows how all licensed human drivers are safe? It shouldn't need to be pointed out how any kind of standardized test isn't going to prove or even realistically evaluate the safety of a system like this. We can't even get a standardized test for GPU performance that manufacturers don't cheat on.
> The test would also measure how the computer driven car reacts when one or all of its sensors are damaged.
I do agree with this, though: we already expect cars to behave well in the face of hardware failures. Autonomous or no, cars should have requirements for failure modes.
joked56
It's unfortunate that the National Electrical Code doesn't exist. In fact, it is unfortunate that we can't possibly have national regulations for anything at all. If only this were possible.
bastawhiz
> Still, it is and always has been completely and totally insane to test these heavy fast robots on public streets
Another comment pointed out that Google did roll out in an extremely safe way, but to this point: do you still consider this to be the testing phase? You might call it that but if we're at a point where "they are, as this article points out, already safer than human drivers" (and in particular, nearly 2^3 times better in some cases) that would suggest that the fundamental concept of the vehicles no longer need testing.
It's simply not possible to develop autonomous vehicles completely in simulated environments. There are two kinds of testing:
1. Testing that the car behaves appropriately such that it's suitable for public use. This is the phase where you're not sure that it's safe.
2. Iteratively improving the vehicle after it's been deemed safe. This is where you're rolling out changes that you believe will improve safety and comfort on an existing safe product.
Google already did #1. They did it for over a decade. If you believe that #1 and #2 are not two separate phases, then if #1 can't be evaluated with real human drivers you can't ever get to a place where you deem the vehicles safe. There's no threshold where you can confidently say the cars will behave well because you've never put them in front of real, unpaid humans on real, non-simulated roads. How could you call them safe in good conscience if you've never tested them outside of a test environment? How do you know you've controlled every last variable? You simply don't and can't.
TZubiri
> can we please call them auto-autos?
Proposal accepted.
carapace
Thank you. :)
Ylpertnodi
Literally 'self-selfs' in some languages. 'Automobile' is a clue.
carapace
In English too. :)
(The first "auto-" is self-moving as contrasted with the cart that needed an external motive (horse etc.), then the second "auto-" is self-directing with sensors and feedback systems forming a crude "mind" with a "self image" (all efficient regulators contain a model of themselves[1]). A self-motile and self-guided machine is an auto-auto-mobile. I love it.)
jaharios
Many accidents happen because humans are in a hurry, something self-driving cars don't even have a concept of.
Comparing self-driving cars with human car drivers is stupid; a fairer comparison is trains or trams, which are also less prone to accidents, and where you can't be in a hurry and make them go faster.
m463
I drove a tesla with "full self driving (supervised)" and with the default settings, the car will only go up to the speed limit. You would think this should be ok, but there are plenty of 25 mph zones where even conservative drivers will do 30 or 35 mph.
I had to take over the driving in these places because cars would pile up behind me, drivers would get impatient, and in some cases people would get angry and tailgate or try to pass where they shouldn't.
Turns out there is a FSD setting that lets the car adjust to conditions, and it will "go with the flow", even going over the speed limit, but matching the traffic and harmonizing.
Note there is a concept traffic engineers use called "the 85th percentile rule" that helps design highways by setting the speed limit for maximum compliance. Using this rule leads to the safest roads.
Some folks seem to be outraged that people don't comply with the speed limits, but when they are set artificially low, speed variance increases and roads become more dangerous. I think driving the speed limit or below in these places might amplify this.
data on speed variance vs safety (fig 4):
https://www.fhwa.dot.gov/publications/research/safety/17098/...
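As a concrete illustration of the rule (the speed sample below is invented, and numpy is assumed available): collect free-flowing speeds on the road and take the 85th percentile.

```python
# 85th percentile rule: set the limit near the speed that 85% of
# free-flowing drivers do not exceed. Sample speeds are made up.

import numpy as np

observed_mph = np.array([28, 31, 33, 34, 35, 35, 36, 37, 38, 38,
                         39, 40, 41, 42, 45, 47, 52])
p85 = np.percentile(observed_mph, 85)
print(f"85th percentile speed: {p85:.0f} mph")  # ~44 mph
```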
crote
> Note there is a concept traffic engineers use called "the 85th percentile rule" that helps design highways by setting the speed limit for maximum compliance. Using this rule leads to the safest roads.
The fundamental flaw in this concept is that it completely ignores that 1) the ideal speed depends on external circumstances and intended road use, and that 2) road design heavily influences driving speed. If you want to have a safe road network, you have to design it for a specific speed. When done properly, drivers will naturally be driving at the intended speed. Speeding doesn't happen because it doesn't feel safe to speed.
Highways have to be straight and wide, without any level crossings. People are supposed to drive at high speeds, so it is designed in such a way that it is safe to do so, and it feels safe to do so. Local access roads are narrow, twisty, have relatively poor visibility, and are equipped with things like speed bumps. You're supposed to drive at slow speeds because there are lots of level crossings and driveways, so they are intentionally designed to make high-speed driving unsafe to the point of being physically impossible.
m463
Maybe my wording is confusing, but the idea is not flawed. It is used to set the speed limit, given the road.
If you want a lower speed limit, you can just set it, but if it's below the 85th percentile speed, you'll get a less safe road.
The proper way to do it would be to adjust the characteristics of the road so that drivers slow down and the speed variance is minimized.
Same for increasing the limit. A wider or smoother or straighter road may work.
> It’s also possible that Waymo's erratic braking contributed to a few of those rear-end crashes
It's possible to confuse other drivers without technically making a mistake. The article doesn't say whether Waymos are rear-ended more often than human drivers, but it's plausible that this could be the case. Human drivers have a mental model of what other drivers will or won't do, and that model might not work well for autonomous vehicles. It's very possible that Waymos are faster to put on the brakes in unclear situations than a human driver would be, and/or that they categorize situations as dangerous that a human driver wouldn't. I'm sure human drivers will eventually adapt to the behavior of these robots sharing the road with them, but that might take a bit of time and experience.