Yeah fair treatment for billion dollar corporations and robots and all. Who could forget. Waymo is such a lovely person, why would anyone ask them to do better?
Out here in rural nowhere it doesn’t — it just gets the sheriff on the local news begging people to stop instead of solving the actual problem at hand by placing patrols on the routes.
Out here in rural nowhere it most certainly does. The school bus driver will record your number plate, and school buses have the equivalent of dashcams now.
Human drivers can be seen and stopped by police and given an appropriate punishment. Self-driving cars have nobody to take accountability like that, so you have to go back to the source.
In most large cities I've lived in, general traffic enforcement essentially only exists on that month's/quarter's designated ticket-writing day, i.e., when highway patrol and city police just write speeding tickets all day to juice revenue.
This is as close to functional as any car discussion gets: citizens reported some issues, the government is checking on it, and it's going to get fixed.
I have a question about the rules of school buses (I'm not American). It seems like the expectation is that _all_ traffic is required to stop if a bus is stopped, is that correct? If so, why?
Here (Australia) the bus just pulls over and you get off onto the sidewalk, even children. Why is that not the case in the US?
It's a long video, but the tl;dr is that Americans don't have footpaths. You would think they would, but nope; it's not like Australia, where everywhere you walk has a path, with paths down to the road.
Even directly around schools there are no footpaths, and it's all because it's no one's responsibility other than the homeowner's.
As mentioned, in a lot of suburban areas in the US where school buses are common there are no crosswalks or traffic lights (or sidewalks or physical bus stops, for that matter). Most of the time there isn't so much traffic that stopping all of it is a huge burden.
Also, there's generally an exception for divided highways - if the road has a physical median or barrier, the oncoming traffic doesn't have to stop. I assume the bus route accounts for this and drops kids off on the correct side of the road.
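To make the rule above concrete, here's a minimal sketch of the stopping decision as described in this thread; the exact conditions vary by state, and the types and names here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Road:
    has_physical_median: bool  # divided highway with a barrier or median?

def must_stop_for_school_bus(road: Road, travelling_with_bus: bool) -> bool:
    # Traffic travelling in the same direction as the bus must always stop.
    if travelling_with_bus:
        return True
    # Oncoming traffic is exempt only when a physical median divides the road.
    return not road.has_physical_median
```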
> (St)Roads where the kids have to cross a busy road to get to the other side where their house is.
That's pretty different from my experience.
Almost all the school bus stops around here are on small low-speed residential streets.
And while there are surely some stops on faster 2-lane roads...
A stroad or major road would mean 4+ lanes, which in my state means the school bus only stops traffic on one side. No kids will be crossing at those bus stops.
Ah, there might be some assumption here that I didn't realise. Typically we'd have a crosswalk or traffic light near the bus stop where you'd cross. I'm in Sydney, so I don't know of anywhere that you'd be going that fast that would also have bus stops (they may exist, I'm just not aware of them).
The kids near me (in Melbourne, about 10km outside the CBD) just take the same public transport system as everyone else. You don't see school bus systems unless you're in the far outer suburbs, a regional/rural area, or maybe some other special cases.
Growing up, our school bus stop was on a service road off a 100km/h highway, but it had good visibility in both directions and most of the kids over the other side got dropped off by their parents while they were young.
I’m also curious about school zones. The one near my house has a sign, “School”
“Speed Limit 35”
“7:00AM to 4:00PM School Days”
Now, how does a robotaxi comply with that? Does it go to the district website and look up the current school year calendar? Or does it work like a human, and simply observe the patterns of the school traffic, and assume the general school calendar?
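For illustration, a robotaxi could encode that sign as something like the sketch below, assuming it can source the district calendar from somewhere; the calendar, the dates, and the default limit here are hypothetical placeholders, not anyone's real API:

```python
import datetime

# Hypothetical stand-in for a published district calendar; keeping this
# current is the genuinely hard part the question above is asking about.
SCHOOL_DAYS = {datetime.date(2025, 9, 2), datetime.date(2025, 9, 3)}  # ...and so on

def school_zone_limit_mph(now: datetime.datetime, default_limit: int = 45) -> int:
    """Encodes the quoted sign: 'Speed Limit 35, 7:00AM to 4:00PM School Days'."""
    in_window = datetime.time(7, 0) <= now.time() <= datetime.time(16, 0)
    return 35 if (now.date() in SCHOOL_DAYS and in_window) else default_limit
```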
In NSW (Australia) that's exactly how it works. And it includes 'pupil-free' days where there are no students present. My old school even had a pedestrian bridge and barriers so that it wasn't even possible to get to the road.
It's so silly, when the obvious solution is to make school zones 40km/hr (25mi/hr) at all times, or to fix the road design. Typical speeds here are 60km/hr (40mi/hr), so anyone making the argument that it would 'slow traffic' is being dramatic.
(There is one exception that I know of - our east coast highway used to go near a school, which forced a change from 110km/hr (70mi/hr) to 40km/hr. In this case I will concede the speed is not the issue, the highway location is the issue)
> (There is one exception that I know of - our east coast highway used to go near a school, which forced a change from 110km/hr (70mi/hr) to 40km/hr. In this case I will concede the speed is not the issue, the highway location is the issue)
In the UK we have a sign saying something like "20MPH WHEN LIGHTS ARE FLASHING". During term time when pupils are entering or leaving the school (say between 8am and 9:30am, around lunchtime, and from around 3pm to 4:30pm) someone at the school switches the lights on. Usually it's one of the "lollipop men" who stand at crossing points that are not otherwise marked, and hold out a sign to stop traffic to let children cross, but often it's just programmed into some timer somewhere.
It's pretty simple.
You don't need clever software or self-driving cars, you just need to lift your right foot a little near schools.
Here is an example of one that just lights up with a 20mph limit when it's needed, from near where I grew up. Pretty high-tech for a remote part of the world, eh?
Yes, that's how it works in Alberta. It's particularly confusing because not all schools have the same academic calendar (e.g., most schools have a summer break, but a few have summer classes).
Unlike the sibling comment, there are no lights or indications of when school is in session. You must memorize the academic calendar of every school you drive past in order to know the speed limit. In practice, this means being conservative and driving more slowly in unfamiliar areas.
This is another example of something where, at least if you want to get all the way to completely correct operation, it's easier for a driverless car than a human. A person can't memorize the schedules of every school district they might pass, but an automated system potentially could. Of course something like Google Maps could solve this too, for both humans and Waymo.
Where I am, the school zone signs fold up; during the off season, they're folded and say things like drive nice; during the on season, they are unfolded and present the limit.
"Present" means present in the school. It's not always observable while driving by whether you need to obey the reduced limit or not. California does it and I find it absurd.
Many other states set up a flashing yellow light and program the light with the school schedule. Then the limit only applies "when light is flashing." Far more sensible.
This is the one most familiar to me. Usually the signs have flashing orange lights to indicate when they're active, but sometimes not. You generally know roughly when the kids are in school (maybe look at the school?), and follow what other drivers are doing. Things like this are why I think fully autonomous driving basically requires AGI.
The light and often even the sign itself are typically considered informational aids rather than strict determinants of legality. The driver is expected to comply with all the nitpicky details of the law regardless of whether the bulb is burned out or the school schedule changes.
Needless to say, most people regularly violate some kind of traffic law; we just don't enforce it.
it wasn’t at first but I suspect they received a ton of feedback and fixed it.
in my estimation the robo driver has reached a median-human level of driving skill. it still doesn’t quite know how to balance the weight of the car through turns and it sometimes gets fussy with holding lanes at night but otherwise it mimics human behaviors pretty well except where they’re illegal like rolling through the first stop at a stop sign.
Now I’m imagining the Waymo Driver calling out to Gemini to determine "school hours" by looking it up on the Internet, and wondering about the nature of life.
You are not penalized for failing to go over 35 on non-school-days. School zones are sufficiently small that the time penalty for complying on a summer weekday isn't that much of an inconvenience.
Aren’t they supposed to read signs? Otherwise they’d also ignore the overhead speed limits on the highway for traffic jams / air quality adjustments during the day.
GP is saying that reading the sign is insufficient to determine whether it is a school day. You have to either guess based on the presence of students or buses, the lights being on, etc., or you have to source the school calendar somehow.
I don't know how it is there, but here those signs near schools light up and blink during school hours (you really can't miss it). And for signs that do not, school days are pretty fixed, so it shouldn't be difficult to program... and a default of just slowing down would be just fine too.
It can be dangerous though. In my area we have roads with speed limits of 45 that drop to 25 "when children are present". My EV always assumes children are present as it has no real way to determine if they are. Driving 25 in a 45 is dangerous for many reasons.
At some point self-driving cars will need their own looser driving laws.
Perhaps allowing them to drive around school buses is not a good idea, although personally I have felt far safer biking or walking in front of a Waymo than a human. But rules few humans follow, like rolling stops, and allowing them to go 5 over, seem like a no-brainer. We have a real opportunity here to be more sensible with road rules; let's not mess it up by limiting robots to our human laws.
I cannot wait for the school bus to be a Waymo that could tell the other Waymos around that it is full of vulnerable and unpredictable little humans, and to be on the lookout.
I can't wait until every car on the road is required by law to be self driving. You could have cars with no adults in them just driving puppers around, and it can tell the other cars hey watch out I've got a couple of good pupperinos inside so watch out!
The future is gonna be awesome. I fricking love science! Once we unlock self driving car technology, we will finally be able to move people and things from one place to another. All we have to do is force everyone on the road to install a transponder in their car that allows the government to track their location at all times, and develop a massively complex computer-camera system inside of the car that phones home and controls what the car is allowed to do.
This is a great technology and has clearly made great strides, but at this time it is hard to trust. These vehicles have had many problems that human drivers do not. Problems with maps can cause dozens of them to collect in dead end alleys. They may stop on busy one lane roads. They consistently fail to react appropriately to responders and emergency situations. And even if the supposedly reliable recording and reporting work out it is not always clear who is responsible when things go wrong. Simply not killing as often as humans is not good enough for mass deployment of this technology.
"approached the school bus from an angle where the flashing lights and stop sign were not visible"
I call bullshit on that. Yes the stop sign is only on the left side but the flashing lights are on all four corners of the bus. You'd need to be approaching the side of the bus from a direct right angle to not see the flashing lights.
There has been an increase in the "aggressiveness" of autonomous cars. My earlier comment: https://news.ycombinator.com/item?id=45609139 . Maybe that aggressiveness is sold internally as some optimization enabled by the higher skills of the robot driver.
San Francisco is the crucible (by US standards) of dealing with pedestrians, and I'm still shocked they launched there so early. But with something as distinct and vulnerable as school buses, it's time to think about hardware installation so automated vehicles can "see" farther ahead.
To be honest, I think this is one of the strengths of autonomous cars.
With humans, when they do this, at most we can punish that individual. To increase population-wide compliance we can run a safety awareness campaign, ramp up enforcement, or raise the fines. But all of these cost a lot of money, take a while to have an effect, need to be repeated or kept up, and only help statistically.
With a robot driver we can develop a fix and roll it out on all of them. Problem solved. They were doing the wrong thing, now they are doing the right thing. If we add a regression test we can even make sure that the problem won't be reintroduced in the future. Try to do that with human drivers.
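For instance, a regression test for this school-bus incident might look like the sketch below; `load_scenario`, `replay`, and `current_driver_policy` stand in for whatever scenario-replay harness the operator uses and are invented names, not a real API:

```python
def test_remains_stopped_while_bus_flashers_active():
    # Replay the recorded incident against the current driving policy.
    scenario = load_scenario("school_bus_flashers_side_street_approach")
    trace = replay(current_driver_policy, scenario)
    # The fix is only locked in if the car holds a full stop for as long
    # as the bus's stop sign is extended and its lights are flashing.
    for state in trace:
        if state.bus_flashers_active:
            assert state.ego_speed_mps == 0.0
```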
Road design plays an important role as well; it's not just about enforcing the law.
Some roads are going to be safer simply because drivers don't feel safe driving fast. Others are safer simply because there are fewer opportunities to get into a collision.
Wide streets in cities, for example, encourage faster driving, which doesn't really save a lot of time while making the streets more dangerous.
> we can develop a fix and roll it out on all of them.
You have to know what you're fixing first. You're going to write a lot of code in blood this way.
It's not that people are particularly bad at driving; it's that the road is exceptionally dynamic, with many different users and use cases all trying to operate in a synchronized fashion, with a dash of strong regulation sprinkled in.
> You have to know what you're fixing first.
In this case the expected behaviour is clearly spelled out in the law.
> You're going to write a lot of code in blood this way.
Do note that in this case nobody died or got hurt. People observed that the autonomous vehicles did not follow the rules, the company was notified of this fact, and they are working on a fix. No blood was spilled to achieve this result.
Also note that we spill much blood on our roads already. And we do that without much of any hope of learning from individual accidents. When George runs over John there is no way to turn that into a lesson for all drivers. There is no way to understand what went wrong in George’s head, and then there is no way to adjust all driver’s heads so that particular problem won’t happen again.
And "much blood" is (globally) to the tune of ~1.2 million lives lost, and many more injuries.
Compared to that, autonomous vehicles have barely harmed anyone. Also they will probably save most of those lives once they become good.
The "least harm" approach is to scale autonomous vehicles as quickly as possible even if they do have accidents sometimes.
That's true at least once they surpass human drivers in collisions per driver mile under equivalent conditions.
It seems like we're pretty close to that point, but the numbers need to be treated with care for various reasons. (Robotaxis aren't dealing with the same proportions of conditions - city vs suburban vs freeway - and we should probably exclude collisions caused by human bad-actors which should have fallen within the remit of law enforcement - drink/drugs, grossly excessive speed and so on).
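As a toy example of the kind of adjustment needed (all numbers below are invented placeholders, not real Waymo or NHTSA figures):

```python
# Hypothetical per-road-type human crash rates (crashes per million miles)
# and a robotaxi's mileage mix. The point is the reweighting, not the values.
human_rate = {"city": 4.0, "suburban": 2.0, "freeway": 1.0}
robotaxi_mix = {"city": 0.7, "suburban": 0.3, "freeway": 0.0}

# The fair baseline: what humans would do if they drove the robotaxi's mix.
expected_human_rate = sum(human_rate[r] * share for r, share in robotaxi_mix.items())
print(expected_human_rate)  # 3.4, vs. the naive all-roads human average
```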
Why should we exclude the cases of human bad-actors? That's explicitly a major case solved by getting rid of the human behind the wheel...
I don't think we are better off putting Elon Musk behind every wheel.
Good thing no one is suggesting that
I was a bit hyperbolic, but having Teslas steer by wire with remote code execution is close enough to an Elon Musk behind every wheel. What was the name of the movie, "Leave the World Behind"?
Not sure about a movie but that reminded me of the "Driver" short story in the "Valuable Humans In Transit and Other Stories" tome by QNTM (https://qntm.org/vhitaos).
I'd recommend buying the book, but here's an early draft of that particular story:
https://qntm.org/frame
This is a tradeoff, in which the original case might have been the less dangerous one.
Autonomous fleets have a major potential flaw too, in the form of a malicious hacker gaining control over multiple vehicles at once and wreaking havoc.
Imagine if every model XY suddenly got a malicious OTA update and started actively chasing pedestrians.
There are ways, but our individualistic, consumerist, convenience-first society is reluctant to implement them - as, same as gun control, they're incompatible with certain notions of freedom.
> You have to know what you're fixing first. You're going to write a lot of code in blood this way.
This is exactly how the aviation industry works, and it's one of the safest ways to travel in the world. Autonomous driving enables 'identify problem -> widely deployed and followed solutions' in a way human drivers just can't. Things won't be perfect at first but there's an upper limit on safety with human drivers that autonomous driving is capable of reaching past.
It's tragic, but people die on roads every day; all that changes is that accountability gets muddier, and there's a chance things might improve every time something goes wrong.
Planes maintain vertical and lateral separation from literally everything. Autonomy is easier in relatively controlled environments; navigating streets is more unlike flying than it is similar.
Also, humans will intentionally act counter to regulations just to be contrarian or to send a message. Look at “rolling coal”, or people who race past radar speed signs to see if they can get a big number. Or, recently near me, they converted a lane into a dedicated bus lane, which is now a “drive fast to pass every rule follower” lane.
For some reason law enforcement seems particularly reluctant to deal with this kind of over-the-top dumbfuckery when it involves automobiles.
If you try something equivalent with building regs or tax authorities, they will come for you. Presumably because the coal-rolling dumbasses are drawn from the same social milieu as cops.
But you still don't have autonomous flying, even though the case is much simpler than driving: take off, ascend, cruise, land.
It isn't easy to fix autonomous driving, and not because the problem isn't identified. Sometimes two conflicting scenarios can happen on the road where, no matter how good the autonomous system is, it won't be enough.
Though I agree that having a different kind of human instead will not make it any safer.
> But you still don't have autonomous flying, even though the case is much simpler than driving: take off, ascend, cruise, land.
Flying is actually a lot more complicated than just driving. When you're driving you can "just come to a stop". When you're flying... you can't. And a hell of a lot can go wrong.
In any case, we do have autonomous flying. They're called drones. There are even prototypes that ferry humans around.
> When you're driving you can "just come to a stop". When you're flying... you can't
Would note that this is the same issue that made autonomous freeway driving so difficult.
When we solve one, we'll solve the other. And it increasingly looks like they'll both be solved in the next half decade.
Being unable to abort a flight with a moment's notice does add complication, but not so much that flying is "a lot more complicated" than driving. The baseline for cars is very hard. And cars also face significant trouble when stopping. A hell of a lot can go wrong with either.
It was a bit unclear from my statement before, but that's the point: something that feels easy is actually much more complicated than that, with weather, runway condition, plane condition, wind speed and direction, ongoing incidents at the airport, etc. Managing all those scenarios is not easy.
Similar things also apply in driving, especially with obstacles and emergencies: floods, the recent sinkhole in Bangkok, and so on.
Flying is the “easy” part. There’s a lot more wood behind the arrow for a safe flight. The pilot is (an important) part of an integrated system. The aviation industry looks at everything from the pilot to the supplier of lightbulbs.
With a car, deferred or shoddy maintenance is highly probable and low impact. With an aircraft, if a mechanic torques a bolt wrong, 400 people are dead.
> You're going to write a lot of code in blood this way.
Waymo has been doing a lot of driving, without any blood. They seem to be using a combination of (a) learning a lot from close calls like this one, where no one was hurt even though the car still behaved incorrectly, and (b) being cautious, so that even when a car does something it shouldn't, the risk is very low because it's moving slowly.
Waymo operates in a very limited scope and area. I would not attempt to extrapolate anything from their current performance.
I absolutely would, since operating in a slowly growing limited scope and area is a part of the safety strategy.
> Waymo operates in a very limited scope and area. I would not attempt to extrapolate anything from their current performance.
This is less and less true every year. Yes, it doesn't drive in the snow yet; no, I don't drive in the snow either; I'm OK with that.
Very limited scope and area is now the whole of a few major cities.
https://support.google.com/waymo/answer/9059119?authuser=1
This is actually the one technology I am excited about. Especially with the Zoox/minibus/carpool model, I can see these things replacing personal cars entirely, which is going to be a godsend for cost, safety, and traffic.
If you were trying to evaluate that code deployed willy nilly in the wider world, sure. But that code exists within a framework which is deliberately limiting rollout in order to reduce risk. What matters is the performance of the combined code and risk management framework, which has proven to be quite good.
Airbus A320s wouldn’t be very safe if we let Joe Schmo off the street fly them however he likes, but we don’t. An A320 piloted within a regulated commercial aviation regime is very safe.
What matters is the safety of the entire system including the non-technological parts.
I'm just curious to see how they handle highways more broadly, which is where the real danger is and where Tesla got in trouble in the early days. Waymo avoided doing that until late last year, and even then it's a very controlled freeway test in Phoenix, not random highways.
https://waymo.com/blog/2024/01/from-surface-streets-to-freew...
Highways are pretty safe. The road is designed from start to finish to minimise the harm from collisions. That’s not true of urban streets
Waymo operates in San Francisco, Phoenix, Los Angeles, Austin, and Atlanta, so I am sure they have encountered school buses by now and learned from those encounters.
The human traffic code is also written in blood. But humans are worse at applying the patch universally.
We don't even try. In the US you demonstrate that you know the rules at one point in time and that's it, as long as you never get a DUI you're good.
For instance, the 2003 California Driver's Handbook[1] first introduced the concept of "bike lanes" to driver education, but contains the advice "You may park in the bike lane unless signs say “NO PARKING.”" which is now illegal. Anyone who took their test in the early 2000s is likely unaware that changed.
It also lacks any instruction whatsoever on common modern roadway features like roundabouts or shark teeth yield lines, but we still consider drivers who only ever studied this book over 20 years ago to be qualified on modern roads.
1. https://dn720706.ca.archive.org/0/items/B-001-001-944/B-001-...
> Anyone who took their test in the early 2000s is likely unaware that changed.
That's silly. People become aware of new laws all the time without having to attend a training course or read an updated handbook.
I took the CA driver's written test for the first time in 2004 when I moved here from another state. I don't recall whether or not there was anything in the handbook about bike lanes, but I certainly found out independently when it became illegal to park in one.
Some places will dismiss a traffic ticket if you attend a driver's education class to get updates, though you can only do this once every few years. So at least there have been some attempts to get people to update their learning.
This only happens if you get a traffic ticket, which is rare and getting rarer.
Ironically this means the people with the cleanest driving record are least likely to know the current ruleset.
Which, ironically, would mean that knowing the current rule set is not needed to drive safely.
> You're going to write a lot of code in blood this way.
Maybe? In this particular case, it sounds like no one was injured, and even though the Waymos didn't follow the law around stopping for school buses, they exercised care when passing them. Not great, certainly! But I'd wager a hell of a lot better than a human driver intentionally performing the same violation. And presumably the problem will be fixed with the next update to the cars' software. So... fixed, and no blood.
I haven't dealt with a school bus in... maybe 20 years, and it would definitely be an exception if I had to deal with one tomorrow. I kind of know what I should do, but it isn't instinct at this point.
A Waymo, even if it drove for 20 years in urban Seattle, where school buses aren't common, would know what to do if presented with the exception tomorrow (assuming it was trained/programmed correctly); it wouldn't forget.
> With a robot driver we can develop a fix and roll it out on all of them. Problem solved.
I find that extremely optimistic. It's almost as if you've never developed software.
I am curious about Waymo's testing. Even "adding a regression test" can't be simple. There is no well defined set of conditions and outputs.
> Try to do that with human drivers.
At least where I live, the number of cars and car-based trips keeps increasing, but the number of traffic deaths keeps falling.
I disagree about the fixing, because ultimately self-driving services will have the political power to cap their liability. Once they dial in the costs and become scaled, self-sustaining operations, the incentive will be to reduce opex.
I think the net improvements will come from the quantitative aspect of lots and lots of video. We don't have good facts about these friction points on the road and rely on anecdotal information, police data (which sucks), and time/motion-style studies.
> ultimately self driving services will have political power to cap their liability
You're fighting an objectively safer future on the basis of a hypothetical?
Also, we already have capped liability with driving: uninsured and underinsured drivers.
Even if we had good data, the major problem in the US is that the funding liabilities of transportation agencies generally massively outweigh revenues, particularly if legislators keep earmarking already limited funds for yet more road expansion in their districts.
Unless there is one car that everyone drives, it will never be this easy.
And if there is one car that everyone drives, it's equally easy for a single bug to harm people on a scale that's inconceivable to me.
Well, maybe not "this easy", but if we can all agree on an extensive test suite that all autonomous cars have to follow to be allowed on the road, it'd be almost like that, without the risk of a single bug taking down all of them.
… assuming the GiantCorp running the robotaxis cares about complying with the law, and doesn’t just pay a fine that means nothing to them.
The first fines should be meaningless to the company. If the issue isn't fixed, the fines should get higher and higher. If the company fixes one issue but a second is discovered quickly, we should assume they don't care about safety, and the second issue should carry a higher fine than the first, even though it is unrelated.
Companies (and people) have an obligation to do the right thing.
>The first fines should be meaningless to the company.
Why?
Because 100 million dollars isn't a reasonable fee for a traffic violation.
Fines are completely useless if they are small enough to be considered "the price of doing business".
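One way to formalize the escalation being proposed here, with placeholder numbers rather than anything from a real statute:

```python
def fine_usd(base: int, weeks_unfixed: int, prior_issues: int) -> int:
    # Starts small, doubles every week the issue stays unfixed, and starts
    # higher for each previously discovered issue, related or not.
    return base * (2 ** weeks_unfixed) * (prior_issues + 1)

fine_usd(1_000, weeks_unfixed=0, prior_issues=0)   # 1000: a "meaningless" first fine
fine_usd(1_000, weeks_unfixed=10, prior_issues=2)  # 3072000: now it has teeth
```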
What do you mean by "second issue"? A second instance of the same underlying problem, or a different underlying problem? The way you phrase it as unrelated suggests the latter to me.
It's pretty wild to jump straight to "they don't care about safety" here. Building a perfect system without real world testing is impossible, for exactly the same reason it's impossible to write bug-free code on the first try. That's not a suggestion to be lax, just that we need to be realistic about what's achievable if we agree that some form of this technology could be beneficial.
The courts get to decide that. Often it is a case of "I know it when I see it". The real question is whether they did enough to fix all possible safety issues before this new, different one happened. If they did "enough" (something I'm not defining!) then they can start over.
Waymo seems more interested in delivering a true solution than I have seen elsewhere.
The discourse around “corporations” has gotten absolutely ridiculous at this point, especially on this website.
It’s not an unreasonable take given historic behavior. Rather than decrying the cynicism, what steps can we take to ensure companies like Tesla/Waymo/etc are held accountable and incentivized to prioritize safety?
Do we need harsher fines? Give auto regulators as much teeth as the FAA used to have during accident investigations?
Genuinely curious to see how addressing reasonable concerns in these areas can be done.
Why isn't allowing people to sue when they get hurt and general bad PR around safety enough? Did you see what happened to Boeing's stock price after those 737 crashes?
I’d counter that with the Equifax breach, where the stock price rose when it became clear they weren’t being fined into oblivion. Suing is also generally only a realistic option if you have money for a lawyer.
Right. We have a precedent for how to have a ridiculously safe transportation system: accidents are investigated by the NTSB, and every accident is treated as an opportunity to make sure that particular failure never happens again.
South Park had a good satire on this sort of generic anti-corporation comment. Paraphrasing:
"Corporations are bad"
"Why?"
"Because, you know, they act all corporate-y."
https://www.tiktok.com/@plutotvuk/video/7311643257383963937 (sorry, Google's first result was TikTok)
It's ironic given this forum began as a place for aspiring startup (Delaware C-Corp) founders.
Some of us came here because we were finding programming.reddit.com too mainstream (after all, this thing was written in Arc!, of which almost no one knew any details for sure, but it was a Lisp, so it was cool); we certainly weren't visiting this place in order to become millionaires.
Even though I agree, there was a time and a place (I'd say 2008-2010) when this forum was mostly populated by "I want to get rich!" people. Maybe that is still the case and they've only learned to hide it better; I wouldn't know.
I agree; many of the people here are still way too lenient when it comes to big corps.
We feared the advent of LLMs since they could be used as convincing spam tools. Little did we know that humans would often do the same.
> a fine that means nothing to them
Yes, this is often the case. In this instance, though, endangering children is just about the worst PR possible. That's strong leverage.
This^^^ -- the impact of positive vs negative PR is unusually huge with this type of tech.
It's a strength if you catch the bug and fix it before it injures anyone. If anything, this proves edge cases can take years to manifest.
Why accept the company's say so without any proof being offered or even a description of the fix? If it's been years and this kind of thing, described in regulations so clearly some attention was paid by engineers, still happens, then maybe fixing it isn't trivial.
Sure, perhaps we shouldn't accept the company's say-so, but this seems like a fairly easy thing for a third party to verify. If that's not being done, that's not Waymo's fault; lobby the local regulatory body or legislature to get that sort of thing required.
Maybe verifying isn't trivial either? Sometimes bugs only appear with a lot of interactions.
As a counterpoint, a large fine or jail time as a deterrent actually has meaning to an individual.
For a company, it's a financial calculation.
https://en.wikipedia.org/wiki/Grimshaw_v._Ford_Motor_Co.
(Add the period to the end of the link, HN won't do it)
https://en.wikipedia.org/wiki/Grimshaw_v._Ford_Motor_Co%2E
What a dystopian view.
There's a video of the actual incident.[1] (Yahoo posted some file photo). The Waymo was entering from a side street, in front of the school bus. It clearly recognized that it was in an iffy situation and slowed to creeping speed, rather than blocking the intersection. No children are visible.
If the school bus has a dashcam, much better info may be available. This video starts too late.
[1] https://www.youtube.com/watch?v=uSjwolFxvpc
I don't get it. It looks like it just made a left turn in front of a stopped school bus? That's illegal?
In any case it seems like a tiny issue. Illegal or not, it didn't do anything dangerous.
The point of a bus having its lights flashing and the stop sign extended is that kids could be coming or going from any direction, especially when least expected. It's certainly a minor issue until the worst-case scenario happens.
Lots of people make this mistake around school buses. It's probably time for a different system if we are worried about children's safety.
We have some nice initiatives here.
Either completely removing cars from streets near schools, or blocking cars when children are coming or leaving school.
https://fr.wikipedia.org/wiki/Rues_aux_%C3%A9coles_%C3%A0_Pa...
Sounds more like a testing problem to me. Honestly I can't even remember if this particular rule was on the license exam when I took it. I know it because I put more care into remembering driving laws but many people don't.
This implies the absence of a school bus with flashing lights means kids can't be coming and going from any direction when least expected. It's a horrible solution and just another example of reducing drivers' responsibility on the road and effectively making it the victim's fault for being there.
The Waymo is going to be on high alert at all times, regardless of any flashing lights or stop signs.
haha great proof that humans don't follow the law all the time just like Waymo.
Yes, if you see a school bus with its flashers on, you may not pass it. Period.
That depends on how many lanes there are and if there is a median.
And even then: the speed of the sensors and the computing power would outperform a human if a child were sprinting out from behind the bus.
> That's illegal?
The school bus' stop sign was extended and had red lights flashing. With the proximity to the intersection, it's most appropriately treated as an all-way stop.
Regardless of whether the bus' stop sign applies to cross streets, at some point in the turn the car is now in parallel with the bus, and the sign would apply at that point.
Also, you're blind to anyone who may be approaching the bus from the opposite side of the intersection.
On net, Waymos are safer than human drivers. Really all that matters is deaths per passenger mile and, weighted far less, injuries and crashes per passenger mile.
Waymos exceed human drivers on both metrics, thus it is reasonable to say that Waymos have reduced crashes compared to the equivalent average human driver covering the same distance.
Mistakes like this are very rare, and when they do happen, they can be audited, analyzed with thousands of metrics and exact replays, patched, and the improved model running the Waymo is distributed to all cars on the road.
There is no equivalent in humans. There are millions of human drivers currently driving who drive distracted, drunk, recklessly, or aggressively. Every one of them who is replaced with a Waymo is potentially many lives saved.
Approximately 1 in 100 deaths in the US is due to a car crash. Every year autonomous drivers aren't rapidly deployed is just unnecessary deaths.
> Approximately 1 in 100 deaths in the US is due to a car crash. Every year autonomous drivers aren't rapidly deployed is just unnecessary deaths.
You could improve driver training. American drivers are absolutely terrifying.
I'm not complaining, but like... maybe also do this for the vast majority of human drivers who also flout these rules.
I mean, we do. The problem is that you need to be physically present to catch and deal with those people, and you can only really deal with one party (others will do their thing while a police officer is dealing with the first driver they stop). Not to mention that drivers change their behavior if they see the police around, so it's harder to catch them in the act. So for a variety of reasons it's harder to solve the human driver problem.
With how cheap high definition cameras are, I don’t see why society needs a person to be physically present.
Yeah, so as someone who lives on a busy road with daily visibility into how many people flout the law, I basically did this to force the city to make changes to the street. There really isn't much you can do about the folks who break the law and drive away, but high-def video of daily shenanigans is great ammo for other types of solutions that force drivers into making better decisions.
I prefer the risk of death over constant surveillance
If only we could live in completely separate jurisdictions.
Probably the minority on HN, but I don't. I think traffic enforcement cameras are good and should be expanded.
If you want to operate a deadly vehicle in public you need to compromise, sorry.
Sounds like you're the one who needs to compromise because most people agree with me, or we would already have such a system.
Sounds like you're confused about the world you live in if you believe there aren't millions of cameras, many with ALPR capabilities, pointed at the street already.
I propose they be made actually useful instead of merely surveillance for surveillance's sake, but I can see how that would feel oppressive to drivers accustomed to getting away with murder.
A) everyone is already constantly surveilled via mobile networks and license plate readers, so surveillance is a moot point. We might as well get something out of it.
B) the system can be set up to purge and/or record only at relevant times or during infractions
I don't get why cities don't just put up a couple of drones.
“Wasn’t me in the car”
"Don't care, car is registered to you, pay up"
This is only an issue because traffic code violations are treated like criminal acts instead of... code violations. We don't have this issue with parking tickets, there's no reason we should have it with automated red light and school bus cameras.
Hence the high-definition camera. Most states already make tinted windshields and dark tints on front windows illegal. Also, the license plate is all that is needed: ticket the owner and they will readily give up the driver.
Other countries have no issues with camera based traffic law enforcement.
At least in SoCal, camera-based traffic enforcement has basically no teeth, and there are plenty of ways, like my quote, to weasel out. You can actually ignore the ticket that is mailed to you; I'm not even sure HD cameras would help here. You even have options built into the ticket to say it wasn't you driving, or that it was someone else you know of, in a check-a-couple-boxes-and-mail-it-back fashion. However, if you actually look up the status of your ticket with the ticket number on the web portal, then it counts as being served and you do have to pay or show up in court.
It seems the way the law works, it needs some piece of two-way communication; it doesn't work on a one-way basis like it might in other countries. Maybe that is because most of our laws concerning technology are still structured for an analog world. E.g., in this case the old ritual of being identified as having acknowledged the ticket, with the cop writing it and handing it to you, is preserved by you having to show you've actually received the ticket and consent to its validity by viewing its status online.
Yeah, fair treatment for billion-dollar corporations and robots and all. Who could forget. Waymo is such a lovely person, why would anyone ask them to do better?
I'm going to call bullshit on this. Most human drivers do not flout these rules.
No kidding. Try doing this once or twice and the driver will record your information and you’ll get a nice visit from the police.
Out here in rural nowhere it doesn't; it just gets the sheriff on the local news[1] begging people to stop, instead of solving the actual problem at hand by placing patrols on the routes.
[1] https://www.wwnytv.com/2025/02/12/absolutely-terrifying-grow...
Out here in rural nowhere it most certainly does. The school bus driver will record your number plate, and school buses have the equivalent of dashcams now.
I think maybe they meant that the majority of vehicles that flout the rules are human-driven.
Human drivers can be seen and stopped by police and given an appropriate punishment. Self-driving cars have nobody to take accountability like that so you need to go back to the source.
Yeah, but in many cases they're not; traffic enforcement went way down during Covid and it's still down.
In most large cities I've lived in, general traffic enforcement essentially only exists on that month's or quarter's designated ticket-writing day, i.e., when highway patrol and city police just write speeding tickets all day to juice revenue.
I don't know what large cities you've lived in, but that's not what any experts seem to be saying in any piece I've ever read. https://www.nytimes.com/interactive/2024/07/29/upshot/traffi...
Their license to operate can be taken away, which is what happened to Cruise.
This is as close to functional as any car discussion gets: citizens reported some issues, the government is checking on it, and it's going to get fixed.
I have a question about the rules of school buses (I'm not American). It seems like the expectation is that _all_ traffic is required to stop if a bus is stopped, is that correct? If so, why?
Here (Australia) the bus just pulls over and you get off onto the sidewalk, even children. Why is it not the case in the US?
I'm an Australian also, this is the video that blew my mind.
https://youtu.be/lShDhGn5e5s
It's a long video but the tl;dr is that Americans don't have footpaths. You would think they would, but nope; it's not like Australia, where everywhere you walk has a path, with paths down to the road.
Even directly around schools there are no footpaths, and it's all because it's no one's responsibility other than the homeowner's.
As mentioned, in a lot of suburban areas in the US where school buses are common there are no crosswalks or traffic lights (or sidewalks or physical bus stops, for that matter). Most of the time there isn't so much traffic that stopping all of it is a huge burden.
Also, there's generally an exception for divided highways - if the road has a physical median or barrier, the oncoming traffic doesn't have to stop. I assume the bus route accounts for this and drops kids off on the correct side of the road.
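Since the rule has crisp conditions, it's the sort of thing a robot driver can encode directly. A minimal, purely illustrative sketch, with hypothetical names and the obvious caveat that real planner code is nothing this simple and the exact rule varies by state:

    # Toy encoding of the school-bus stop rule described above.
    # All names are hypothetical; state laws differ in the details.
    def must_stop_for_school_bus(lights_flashing: bool,
                                 oncoming: bool,
                                 physical_median: bool) -> bool:
        if not lights_flashing:
            return False   # no flashers: normal right-of-way rules apply
        if not oncoming:
            return True    # same side as the bus: always stop
        # Oncoming traffic is typically exempt only when a physical
        # median or barrier divides the road.
        return not physical_median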
(St)Roads where the kids have to cross a busy road to get to the other side where their house is.
In my case, a rural highway where traffic goes 55mph.
It's better to stop all traffic than to force kids to figure out how to frogger through traffic.
> (St)Roads where the kids have to cross a busy road to get to the other side where their house is.
That's pretty different from my experience.
Almost all the school bus stops around here are on small low-speed residential streets.
And while there are surely some stops on faster 2-lane roads...
A stroad or major road would mean 4+ lanes, which in my state means the school bus only stops traffic on one side. No kids will be crossing at those bus stops.
Ah, there might be some assumption here that I didn't realise. Typically we'd have a crosswalk or traffic light near the bus stop where you'd cross. I'm in Sydney, so I don't know of anywhere that you'd be going that fast that would also have bus stops (they may exist, I'm just not aware of them).
The kids near me (in Melbourne, about 10km outside the CBD) just take the same public transport system as everyone else. You don't see school bus systems unless you're in the far outer suburbs, a regional/rural area, or maybe some other special cases.
Growing up, our school bus stop was on a service road off a 100km/h highway, but it had good visibility in both directions and most of the kids over the other side got dropped off by their parents while they were young.
Rural areas especially, but most small towns in the US don't have crosswalks.
The closest crosswalk to my bus stop as a kid was about 45 miles away.
I’m also curious about school zones. The one near my house has a sign, “School” “Speed Limit 35” “7:00AM to 4:00PM School Days”
Now, how does a robotaxi comply with that? Does it go to the district website and look up the current school year calendar? Or does it work like a human, and simply observe the patterns of the school traffic, and assume the general school calendar?
I suspect it continues in Mad Max mode.
Wait, how does that work? Every person in your city needs to know the exact calendar of that school?
In NSW (Australia) that's exactly how it works. And it includes 'pupil-free' days where there are no students present. My old school even had a pedestrian bridge and barriers so that it wasn't even possible to get to the road.
It's so silly, when the obvious solution is to make school zones 40km/hr (25mi/hr) at all times, or to fix the road design. Typical speeds here are 60km/hr (40mi/hr), so anyone making the argument that it would 'slow traffic' is being dramatic.
(There is one exception that I know of - our east coast highway used to go near a school, which forced a change from 110km/hr (70mi/hr) to 40km/hr. In this case I will concede the speed is not the issue, the highway location is the issue)
> (There is one exception that I know of - our east coast highway used to go near a school, which forced a change from 110km/hr (70mi/hr) to 40km/hr. In this case I will concede the speed is not the issue, the highway location is the issue)
They couldn't just put up a fence?
In Victoria there is usually (not certain if it's always) a changeable sign and flashing lights if it's reduced to 40.
In the UK we have a sign saying something like "20MPH WHEN LIGHTS ARE FLASHING". During term time when pupils are entering or leaving the school (say between 8am and 9:30am, around lunchtime, and from around 3pm to 4:30pm) someone at the school switches the lights on. Usually it's one of the "lollipop men" who stand at crossing points that are not otherwise marked, and hold out a sign to stop traffic to let children cross, but often it's just programmed into some timer somewhere.
It's pretty simple.
You don't need clever software or self-driving cars, you just need to lift your right foot a little near schools.
https://maps.app.goo.gl/34QgN2KTQmGML2Ae8
Here is an example of one that just lights up with a 20mph limit when it's needed, from near where I grew up. Pretty high-tech for a remote part of the world, eh?
Ok, I get it. Here we just have a 30kph limit with speed bumps at all times.
Yes, that's how it works in Alberta. It's particularly confusing because not all schools have the same academic calendar (e.g., most schools have a summer break, but a few have summer classes).
Unlike the sibling comment, there are no lights or indications of when school is in session. You must memorize the academic calendar of every school you drive past in order to know the speed limit. In practice, this means being conservative and driving more slowly in unfamiliar areas.
This is another example of something where, at least if you want to get all the way to completely correct operation, it's easier for a driverless car than for a human. A person can't memorize the schedules of every school district they might pass, but an automated system potentially could. Of course something like Google Maps could solve this too, for both humans and Waymo.
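As a sketch of what that could look like, assuming some machine-readable district calendar exists (the dates and the 7:00AM to 4:00PM window below are made up for illustration):

    from datetime import datetime, date, time

    # Hypothetical per-zone data a map provider might distribute.
    school_days = {date(2025, 12, 1), date(2025, 12, 2)}  # ...and so on
    window_start, window_end = time(7, 0), time(16, 0)    # 7:00AM-4:00PM

    def reduced_limit_applies(now: datetime) -> bool:
        return (now.date() in school_days
                and window_start <= now.time() <= window_end)

    # e.g. reduced_limit_applies(datetime(2025, 12, 1, 8, 30)) -> True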
Where I am, the school zone signs fold up; during the off season, they're folded and say things like drive nice; during the on season, they are unfolded and present the limit.
In the area I live, the wording is frequently "when children present" so you don't need to know school schedule.
"Present" means present in the school. It's not always observable while driving by whether you need to obey the reduced limit or not. California does it and I find it absurd.
Many other states set up a flashing yellow light and program the light with the school schedule. Then the limit only applies "when light is flashing." Far more sensible.
This is the one most familiar to me. Usually the signs have flashing orange lights to indicate when they're active, but sometimes not. You generally know roughly when the kids are in school (maybe look at the school?), and follow what other drivers are doing. Things like this are why I think fully autonomous driving basically requires AGI.
The sign says the hours for the reduced speed limit or, more commonly in my experience, has a light that activates during school hours.
The light and often even the sign itself are typically considered informational aids rather than strict determinants of legality. The driver is expected to comply with all the nitpicky details of the law regardless of whether the bulb is burned out or the school schedule changes.
Needless to say, most people regularly violate some kind of traffic law; we just don't enforce it.
Of course. I'm confident slowing down near a school is pretty intuitive for the vast majority of drivers, though.
Sure, but the context here is a discussion about how a computer can know all of these "intuitive" rules humans follow.
The answer is encoded in the map data in this case, but it's an interesting category of problems for autonomous vehicles.
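One way that encoding might look, sketched here with invented field names: store both limits in the map and fall back to the conservative one whenever the schedule can't be resolved, i.e. "just slow down if you're not sure":

    # Illustrative only: a time-conditioned limit in map data,
    # with a conservative fallback. Field names are invented.
    zone = {"base_limit_mph": 45, "school_limit_mph": 25}

    def effective_limit(zone: dict, schedule_known: bool,
                        school_in_session: bool) -> int:
        # If the school schedule can't be resolved, assume the lower
        # limit rather than risk speeding through an active zone.
        if not schedule_known or school_in_session:
            return zone["school_limit_mph"]
        return zone["base_limit_mph"]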
Have you had the experience of riding in a Waymo making a left hand turn against incoming traffic -- and how it handles the eventual yellow light?
I was very impressed about the decision making in this situation. Seems very intuitive (at least superficially).
It wasn't at first, but I suspect they received a ton of feedback and fixed it.
In my estimation the robo driver has reached a median-human level of driving skill. It still doesn't quite know how to balance the weight of the car through turns, and it sometimes gets fussy with holding lanes at night, but otherwise it mimics human behaviors pretty well, except where they're illegal, like rolling through the first stop at a stop sign.
Now I’m imagining the Waymo Driver calling out to Gemini to determine "school hours" by looking it up on the Internet, and wondering about the nature of life.
You are not penalized for failing to go over 35 on non-school-days. School zones are sufficiently small that the time penalty for complying on a summer weekday isn't that much of an inconvenience.
You can always just slow down for 30 seconds if you're not sure.
Aren’t they supposed to read signs? Otherwise they’d also ignore the overhead speed limits on the highway for traffic jams / air quality adjustments during the day.
GP is saying that reading the sign is insufficient to determine whether it is a school day. You have to either guess based on the presence of students or busses, the lights being on, etc., or you have to source the school calendar somehow.
I don't know how it is there, but here those signs near schools light up and blink during school hours (really can't miss it). And for signs that do not, I think school days are pretty fixed, shouldn't be difficult to program... and a default of just slowing down would be just fine too.
They should just always observe the lower speed limit.
The difference is usually 5 or maybe 10 mph.
Which over the distance of a school zone is nothing.
It can be dangerous though. In my area we have roads with speed limits of 45 that drop to 25 "when children are present". My EV always assumes children are present as it has no real way to determine if they are. Driving 25 in a 45 is dangerous for many reasons.
Neither do you; the lower speed limit applies when children are present inside the school.
A building that looks the same with or without children inside of it.
Are school days ever Sundays? If not, perhaps all drivers just treat every non-Sunday as a school day. If so, they probably just slow down every day.
At some point self-driving cars will need their own looser driving laws.
Perhaps allowing them to drive around school buses is not a good idea, although personally I have felt far safer biking or walking in front of a Waymo than a human driver. But relaxing rules few humans follow anyway, like rolling stops, and allowing them to go 5 over seem like no-brainers. We have a real opportunity here to be more sensible with road rules; let's not mess it up by limiting robots to our human laws.
What do we have to gain by allowing self driving vehicles to roll through stop signs?
I cannot wait for the school bus to be a Waymo that can tell the other Waymos around it that it's full of vulnerable and unpredictable little humans, and to be on the lookout.
I can't wait until every car on the road is required by law to be self driving. You could have cars with no adults in them just driving puppers around, and it can tell the other cars hey watch out I've got a couple of good pupperinos inside so watch out!
The future is gonna be awesome. I fricking love science! Once we unlock self driving car technology, we will finally be able to move people and things from one place to another. All we have to do is force everyone on the road to install a transponder in their car that allows the government to track their location at all times, and develop a massively complex computer-camera system inside of the car that phones home and controls what the car is allowed to do.
This is a great technology that has clearly made great strides, but at this time it is hard to trust. These vehicles have had many problems that human drivers do not. Problems with maps can cause dozens of them to collect in dead-end alleys. They may stop on busy one-lane roads. They consistently fail to react appropriately to responders and emergency situations. And even if the supposedly reliable recording and reporting work out, it is not always clear who is responsible when things go wrong. Simply not killing as often as humans is not good enough for mass deployment of this technology.
"approached the school bus from an angle where the flashing lights and stop sign were not visible"
I call bullshit on that. Yes, the stop sign is only on the left side, but the flashing lights are on all four corners of the bus. You'd need to be approaching the side of the bus from a direct right angle not to see the flashing lights.
There has been an increase in the "aggressiveness" of autonomous cars. My earlier comment: https://news.ycombinator.com/item?id=45609139 . Maybe that aggressiveness is sold internally as an optimization enabled by the higher skills of the robot driver.
San Francisco is the crucible (by US standards) of dealing with pedestrians, and I'm still shocked they launched there so early. But with something as distinct and vulnerable as school buses, it's time to think about hardware installation so automated vehicles can "see" farther ahead.
I'm sure it's only a matter of time before the tax payers get to subsidize these hardware installations instead of our own public transit.
Off-topic... what poor writing:
> a Waymo did not remain stationary when approaching a school bus with its red lights flashing and stop arm deployed.
Because it's physically possible to approach something while remaining stationary?