
This is the only realistic way forward for autonomous cars. Humans use visible-spectrum light to decide where to drive. If automated vehicles use anything else, their decisions will be based on different information and will likely lead to different outcomes.

This is especially apparent in the parts of the USA that have a real winter with persistent snow cover. In these areas the road surface is sometimes not visible for months at a time, and the lanes that form have little to do with the painted road markings. Instead, humans sort of flock and form new emergent lanes and behaviors, and everyone else (mostly) follows them visually in a low-contrast (white snow) environment. Any system that relies on absolute positioning or non-visual cues won't be able to follow these emergent lanes and will create danger.



I agree following fixed lanes with stored maps and GPS won't work on snow-covered roads. But you lost me in two places.

First, does following others' tracks necessarily mean visual-range light? I'd expect you could see tracks in infrared. Probably also LIDAR—this isn't my field of expertise, but tracks obviously have depth, so I'd think it would work if the resolution/precision is sufficient. (Not sure how well LIDAR works while snow is falling, but that's a different concern than the one you mentioned.)

Second, does using visual-range light (for some or all of the input) mean having automatic high beams on? I very rarely use high beams myself. (Then again, maybe automatic high beams very rarely turn on either.)

And certainly if other vehicles/pedestrians are actually present at the moment—when it's most important to be behaving like them—infrared, LIDAR, and RADAR are options for seeing them.


It depends - be aware, though, that the more you diverge from the sensor input others use, the more you tend to diverge from others' behaviors. You see different data.

It's often more important while driving that you do what others expect than what the rules say must be done - especially in edge cases.

Each sensor suite has its own pros and cons. LiDAR can have real challenges with reflective surfaces or highly absorptive ones (wet and slick, oily, snow); its range is based on return signal strength. Many LiDAR sensors also have problems with daylight drowning out the signal.

Time-of-flight sensors (really a kind of 'broadcast' LiDAR) have similar issues, plus some weird edge cases with reflective geometries and certain surfaces.

Passive visible-light sensors have issues with contrast (a strong signal drowns out weaker signals in nearby areas) and, unlike LiDAR, lack time-of-flight information. They are generally cheap, though, and give us the kind of signal we tend to think of as 'obvious'.

Active radar sensors (including phased arrays) also provide very useful signals, with their own pros and cons.

Sonar, same.

Ideally you'd have 360° coverage from enough different sensors that you can do sensor fusion and detect and exclude a sensor when you're hitting a known problem for that sensor type. Looking into the sun? Visual and potentially LiDAR/ToF data are iffy; lean on sonar and radar. In a high-EMF environment, or surrounded by metal? Perhaps switch off radar.

Those cost money - equipment and development - however.
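A minimal sketch of that gating-before-fusion idea, purely to show the shape of it — the sensor names, condition flags, and confidence weights here are all hypothetical, not any real vehicle's API:

    # Hypothetical illustration only: sensor names, condition flags and
    # confidence weights are made up for the example.
    from dataclasses import dataclass

    @dataclass
    class Reading:
        sensor: str        # "camera", "lidar", "radar", "sonar"
        distance_m: float  # estimated range to the nearest obstacle
        confidence: float  # 0..1, reported by that sensor's own pipeline

    def usable(sensor: str, conditions: dict) -> bool:
        """Drop sensors known to be unreliable in the current conditions."""
        if sensor in ("camera", "lidar") and conditions.get("sun_glare"):
            return False
        if sensor == "lidar" and conditions.get("heavy_precipitation"):
            return False
        if sensor == "radar" and conditions.get("high_emf"):
            return False
        return True

    def fuse(readings: list[Reading], conditions: dict) -> float | None:
        """Confidence-weighted range estimate over the sensors we still trust."""
        kept = [r for r in readings if usable(r.sensor, conditions)]
        total = sum(r.confidence for r in kept)
        if total == 0:
            return None  # nothing trustworthy left: slow down / hand off / stop
        return sum(r.distance_m * r.confidence for r in kept) / total

    readings = [
        Reading("camera", 38.0, 0.2),  # washed out by glare
        Reading("lidar",  40.5, 0.3),
        Reading("radar",  41.0, 0.9),
    ]
    print(fuse(readings, {"sun_glare": True}))  # radar-only estimate: 41.0

Real systems are far more involved (Kalman filters, per-sensor covariance, temporal consistency), but the "detect a known bad regime and drop or down-weight the sensor" step looks roughly like this.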


> It's often more important while driving that you do what others expect than what the rules say must be done - especially in edge cases

The first rule of the road is to avoid accidents. From that flow rules like driving predictably and yielding right of way when necessary.


Unfortunately, as self-driving cars make apparent, that rule is about as clear-cut and practically useful from an engineering perspective as Asimov's three laws of robotics.

Suppose you're in a climate with lots of snow and slush, on a road with zero lane visibility, and everyone is moving along too fast for the conditions (nobody can actually stop within the stopping distance available).

Do you 1) go slow in the right-hand lane, potentially causing a pileup behind you? 2) keep up with traffic (maybe staying on the slow side), and risk a wreck if a deer jumps out in front of you? 3) know this is a common situation for that road in this weather and avoid it entirely? 4) some variant of these?
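To put rough numbers on "too fast for the conditions": a back-of-the-envelope stopping-distance calculation, using ballpark textbook friction coefficients and reaction time rather than measured values:

    # d = reaction distance + braking distance = v*t_react + v^2 / (2*mu*g)
    # mu and t_react are ballpark textbook figures, not measurements.
    def stopping_distance_m(speed_kmh: float, mu: float, t_react_s: float = 1.5) -> float:
        v = speed_kmh / 3.6   # convert to m/s
        g = 9.81              # m/s^2
        return v * t_react_s + v ** 2 / (2 * mu * g)

    for mu, surface in [(0.7, "dry asphalt"), (0.2, "packed snow")]:
        print(f"100 km/h on {surface}: ~{stopping_distance_m(100, mu):.0f} m")
    # dry asphalt: ~98 m; packed snow: ~238 m

Roughly 2.5x the dry-road distance at the same speed, which is why "keep up with traffic" and "be able to stop for the deer" can be mutually exclusive.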

In many cases the right call can only be judged retroactively. Or, as Waymo and others are finding out, it can only really be respected by being so paranoid and conservative that you can't function in the real-world environment that presents itself, and/or by requiring essentially walled gardens so the environment is controlled enough to give you the certainty you need. The Waymo answer here is 'don't drive at all'. Which is great, unless you're running out of food and a bigger storm is coming in, or you'll lose your job if you don't show up in the next 39 minutes.

Humans manage to be somewhat functional in these kinds of environments - we've had to be, evolutionary pressure being what it is. We've got a ways to go before self-driving cars will be a help instead of a hindrance here.


Humans make a lot of mistakes. Full self-driving cars cannot afford to make those mistakes. They have to be better than humans by a significant amount or you can forget adoption.


Arguably, self-driving cars can unlock so much utility that they don't really have to be better; they could even be worse than the average human driver and still be a utilitarian improvement. Consider autonomously dropping off children (freeing up parents), providing more mobility to the disabled and elderly, and perhaps even reducing the overall number of cars that have to be owned (and thus produced). If we count every minute spent behind a steering wheel as a minute wasted, it would even save many QALYs.


For a lot of people those minutes behind the wheel are amongst the few minutes in the day that they feel like they are free agents. Not every minute spent in front of a steering wheel is wasted.


Perhaps, but that doesn't fundamentally change my argument. Those people can keep driving if they like it, but for everyone else it can still free up time, which counts towards the benefits.


The issue with human drivers isn't that we can't see well enough. It's that we're easily distracted and have slow reaction times. A self-driving car can be way better than human drivers without needing special cameras or other sensors that would give it superhuman vision.
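The reaction-time gap alone is large. Rough numbers (1.5 s is a commonly cited figure for an attentive driver; the other two values are assumptions for illustration):

    # Distance covered before braking even begins, at 110 km/h.
    v = 110 / 3.6  # ~30.6 m/s
    for label, t_react in [("attentive human", 1.5),
                           ("distracted human", 3.0),
                           ("automated system (assumed)", 0.1)]:
        print(f"{label}: ~{v * t_react:.0f} m before the brakes are applied")
    # attentive human: ~46 m, distracted human: ~92 m, automated system: ~3 m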


We could also just keep legislating against distracted driving and enforcing those laws.

The touch interfaces in cars have really been a step backwards in this regard.


In Switzerland we have very strict laws about even using navigation equipment or eating while driving. Yet, for whatever reason, cars with touch interfaces that hide important features are permitted.


Revisit this when we have actual AI to consider.


> If automated vehicles use anything else, their decisions will be based on different information and will likely lead to different outcomes.

"Leading to different outcomes" is exactly why autonomous driving is attractive... Why is that a cons for you ...?


At least in the beginning, self-driving cars need to be compatible with human drivers, who are also on the road.


Not to mention our entire driving infrastructure, which is wholly built around visual driving.


Because while humans and self-driving cars are sharing the same roads, it's important that they agree on where the lanes are.


By this logic we should not use radar for driving in adverse conditions, just because it sees more clearly than a human driver.


It’s funny you say that. In heavy rain my car barfs at me that radar doesn’t work anymore.



