Hacker News

I've been reading those DMV reports for years, and there are clear patterns that repeat. One is where an autonomous vehicle starts to enter an intersection with poor sight lines to the cross street. Sensing cross traffic, it stops, and is then rear-ended by a human-driven vehicle that was following too closely. Waymo has had that happen twice at the same intersection in Mountain View. There's a tree in the median strip there which blocks the view from their roof sensor until the vehicle starts to enter the intersection. So the Waymo system advances cautiously until it has good sensor coverage, then accelerates or stops as required.
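That creep-until-visible behavior can be sketched as a simple decision rule. This is purely illustrative; the function names, coverage threshold, and speeds are all invented, not Waymo's actual logic.

```python
CREEP_SPEED = 1.0   # m/s: slow advance while sight lines are still blocked
GO_SPEED = 10.0     # m/s: normal speed through the intersection

def choose_speed(sensor_coverage: float, cross_traffic: bool) -> float:
    """Pick a target speed for an intersection approach.

    sensor_coverage: fraction (0..1) of the cross street the sensors can see.
    cross_traffic:   True if a conflicting vehicle has been detected.
    """
    if sensor_coverage < 0.9:
        # View blocked (e.g. by the tree in the median): advance cautiously
        # until the roof sensor clears the obstruction.
        return CREEP_SPEED
    # Good coverage: stop for cross traffic, otherwise proceed.
    return 0.0 if cross_traffic else GO_SPEED
```

The point of the rule is that the vehicle never commits to the intersection on incomplete information; the cost is an unexpected (to humans) slow creep, which is what sets up the rear-end collisions described above.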

Humans tend not to do that, and, as a result, some fraction of the time they get T-boned.



> Humans tend not to do that

AI should absolutely mimic the behavior of real (good) drivers.

Although that's not what you're describing here, another problem for AI could result from it knowing more than an average driver: for example, if a high-mounted LIDAR were able to see around corners and let the car decide it's "safe" to make a turn that no human would attempt for lack of visibility, the unexpected maneuver could cause problems for the drivers around it.

(Also, it's surprising that an autonomous car doesn't detect that another car is following it too closely and slow down appropriately in anticipation. How is this not taken into account?)
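The anticipation the comment asks about could take the form of softening planned braking when a tailgater is detected, so the car decelerates earlier and more gently. A minimal sketch, with made-up names and numbers (this is not any vendor's actual behavior):

```python
def safe_follow_gap(follower_speed: float, reaction_time: float = 1.5) -> float:
    """Minimum gap (m) the car behind needs in order to react, at its speed."""
    return follower_speed * reaction_time

def adjust_braking(base_decel: float, gap_behind: float,
                   follower_speed: float) -> float:
    """Return a planned deceleration (m/s^2).

    If the rear sensor reports a follower closer than its safe gap,
    halve the deceleration so braking is spread over a longer distance
    instead of applied abruptly.
    """
    if gap_behind < safe_follow_gap(follower_speed):
        return base_decel * 0.5
    return base_decel
```

One complication this sketch ignores: braking more gently trades rear-end risk against the front-end risk that motivated stopping in the first place, which may be why it isn't done aggressively in practice.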


