From my understanding, the reason the Waymos didn't handle this was that humans were breaking traffic rules and going when they shouldn't have been. If most humans had navigated it correctly, the Waymos would have handled it better.
As mentioned in the article, the real problem was that they kept trying to contact remote support to "verify" the light was out, leading to a backlog of requests they couldn't get through fast enough.
This attitude is exactly how the Waymos came to handle the problem so poorly in the first place. The Principal Skinner "everyone else is wrong" bit is just icing on the cake.
They can't just program it to "copy others when confused," because it will invariably violate the letter of the law and people will screech. So they pick the legally safe but obviously ineffective option: have it behave like a teenager on day one of driver's ed and basically freeze up. Of course that doesn't scale at all, but it covers their asses, so it's what they've gotta do.
Traffic safety engineers often have influence on the letter of the law. We would all be better off if people followed it (humans are bad judges of the exceptions).
But over here in the real world that we have to interact with, perfect rule-following will never, ever happen -- unless something first manages to wipe the last of the stain of humanity off the earth's face.