The problem with autonomous driving is that even within the relatively “unexciting” environment of urban road networks, there is an astonishing range of elements that introduce quirkiness and confuse driving AIs to the point of paralysis. The most recent example comes from one of Waymo’s self-driving taxis operating in Phoenix, Arizona, and the culprit was an “innocuous” construction cone.
As seen in the following video, at the 12-minute mark, the problem appeared when the car attempted to make a right turn onto a four-lane road, one lane of which was closed off with construction cones. The car just sat there, seemingly in a logical deadlock, unsure of what it was supposed to do.
After a few minutes of waiting, the rider called Waymo Roadside Assistance, but before they arrived, the car made the turn and then sat like a sitting duck in the middle of the road. The construction workers eventually removed the cones, yet the Waymo taxi still refused to move. At some point, the Roadside Assistance team arrived, but before they had a chance to intervene, the car simply drove off on its own (at the 24-minute mark).
The car then encountered more cones down the road and got stuck once again. After another five minutes, the roadside assistance team caught up with it, and the car drove away from them a second time. Eventually, Waymo’s on-road support team managed to reach it, unlock the doors, and assume control to complete the trip manually.
For Waymo, this was just an opportunity to identify a bug and fix it, improving their operational process as a result. For the rest of the world, it was another example underlining the fact that we’re not there yet. Everything works impressively well until, sooner or later, some kind of complexity inevitably manifests on the road. It may still be a while before autonomous driving systems can evaluate everything the way a human mind does.