A routine run has exposed a flaw in autonomobiles
And surprise, surprise, it was a human’s fault
In case you hadn’t noticed, computers are rather clever. But for all their incredible power, they are, at their core, machines. Pun absolutely intended.
This became obvious in Las Vegas, when an autonomous shuttle bus found itself flummoxed by a situation that a reasonably perceptive human could have avoided, almost out of reflex.
Sensing a delivery lorry stopping in the street ahead, the autonomous systems halted the bus and waited for the lorry to move. So far, so good.
But the driver of the articulated lorry was actually backing into an alleyway. The human thought process goes like this: observe the scene, recognise the intent, react. Unfortunately, for all the progress the tech has made, the ‘recognition of intent’ part isn’t something computers can handle yet. So the inevitable happened: the lorry backed, slowly, into the stationary bus.
So, yes, a human was at fault. But the incident also exposes a fairly serious gap in the programming and electronic architecture of at least one autonomous vehicle: an inability to use contextual clues and past experience to predict a likely outcome. Or, put simply, the bus couldn’t see a lorry reversing and get clear, just in case.
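For the curious, the difference between the two behaviours can be sketched in a few lines of code. This is a purely illustrative toy in Python, with made-up names throughout; no real shuttle runs anything this simple, and the thresholds are invented for the example.

```python
# A minimal conceptual sketch of the missing 'recognition of intent' step.
# All names and thresholds here are hypothetical, not any vendor's real API.

from dataclasses import dataclass

@dataclass
class TrackedVehicle:
    distance_m: float        # gap to our front bumper, in metres
    speed_mps: float         # signed: negative means closing on us
    reverse_lights_on: bool  # a contextual clue about the driver's intent

def choose_action(obstacle: TrackedVehicle) -> str:
    """Observation -> recognition of intent -> reaction."""
    if obstacle.distance_m < 10.0 and obstacle.speed_mps <= 0.0:
        # Intent recognition: reverse lights plus closing speed suggest
        # the lorry is backing towards us, so create space pre-emptively.
        if obstacle.reverse_lights_on and obstacle.speed_mps < 0.0:
            return "reverse_to_safe_gap"
        # Reactive-only logic, roughly what the shuttle appears to have
        # done: stop and wait for the obstacle to clear.
        return "hold_and_wait"
    return "proceed"

# The stationary bus held its ground; an intent-aware planner would have
# returned "reverse_to_safe_gap" and avoided the contact.
print(choose_action(TrackedVehicle(distance_m=4.0, speed_mps=-0.5, reverse_lights_on=True)))
```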
We (well, at least some of us) are hopeful that autonomy will transform the commuter and utilitarian car landscape for the better. But it appears that left-field problems like this will keep popping up in the meantime.