Tech pundits have predicted that within the next decade, the majority of vehicles on the road will be autonomous. Already, some are dreaming about what they can do when the onerous task of driving is gone.
Then, of course, a Tesla on Autopilot got into a fatal accident on a highway.
If sudden death isn’t an indicator that the technology isn’t ready for widespread use, I don’t know what is.
But the accident doesn’t seem to have slowed the full-steam-ahead approach to autonomous driving taken by many car manufacturers, including Toyota, Volvo, Tesla and others.
But for those of you who are dreaming about a driving-free life, here’s my advice: wait.
Why? There are several reasons.
Peel away the futuristic sheen of self-driving cars and you’ll see the full implications. Yes, you’re entrusting artificial intelligence with transport. But more importantly, you’re entrusting artificial intelligence with your life.
I can trust AI to predict my typing habits. But trust it with my life? Not yet. To be clear: I’m sure the engineers didn’t stick a self-destruct mode into the code. However, bugs are pretty hard to avoid.
I don’t think any company can claim a bug-free program from day one. And if the autonomous driving industry keeps rushing ahead at its current pace, bugs in the first release are more a certainty than a chance. A bug in a self-driving car is also quite different from one on a smartphone: while a bug on a smartphone might cause an app to crash (frustrating, but survivable), a bug in a self-driving car might, well, cause your car to crash.
Which might end in death, as Tesla’s recent example showed.
Then there’s the government. As far as I can tell, every country has legislation concerning vehicles. What happens when AIs become drivers? As far as the law goes, I think it’s pretty hard to charge a computer with manslaughter.
Of course, they could charge the companies. Try to imagine that for a moment. You could draw a comparison to when companies get in trouble for negligence in the workplace, but that’s still a far cry from their product killing people.
My point is that governments will need time to write legislation covering autonomous vehicles. And if self-driving cars hit the road before that happens, there’s going to be real confusion in the legal system.
All in all, no matter how you put it, autonomous driving is still incredibly new. And given that it may also be incredibly dangerous, the safer option is to let car manufacturers sort out their problems before you try to drive one of their cars.
After all that, if you really want a self-driving car, I’ll have to recommend Volvo, simply because they’ve said they’ll accept the blame if there’s an accident.