For a long time I was convinced that the driverless car dream was nothing more than that. But between the accelerating development of Google's self-driving roadsters and the unveiling of Britain's autonomous pavement vehicles, I realise my nightmare is very quickly unfolding.
You see, I've got a bad feeling about driverless cars.
As someone who's forever paranoid that my sat nav is going to send me the wrong way up a one-way street (it's happened before), the idea of entrusting my safety to a robot seems ludicrous.
Of course, plenty of you will disagree with me. I even put my concerns to geneticist and broadcaster Dr. Adam Rutherford, a bit of an expert on matters of human intelligence.
"Well it could be that AI in driverless cars reacts better than a human would," he tried to reassure me. "There's an assumption that our reflexes process complex information in a heartbeat and make the best decision.
"There's no reason to suppose that a sufficiently advanced AI could do better. If you think about it, so many road traffic accidents occur when humans are tired, or distracted or drunk, or high. An AI, we would hope, would not be susceptible to those factors."
Sure, they might be safer for those reasons, but I'm scared car AI is susceptible to more dangerous things - malfunctions, dodgy firmware updates, questionable moral choices, murderous dispositions - you know, the usual rational fears.
Robots and ethics are a tricky mix
Who gets to guide the car's moral compass? Does the car hit the child to ensure the safety of the middle-aged "driver", or does it swerve out of the way and kill the driver, who was probably only going to live another 20 years anyway? Will our cars be utilitarian or libertarian? It might sound like a silly point right now, but these are important, inescapable questions that won't have straightforward answers.
All it will take is that first accident before the 'driverless car kills humans' headlines are splashed across the front pages and the brake lights come on. Just imagine the legal mess when your driverless car hits another, let alone a person. Which insurance company in its right mind is going to cover these things when they first hit the road, without demanding a fortune? And who's to blame when the inevitable does happen?
Plus, with different cars being built by different manufacturers, there's the risk of fragmentation. I'll assume that our driverless cars will eventually all work on some universal safety protocol if it helps me sleep better at night, but what if the law allows car makers to use their own software?
Not to mention that movie car chases are going to be pretty drab. "OK Google Car, evade approaching henchmen" "Warning: updating firmware to Cardroid 4.7"
And what if, you know, they start talking to one another? The moment our cars declare independence is the day I want you to come back to this article and apologise for not taking me seriously.
But suppose our cars don't rise up against their masters, and suppose they're 100% safe all the time: what about the pure enjoyment of driving? My self-driving Lotus will suck out all the fun as it opts for the road less sinuous.
I love driving. There are few things better than hitting an open road at sunset with my "road rhythms" playlist blaring through the speakers. Letting someone else do it for me will just never be as good, regardless of how safe it is.