There’s a popular social media refrain that denotes sarcastic skepticism: What could go wrong? With the advent and rollout of driverless cars, I submit, what could go right?
Driverless cars. It sounds great. Just the imagery that’s conveyed suggests instant chauffeuring where you’ll luxuriate in the back seat of your trusty chariot while a veritable and virtual AI and robotic James tools you about. Even after you’ve tippled a tad excessively, there’s no need to call Uber or a designated driver; your driverless car will be there to dispatch you and deposit you safe and sound. Now, whenever something sounds too good to be true it usually is, so the apothegm goes. Let’s review some not-so-obvious possibilities.
Let’s start with liability. In the event your driverless car is involved in an accident with another driverless car or a pedestrian, who’s liable? Who can be sued? As a practicing lawyer, I can attest that the elements of negligence that would normally apply to you and me may be a tough fit here. Why? It seems that the first element of negligence that a court or jury would apply is whether there’s a standard of care that was breached by you, the putative driver. But wait, you’re not the driver. You’re the passenger. But it’s your car. True, but you never programmed the computer algorithm or AI feature that ostensibly misfired. So who’s actually at fault? Who’s negligent? A number of legal doctrines will be introduced in this piece, and the first is the universal answer to any and all legal questions: It depends. If you, the passenger of your driverless car, never programmed or adjusted or tampered with the onboard computer, you’re off the hook. So who’s liable? On the count of three, everyone: It depends.
So, who actually controls your car? Better yet, what parameters are configured in the operation of your car? Speeds, terrain, location, weather conditions, traffic congestion – these factors would be taken into consideration by the onboard system, and therefore you’ve turned over control of your car to _____. And therein lies the next question: Who’s driving? Is it the company that manufactures your car, the local authorities, whoever that may be, or the software maker who created the feature? Will a series of “land traffic controllers” override the autonomy of your vehicle, similar to air traffic controllers at airports? And when you decide to drive independently of the system, with your “autopilot” turned off, will you enjoy unfettered control and the attendant liability?
Now, imagine you’re in a hurry and find that you can’t go faster than 55 mph on the highway. That’s all well and good, as that’s the speed limit, but say you’re rushing your wife, who’s about to give birth, to the hospital? Or imagine you’re being pursued by a potential carjacker and you want to escape. The law will excuse certain behaviors under the doctrine of necessity or justification, and for good reason. Sometimes exigent circumstances warrant and excuse your exceeding speed limits or otherwise violating a law. How are emergencies handled by your version of HAL 9000? Let me introduce the second legal doctrine: Lionel’s Law. Simply stated, the law always lags behind technology.
Let’s really get 1984, shall we? Imagine you either desire to be driven or drive yourself to a destination X. You either punch in the location’s coordinates or your car notes through its GPS system where you are. But in this situation, assume arguendo there’s a curfew or a governmental travel ban or, being completely paranoid yet hypothetical, you’re of a demographic that certain sectors care not for you to visit or frequent. You enter the coordinates and your vehicle declines to take you there. That’s right; your vehicle that you paid for blocks your travel for a host of theoretical reasons. You’re not welcome at destination X.
And then, track and tax. Your vehicle is now a data collection unit whose output can be directed to a host of agencies and bureaucracies, including those who would seek to tax you on the amount of driving you actually do. Sound farfetched? Don’t tell that to the folks who brought you OReGO, the Oregon Department of Transportation’s new road usage charge program. And while we’re on the subject, in that your automobile is now a data collection funnel, were you to drive excessively or recklessly, watch your insurance rates spike. It’s all being recorded.
Now, let’s really go dark. When systems go online, they’re subject to being hacked. Look no further than the peculiar circumstances of controversial investigative journalist Michael Hastings’ 2013 death in Los Angeles, when he was driving a 2013 Mercedes C250 that he reportedly crashed into a tree. The facts raised a few eyebrows and inspired a number of “conspiracy theories.” Could his car have been hacked? Former US National Coordinator for Security, Infrastructure Protection, and Counter-terrorism Richard Clarke told The Huffington Post that what is known about the single-vehicle crash is “consistent with a car cyber attack.”
Could terrorists load a driverless car with explosives and program it to act as a guided missile? Would intelligence agencies have to apply for warrants when tapping into GPS and car data systems? Would the police car chase be a thing of the past, or would chases become more dangerous as criminals fire upon pursuing police while their driverless cars calculate escape routes? And how do you make driverless cars childproof? The questions are seemingly endless.
As we usually see, technology is always met with uproarious applause followed by an almost immediate deep sense of remorse and regret for not having anticipated that which is now starkly evident. Will we ever learn?
It depends.
Lionel is an Emmy Award winning lawyer, legal analyst and news decoder.
The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.