Futurists have been raving about driverless cars for years. They foresee
a world where fuel-efficient cars glide safely through cities, avoiding
the kinds of crashes routinely caused by humans. People won’t have
to own cars at all in that world. They’ll just summon them when
they need them, and relax while the car’s artificial intelligence
gets them where they want to go.
We all hope that day will arrive. Unfortunately, we got two grim reminders
last month that it’s still a long way off. First, an Uber self-driving
vehicle in Arizona struck and killed a pedestrian while in autonomous
mode. The pedestrian was jaywalking, but the critical point is that the
car’s robotic driving system failed to recognize her: it struck
her at highway speed and never even tried to swerve or slow down.
About a week later, a Tesla operating in Autopilot mode crashed into a
concrete barrier on a California highway and burst into flames. The driver
survived the impact but died shortly afterward at the hospital.
Tesla took pains afterward to explain that Autopilot is not a fully self-driving
system, and that the driver should have taken the wheel himself. Of course,
that’s part of the problem: many drivers are seduced by the
computer. They misunderstand the system’s limits and assume they never
have to take back the controls at all.
Given these disturbing events, Florida drivers may not be thrilled to learn
the same thing could happen here. Florida lawmakers have given self-driving
cars almost free rein to use our state’s roads as a laboratory
(see our previous blog on this subject,
Florida ahead of the curve on driver-less cars, January 9, 2018). Ford is
already testing self-driving cars to deliver pizzas in Miami, and plans
are in place to introduce them as taxis for residents of The Villages,
central Florida’s massive retirement development. Thus, it may not be
long before Florida drivers find a self-driving car - even one with no
backup driver inside - pulling up next to them at a traffic light.
While we all want progress, the Arizona and California crashes show us
self-driving cars may not be ready for prime time. We believe our lawmakers
in Tallahassee should reconsider our “anything goes” self-driving
vehicle law and require testing to take place in controlled environments first.
Lawmakers should also make owners of self-driving vehicles responsible
for the crashes their vehicles cause. Right now, it’s not clear
whether the vehicle’s owner, designer, or operator should be held legally
responsible when things go wrong.
The most sensible thing to do for now is to treat these cars like dangerous
animals. When one gets loose and causes harm, the owner becomes liable
even if he or she wasn’t “negligent” in the usual sense.
That’s the well-understood price of owning one. The alternative
- legal chaos and possibly no real accountability - is a recipe for disaster.
All new technologies go through growing pains, and self-driving vehicle
technology will be no exception. For all its sophistication and constant
vigilance, the artificial intelligence running these cars still lacks
a human’s ability to perceive unique dangers. We need to make sure
we take reasonable steps to keep people safe, and create appropriate legal
protection, while the industry works toward improving the quality of all