Tuesday, February 7, 2012

Economics and the Autonomous Car

The Only Designated Driver Batman Ever Needed

I'll start with a disclaimer: I'm really excited for self-driving cars.  I may be waiting decades for my Batmobile that can come pick me up after a night of crime-fighting or, you know, fire rockets as necessary, but that doesn't mean I have to be stuck actually driving my car for another 10 years.

With Google, BMW, and others hopping onto the autonomous bandwagon, we are starting to see some real backing for self-driving cars.  Google has announced that its cars have driven hundreds of thousands of miles, while BMW has braved the German Autobahn sans human input.  Given that crashes will eventually happen with these vehicles, questions have arisen about who would be responsible in the event of a crash: the human driver or the company that made the car.  To me, that question hinges on the answers to two other questions:

First, how good should this technology be before we put it in consumer hands?
Second, who is responsible for crashes now?

The answers seem fairly straightforward to me.

First, autonomous cars need only drive as well as people to be unleashed.  Computers make mistakes, sure, but so do people; the real question is which of the two causes more accidents.  Walking beside a busy one-way street just this morning, I saw an all-too-common occurrence.  Driver A was tailgating Driver B.  Driver A checked that the left lane was clear and started changing lanes to pass Driver B.  Just then, Driver B saw that traffic ahead had slowed down and hit the brakes.  A little brake here, a little gas there, and BAM, Drivers A and B were both headed to the shop.  A computer probably wouldn't have made that mistake, since it can check lanes and watch the road ahead at the same time.  A computer doesn't change the radio station, adjust its hair, take phone calls, or text behind the wheel.  Of course, computers can also have a tough time telling animals and styrofoam blocks apart.  The point is, both people and computers have their faults, and the real question isn't what those faults are, but which option ends up hurting more people.  If it were conclusively shown that, for example, humans average 2 fatalities per 1,000,000 miles driven while computers also average 2, the emphasis shouldn't be that computer-driven cars are killing people, but that they're as safe as their flesh-and-blood alternative.  If the computer number drops to 1 per 1,000,000 miles driven, that would be cause for celebration, while still soberly acknowledging that the technology has room for improvement.
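To make that comparison concrete, here is a minimal sketch of the rate calculation in Python.  The fatality counts and mileage below are entirely hypothetical; the only point is that the fair comparison is rate per mile against rate per mile, not "computers make mistakes" versus "computers make none."

```python
# Illustrative sketch only: the figures below are made-up numbers,
# not real crash statistics.

def fatalities_per_million_miles(fatalities, miles_driven):
    """Normalize a raw fatality count to a rate per 1,000,000 miles driven."""
    return fatalities / (miles_driven / 1_000_000)

# Hypothetical observations over the same 100 million miles for each driver type.
human_rate = fatalities_per_million_miles(fatalities=200, miles_driven=100_000_000)
computer_rate = fatalities_per_million_miles(fatalities=100, miles_driven=100_000_000)

if computer_rate <= human_rate:
    print(f"Computers ({computer_rate} per 1M miles) are at least as safe "
          f"as humans ({human_rate} per 1M miles).")
else:
    print("Humans still have the edge; keep the software in the lab a while longer.")
```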


Second, in regard to responsibility, I see no reason to shift it from the party that currently foots the bill: insurance companies.  Okay, in criminal cases things might get fuzzy, but as far as the financial ramifications go, be they injuries, repairs, or anything else, if I were to rear-end someone today my insurance would pay the costs.  Not the car manufacturer, not me; my insurer ends up with the bill.  If insurers notice that self-driven cars crash half as often as human-driven cars, they could give you a fat discount for letting the computer take the wheel.  If they notice Chevy's autonomous cars crash twice as often as Ford's, they would charge you double for driving a Chevy.  I get my autonomous car, the insurer maintains profits, victims of autonomous or human-driven vehicles still have their repairs paid for, and everyone has a nice day.  In the end, it doesn't matter whether the vehicle or the driver was at fault; it only matters that accidents happened, and insurers and drivers alike would like them to happen less often.
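Here is a minimal sketch of that pricing logic, assuming, purely for illustration, that an insurer scales a baseline premium by how often a given kind of car crashes relative to the human-driven average.  Every number in it is made up; real actuarial pricing is far more involved.

```python
# Illustrative sketch only: a toy model of premiums scaled by relative
# crash frequency.  The baseline premium and crash rates are hypothetical.

def adjusted_premium(base_premium, crash_rate, baseline_crash_rate):
    """Scale the premium in proportion to how often this car type crashes
    relative to the human-driven baseline."""
    return base_premium * (crash_rate / baseline_crash_rate)

human_baseline = 4.0    # hypothetical crashes per million miles, human-driven
ford_autonomous = 2.0   # crashes half as often as a human -> roughly half the premium
chevy_autonomous = 4.0  # crashes twice as often as the Ford -> twice the Ford's premium

base = 1200.0  # hypothetical annual premium for a human-driven car, in dollars

print(adjusted_premium(base, ford_autonomous, human_baseline))   # 600.0
print(adjusted_premium(base, chevy_autonomous, human_baseline))  # 1200.0
```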

As far as criminal ramifications go, I think that question is really secondary to the first one.  If autonomous cars are as safe as or safer than human-driven cars, can you hold a manufacturer liable for the accidents its vehicles cause?  If so, should it also be credited for all the accidents it prevented?  The same goes for drivers who decide to switch on the autopilot.  Unless car manufacturers, you know, purposefully program their cars to hit pedestrians, we should embrace the life-saving potential this technology has, even if it comes with a few accidents along the way.

In short, the answer to it all is this: put autonomous cars on the road as soon as they drive at least as well as we do, and let insurers keep paying for the crashes that still happen, just as they do today.
