Wednesday, March 20, 2013

Whose Fault Is a Driverless Car Crash?

Here is the problem:
States haven't yet said whether a manufacturer or passenger is responsible in a driverless-car crash, so insurance markets can't price the risk. "We don't yet know how judges and juries will react," says Bryant Walker Smith of Stanford Law School. ----- end quote from page 44 of the March 25, 2013 Time magazine.

The solution, they say, is to make the manufacturers liable. To me this is crazy, because why would I buy a car that can ONLY drive itself? The only people who would buy or rent a car like this would be people who could not drive or didn't want to drive. And I think many more people would be afraid to ride in a self-driving car because they might die, just as many people are afraid of flying in a passenger plane.

So, what I would recommend is a car with switchable liability. When you turn it to "Human Driver," you, the driver, would be liable. When you turn it to "Manufacturer," the manufacturer would be liable. That way, the manufacturers with the better and safer designs would reap the profits on the vehicles they build, and the better a driver you are while you are driving, the better your insurance rates would be.
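To make the idea concrete, here is a minimal sketch of how the liability switch might be recorded by the car's computer. Everything in it (the DrivingMode names, the mode_log, the liable_party function) is my own hypothetical illustration, not any real manufacturer's system: whichever mode was engaged at the moment of a crash determines who is liable.

    from enum import Enum
    from datetime import datetime

    class DrivingMode(Enum):
        HUMAN_DRIVER = "Human Driver"    # the human driver is liable
        MANUFACTURER = "Manufacturer"    # the manufacturer is liable

    # A log of every mode switch, so insurers and courts can see
    # who was in control at the time of a crash.
    mode_log = []

    def switch_mode(new_mode: DrivingMode) -> None:
        mode_log.append((datetime.now(), new_mode))

    def liable_party(mode_at_crash: DrivingMode) -> str:
        # Whoever was in control when the crash happened carries the liability.
        return "driver" if mode_at_crash is DrivingMode.HUMAN_DRIVER else "manufacturer"

So if the log shows the car was in "Manufacturer" mode when it crashed, the claim goes against the manufacturer; if it was in "Human Driver" mode, it goes against the driver's own insurance, just like today.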

However, on a pure cost basis it would likely be cheaper to ride in a driverless car and never learn to drive yourself at all, which I think is really unfortunate for the human race. Because if you don't think (because computers do that for you) and you don't drive (because robot cars do that for you), at a certain point the human race just becomes silly, like a cartoon, or slaves to the computers.
It is a slippery slope to humans becoming pets for computers and robots, where all the intrinsic value in actually being a human is lost.

To read more, see "Regulate the Robots" on pages 44 and 45 of the Time magazine article.

I personally don't want to be driving (especially on a freeway) alongside driverless cars. I'm actually less afraid of driving around them at slower speeds. But because of what I know about the really crazy, completely illogical mistakes that programs can make when electrical glitches, magnetism from any source, or electrical coronas affect them, I can easily imagine what could happen on any freeway at 70 mph here in the U.S. or abroad. Also, at the very least it is now legal for driverless cars to be on the highway in California and Nevada, and I think in Nevada you don't even have to have a real driver watching to keep things safe. So watch out, world: driverless cars and trucks of all sizes are coming, and thousands, eventually millions, of people worldwide who drive for a living are bound to lose their jobs as a result.

Note: Imagine, for example, that you are riding in a driverless car and a bolt of lightning out of the blue strikes next to you or hits the car while the computer is driving you down the freeway at 70 mph. What happens when the corona of the lightning bolt completely fries the operating system? This is just one of thousands of potential problems that could leave people dying in driverless cars.

It is true that under "ideal circumstances" one might actually be safer in a driverless car, but life is never ideal. It is always experimental at best, and that is why a human driver who is rested and neither on drugs nor intoxicated would always be the best driver for any vehicle.

Much later, April 15th, 2013: After reading about the Navy's laser that can bring down an armed drone from several miles away, I was thinking about driverless cars and trucks. What if a laser (which will only get more powerful over time) were directed at a driverless car or truck of any size? What happens if you cook the software or even burn through the motherboard or CPU that drives the car? What happens then? There are more questions than answers about driverless cars and trucks in relation to lasers, which are being used more and getting more powerful every day.

Imagine a bus with 50 people in it and someone with a handheld laser who wants it to crash. Since there is no driver, a powerful enough laser would have no problem making it crash. Or you could just use a sniper's rifle to shoot through the CPU and make the vehicle crash. More thought must be given to these kinds of scenarios with driverless vehicles too.
