Monday, March 26, 2018

Most people think that self-driving cars will be safer than human drivers: I don't think so

Why?

Because the types of accidents caused by self-driving vehicles are completely different from the types of accidents caused by human drivers.

So, though it is possible that self-driving vehicles MIGHT eventually be safer than human drivers, I believe that is 20 to 50 years away, for a variety of reasons.

One person in the news today said it best:

"People who drive cars have to prove to examiners both through a written test and a driving test that they know what they are actually doing.

"Now, this doesn't mean that the person who just passed their driving test can't go home, get drunk, and kill someone the first day they drive a car. No.

"But, it does mean that if that driver isn't completely crazy, they will be a somewhat safe driver."

There is NOTHING in the laws now requiring individual self-driving vehicles to be tested by DMV driving examiners. Until we do something like this, where examiners from the local DMV certify each vehicle by riding with it on city streets and freeways for at least 5 to 20 miles, we aren't going to know if ANY of these self-driving vehicles are safe.

They might all look the same, but the equipment in each self-driving vehicle has to be properly designed and has to interface safely with itself and its environment. And until each vehicle is certified (at least once a year by a tester), I don't think it should be carrying people anywhere in the U.S.

Also, I think self-driving vehicles need to be capable of running a self-diagnosis every time they power up.

In other words:

Are my tires flat?
Is all my equipment working properly?
Are there any malfunctions I need to report?
Is the sun producing a solar flare that could affect my sensors?

Etc. Etc. Etc.

Because any of these things, if left undiagnosed, could cause a crash or multiple deaths during that day or night of self-driving.
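Here is a rough sketch, in Python, of what that kind of power-up self-check might look like. Every function name, sensor name, and threshold below is made up for illustration; I'm not describing any real vehicle's software:

    # Hypothetical power-up self-diagnosis sketch. The sensor-reading
    # functions are stand-ins for whatever the vehicle's real hardware
    # interfaces would be; the thresholds are illustrative only.

    MIN_TIRE_PSI = 30.0

    def read_tire_pressures():
        # Stand-in: would query the tire pressure sensors on real hardware.
        return {"front_left": 34.1, "front_right": 33.8,
                "rear_left": 12.0, "rear_right": 34.0}

    def sensor_health_report():
        # Stand-in: would poll lidar, radar, cameras, GPS, etc.
        return {"lidar": True, "radar": True, "camera": False, "gps": True}

    def solar_activity_elevated():
        # Stand-in: would check a space-weather feed for flare activity
        # that might degrade GPS or other sensors.
        return False

    def power_up_self_check():
        faults = []
        for tire, psi in read_tire_pressures().items():
            if psi < MIN_TIRE_PSI:
                faults.append(f"low tire pressure: {tire} at {psi} psi")
        for sensor, ok in sensor_health_report().items():
            if not ok:
                faults.append(f"sensor malfunction: {sensor}")
        if solar_activity_elevated():
            faults.append("solar flare alert: sensor accuracy may degrade")
        return faults

    if __name__ == "__main__":
        faults = power_up_self_check()
        if faults:
            print("DO NOT DRIVE - report these faults:")
            for f in faults:
                print(" -", f)
        else:
            print("All checks passed.")

The point is simply that the vehicle refuses to drive until every check passes, instead of discovering the problem at freeway speed.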

Also, tables showing how long each part that could cause a fatal accident is likely to last before it fails should exist.

Why? Because if there are no backup human drivers in the vehicle to watch over it, it could theoretically be driven 24 hours a day for years at a time with no one looking after its roadworthiness at all. And with that much wear, something is liable to go wrong from weather, temperature, humidity, dryness, etc.
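To make this concrete, here is a toy sketch of how such a table might be used to flag parts for replacement before they fail. Every part name and hour figure below is invented for the example; real reliability numbers would have to come from the manufacturers:

    # Illustrative sketch: a table of estimated hours-to-likely-failure
    # for safety-critical parts, used to flag parts that are due for
    # replacement. All the numbers here are invented for the example.

    EXPECTED_LIFE_HOURS = {
        "brake_actuator": 20_000,
        "steering_motor": 30_000,
        "lidar_unit": 15_000,
        "main_computer": 40_000,
    }

    SAFETY_MARGIN = 0.8  # replace at 80% of expected life

    def parts_due_for_replacement(hours_in_service):
        due = []
        for part, life in EXPECTED_LIFE_HOURS.items():
            if hours_in_service >= life * SAFETY_MARGIN:
                due.append(part)
        return due

    # A vehicle run 24 hours a day racks up hours fast:
    hours = 24 * 365 * 2  # two years of continuous driving = 17,520 hours
    print(parts_due_for_replacement(hours))
    # -> ['brake_actuator', 'lidar_unit'] under these invented numbers

Notice that two years of round-the-clock driving is about 17,520 hours, which a human driver might take decades to accumulate. That's why these tables matter so much more for driverless fleets.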


For example, right now bumpy roads are the biggest thing that could damage electronic components. The components are generally very small, so bumps could jar an electrical connection loose and create problems. So, you probably don't want self-driving vehicles on bumpy roads (unless they have been specifically designed to withstand these kinds of shocks in an ongoing way).
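If a vehicle really were designed to handle shocks "in an ongoing way," one piece of that might be a watchdog that counts hard jolts and flags the vehicle for inspection before a connection shakes loose. This is purely a hypothetical sketch; the threshold and the limit are invented:

    # Hypothetical vibration watchdog: count jolts above a g-force
    # threshold and pull the vehicle from service for inspection once
    # too many have accumulated. Threshold and limit are illustrative.

    SHOCK_THRESHOLD_G = 3.0   # a jolt this hard might loosen a connector
    MAX_SHOCKS_BEFORE_INSPECTION = 50

    class ShockMonitor:
        def __init__(self):
            self.shock_count = 0

        def record_sample(self, vertical_accel_g):
            # Called for each accelerometer sample while driving.
            if abs(vertical_accel_g) >= SHOCK_THRESHOLD_G:
                self.shock_count += 1

        def needs_inspection(self):
            return self.shock_count >= MAX_SHOCKS_BEFORE_INSPECTION

    monitor = ShockMonitor()
    for g in [0.2, 1.1, 3.5, 0.4, 4.2]:   # simulated pothole hits
        monitor.record_sample(g)
    print(monitor.shock_count, monitor.needs_inspection())  # 2 False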

Another problem is static electricity. For example, if you open up your desktop computer while you are not properly grounded, you can destroy the CPU in one second just by touching the wrong component after dragging your feet on a wool carpet, simply because you aren't aware of this problem.

So, in desert or dry conditions, static electricity could knock a component out or set off a cascade of problems, starting with the discharge corrupting memory that the self-driving vehicle needs to operate. One static discharge could be fatal to all the passengers in the car, or to bystanders or people in other cars.
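At a minimum, a vehicle ought to be able to detect that kind of corruption. Here is a crude, purely illustrative sketch of checksumming critical data and refusing to drive on a mismatch; a real system would use ECC memory and hardware-level checks, not application code like this:

    import hashlib

    # Illustrative integrity check: store a hash of safety-critical data
    # at startup and re-verify it periodically. A static discharge that
    # flips bits in this data would show up as a hash mismatch. Real
    # vehicles would rely on ECC memory and hardware checks, not this.

    critical_map_data = b"lane geometry, speed limits, sensor calibration"

    def checksum(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    expected = checksum(critical_map_data)

    def memory_still_intact(data: bytes) -> bool:
        return checksum(data) == expected

    # Simulate a single flipped bit from a static discharge:
    corrupted = bytearray(critical_map_data)
    corrupted[0] ^= 0x01
    print(memory_still_intact(critical_map_data))   # True
    print(memory_still_intact(bytes(corrupted)))    # False -> stop driving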

So, over time you are likely going to see many "chain reaction" accidents, where one car goes haywire and sets off a pileup of 100 or more vehicles on a freeway.

However, with self-driving car manufacturers limiting their liability through Congress, it might be very difficult for insurance companies to assign blame. So, unless insurance companies find a way to protect themselves from the self-driving car manufacturers, this could be a real problem in the future: not only for the families of the people killed by self-driving cars, but also for insurers trying to assign blame for mass accidents caused by a software or hardware failure (usually it would be a software failure causing a hardware failure) in a self-driving vehicle, which then causes a chain-reaction accident involving 10, 100, or more vehicles, both self-driven and human-driven.


The other thing is that any error in a self-driving vehicle would likely be SUDDEN, much like a driver having a stroke or heart attack at the wheel. In other words, there would be no driver trying to save the people inside and outside of the vehicle during the crash. So, accidents could be much worse than what we see with human drivers, who generally are desperately trying to save their own and others' lives in the moments before a crash occurs.
