People Will Die

It's alive!!!

Just a few years ago, self-driving cars were the stuff of science fiction.  Now, they're about to take the bold step into reality.  Not if I have anything to say about it.  The scientists and engineers behind these monstrosities should watch more science fiction movies before they unleash them onto the public.


Google: Driverless cars are mastering city streets
LOS ANGELES (AP) - Google says it has turned a corner in its pursuit of a car that can drive itself.
The tech giant's self-driving cars already can navigate freeways comfortably, albeit with a driver ready to take control. But city driving - with its obstacle course of jaywalkers, bicyclists and blind corners - has been a far greater challenge for the cars' computers.
In a blog entry posted Monday, the project's leader said test cars now can handle thousands of urban situations that would have stumped them a year or two ago.
"We're growing more optimistic that we're heading toward an achievable goal - a vehicle that operates fully without human intervention," project director Chris Urmson wrote.
Urmson's post was the company's first official update since 2012 on progress toward a driverless car, a project within the company's secretive Google X lab.
The company has said its goal is to get the technology to the public by 2017. In initial iterations, human drivers would be expected to take control if the computer fails. The promise is that, eventually, there would be no need for a driver. Passengers could read, daydream, even sleep - or work - while the car drives.


The driverless car has all the ingredients of a classic science fiction movie.  You've got the scientists/inventors/engineers, who may at one time have meant well, but are now driven by something other than the betterment of mankind.  You've got the “creation” that threatens the health, life, and safety of the general public, and you have a textbook case of a scientific ethics dilemma - does having the ability to create give you the license to create?




How many times in all your years and miles of driving have you encountered a totally unique emergency situation, a one-in-a-million thing that no computer program could ever have a contingency response for?  A tire blowout, a vehicle malfunction, a deer running in front of you, with a car behind you and another car approaching from the opposite direction?  The list of possibilities is endless, and the number of possibilities that we could never think of is much higher than the number that we could.


There is no way a driverless car could possibly react as well as a human being in those one-in-a-million scenarios that could never be taken into account in advance, and that's assuming that the driverless car would always function perfectly.  I have had many instances of vehicle failure that led to situations requiring split-second decision making.  The type of vehicle failure and the environment in which it occurred made for a completely unique situation that will probably never be exactly duplicated again in the history of mankind.  How the hell are you going to incorporate that into a computer program?  Sometimes a unique situation calls for the opposite action that would be used in 999 out of 1000 other similar cases.


Think about this.  What’s been in the headlines recently?  It appears that GM couldn’t even make an ignition switch properly, even when they knew they had a problem.  Now what’s more complicated - an ignition switch or whatever is necessary to make a vehicle operate without a driver?  Are you going to trust your life to people who can’t even make an ignition switch?  Are you going to just sit there and let your government allow others to put their trust in such people and the faulty equipment they produce?  What are you going to do when one of these driverless cars malfunctions and it’s heading towards you at 60 mph?

If you think you can trust driverless cars to be safe and reliable, take a look at the dashboard of your own car.  If you don't see a check engine light glowing right now, give it a few months.


You know what that light means.  Something's gone wrong.  Eventually, when something goes wrong with a driverless car, people will die.

2 comments:

  1. The first time a driverless car runs over a kid, the WHOLE THING will come to a screeching halt. There isn't a company on the planet with deep enough pockets to take the risk of continuing to put such vehicles on the roads. Unless .GOV passes legislation granting such companies immunity, the very first death caused by one of these contraptions will bankrupt the company that built it. Google has REALLY deep pockets, but the very first time they have to pay out multi-millions in settlement costs will be the day they shelve this project.

    Replies
    1. Very true. Also, the LAST thing we need is something that will encourage drivers to pay LESS attention to the road.
