There is an undeniable appeal to autonomous cars: simply walk out to the car, get in and go. They promise safety, convenience and independence. No need to go through driver's ed, get a learner's permit and then hold a provisional driver's license. Just hop in and go wherever you like.
Automakers have been testing autonomous technology for years. It started with cruise control, then came adaptive cruise control, pedestrian warning systems and lane departure warning systems. The list goes on.
According to an International New York Times report, Google's autonomous cars have been involved in 16 accidents since 2009. While most were minor, they show the technology can be dangerous. In the one accident where the Google car was at fault, a Google employee was driving it at the time.
Google has programmed the cars to follow traffic laws to the letter, which could be a detriment in a life-or-death situation. What if the car needs to brake suddenly or speed up to avoid a collision? It would be incapable of making such decisions. Autonomous automotive technology will eventually advance to the point where there is no need for a steering wheel, brake or gas pedals, or anything else a driver uses. That would leave human passengers with no way to stop the car or take control of it in an emergency.
Another thing to keep in mind with autonomous cars is the death equation: a calculation that determines who dies when a fatal outcome is unavoidable. For example, imagine you are in an autonomous car that loses control and is about to hit another car. There is a family in a car coming toward you in one lane. In another lane is a car with a single passenger. The computer has to decide in a matter of seconds who dies. Is it you? Is it the family? Is it the single passenger? Humans can be faced with this problem as well. How do we humans, let alone a computer, decide who lives and who dies? Would you trust a computer with your life? I don't think I could bring myself to let a computer determine whether I live or die.
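To see how crude such a "death equation" could end up being, here is a minimal hypothetical sketch. Nothing below reflects how any real autonomous car is actually programmed; the scenario, casualty numbers and tie-breaking rule are invented purely to make the thought experiment concrete.

```python
# Hypothetical illustration only: a naive "harm minimization" rule.
# The outcomes, numbers and tie-break below are invented for this
# thought experiment, not taken from any real vehicle's software.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    expected_deaths: int      # assumed casualty count for this maneuver
    occupant_survives: bool   # whether the car's own passenger lives

def choose_maneuver(outcomes: list[Outcome]) -> Outcome:
    """Pick the outcome with the fewest expected deaths.

    Ties are broken in favor of the occupant, which is itself an
    ethical choice a programmer would have to make and defend.
    """
    return min(outcomes, key=lambda o: (o.expected_deaths, not o.occupant_survives))

if __name__ == "__main__":
    scenario = [
        Outcome("swerve into the oncoming family car", expected_deaths=4, occupant_survives=True),
        Outcome("swerve into the single-passenger car", expected_deaths=1, occupant_survives=True),
        Outcome("brake and accept the head-on impact", expected_deaths=1, occupant_survives=False),
    ]
    print(choose_maneuver(scenario).description)
```

Reducing who lives and who dies to a one-line comparison is exactly what makes the idea so unsettling: someone has to write that comparison, and every choice in it is an ethical judgment.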
As an automotive enthusiast, I lament that cars with manual transmissions are becoming a thing of the past. A manual transmission gives the driver far more control over the car: you can stop quicker, accelerate quicker and truly enjoy driving. You won't be able to do any of that when autonomous cars take over. Many people would rather get into an autonomous car that does all of the work for them, but the underlying problem is that we will no longer be able to pass on a valuable skill. What happens if you go camping in the middle of nowhere, hours from civilization, and your autonomous car won't work when you need help?
While autonomous cars would benefit the disabled and elderly, there is no comparably large benefit for the rest of the population. Buying an autonomous car won't be cheap. The technology is so cutting-edge and carries so much liability that prices will likely be astronomical. Many people won't be able to afford one, at least not until the price drops. It will become yet another plaything of the wealthy, and the rest of the population will decry the rich for taking something so valuable away.
I won't willingly embrace the autonomous car. I love cars with a passion, and I won't be able to share that passion with others once autonomous cars are the only ones on the road. I'm afraid our roads are headed for a bleak, dystopian future.
Read the Pro argument at: https://www.theoakleafnews.com/opinion/2015/09/17/pro-self-driven-automobiles/
Tom • Sep 22, 2015 at 2:47 am
I am in favour of self-driving cars, but agree that some situations need to be thought through very well, as mentioned in this article.
But on the topic of the death equation: What would make a death unavoidable? What would be the reason for the car to lose control? What is the scenario and can it be avoided by changes in the infrastructure or increased safety margins for the car? What may happen needs to be discussed, but also what caused it to happen in the first place. A driver making a phone call, or trying to avoid a distracted pedestrian? A software glitch or sensor failure in the self-driving car?
In addition, I believe that in order to make self-driving cars a reality, either all cars need to be self-driving, or all cars need to be at least very aware of what is going on around them. The former will take some time because the cars will be quite expensive (rightly argued in the article: it will be available only to the rich, just like the current crop of electric cars, or like other car-safety technology that in the past was available only in high-end cars). The latter, a connected car, may be something we could see sooner.
Alternatively, we may see components enter the car that make it slowly evolve into a self-driving car and see how things work out, instead of putting a self-driving car out there immediately.
Eric T • Sep 17, 2015 at 2:21 pm
As long as you realize that every single case of the Google self-driving car being involved in an accident was a human's fault, you see pretty quickly why we need mainstream adoption of self-driving cars. If anything, the Google car is maybe overly cautious. Human drivers drive like idiots at times, and that can fool the self-driving car into making an evasive maneuver when it doesn't need to. Bottom line is the self-driving car isn't distracted and will always be a better driver than the average person. I imagine as car enthusiasts become too old to drive, you will see more and more adoption of this.
Bottom line is yes, you lose a valuable skill, but is that skill really valuable anymore? Spear hunting was probably a top skill 5,000 years ago, but plays a very minor part in today’s society.
Brandon Nancoo • Sep 17, 2015 at 8:20 pm
Lovely article. This is more in response to Eric. While self-driving cars are a lovely idea, most likely one would still need to have a license of some sort to operate one; in the case of a glitch, the driver would have to be able to take over and drive. Which means you would still have to be vigilant for other cars that are malfunctioning, or your own, and one would also have to be vigilant for pedestrians (this is assuming that all cars on the road were autonomous). At this point the only thing the occupant of the car isn't doing is giving input on the steering and acceleration, yet they would still have to be ready to take over and be just as vigilant, if not more so, because of the randomness of technology. Might as well just drive…
I like your way of seeing driving as a skill and comparing it to spear hunting, but you fail to realize that driving and cars aren't only a means of survival; they are a passion, a joy, a stress relief and a big part of many people's lives that can't just be replaced by something deemed "more efficient".
One would also have to make a program that could make ethical decisions. In a case where a crash is inevitable except for one scenario that would end in your survival (the person or persons in the car) but the death of bystanders or pedestrians, the car would have to weigh your life against other people's lives and sometimes choose to kill you instead. In essence, it's a nice idea to be driven everywhere like the bourgeoisie, but it isn't very practical.