Sunday, July 31, 2016

The Autonomous Debate

Take a look at the picture below. It's a representation of how people imagined the future of cars in the 1960s, during what was called the "Golden Age of American Futurism". Along with solar-powered electric cars, weather control and jet-pack mailmen, self-driving cars were a vision of the near future at the time. Turns out, you can get one today! But should you?


On May 7th, Tesla's autopilot was blamed for the death of Joshua Brown after it failed to detect a tractor-trailer crossing the highway in front of his car. Many have pointed out the dangers of the autopilot and have questioned the decision to make it available to the public so soon. But, as Elon Musk said, the autopilot is not yet good enough to completely replace the human driver. The owner is asked to keep their hands on the wheel and to always be prepared to take over. The technology is not there to replace human drivers (yet); it's a safety feature meant to assist the driver in any situation (for now).

Something we shouldn't forget is that the autopilot is not perfect. Even so, it's still better at avoiding accidents than most human drivers are. I believe such technology is a must if we want to get close to zero car accidents anywhere in the world. Doubting the autopilot is like doubting an experienced professional swimmer who is swimming next to you as you cross a fearsome ocean filled with numerous perils and immense predators. Case in point: recently, a Tesla Model X saved the life of attorney Joshua Neally by driving him safely to a hospital on autopilot after he fell seriously ill behind the wheel. A normal car could not have done that. It shows how useful the autopilot can be in certain situations.


Another interesting problem that autonomous driving brings is liability: who is responsible for an accident? From an insurance and legal point of view, this question requires an answer. Well, considering that we don't have full autopilot yet, each person is responsible for the actions of their car, unless the car itself made a wrong move, in which case the cause of the accident should show up in the car's autonomous driving log. Once fully autonomous driving arrives, the company becomes responsible for any damage or accident caused by the car.
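To make that log idea concrete, here is a minimal sketch of what one entry in such a record might look like. Everything here (the DriveLogEntry structure and its field names) is my own illustration, not Tesla's actual format; the point is simply that recording who or what was in control at each moment is what would let investigators and insurers assign responsibility.

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class DriveLogEntry:
    """One hypothetical record in a car's autonomous driving log."""
    timestamp: float         # when the event happened (Unix time)
    mode: str                # "autopilot" or "manual"
    speed_kmh: float         # vehicle speed at that moment
    steering_deg: float      # commanded steering angle
    obstacle_detected: bool  # what the sensors reported
    driver_hands_on: bool    # whether the driver was holding the wheel

def serialize(entry: DriveLogEntry) -> str:
    """Serialize an entry so it can be written to tamper-evident storage."""
    return json.dumps(asdict(entry))

# Example: a record showing the car was in autopilot mode, saw no obstacle,
# and the driver's hands were off the wheel -- exactly the kind of evidence
# a liability dispute would turn on.
print(serialize(DriveLogEntry(time.time(), "autopilot", 120.0, -1.5, False, False)))
```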

That being said, an autonomous car is basically a computer on wheels, and by building a computer-car you inherit all of the problems computers have. The big one is hacking. What if your car is hacked while you're doing 90 on a freeway? Security researchers have already fooled a Tesla into believing that no obstacle was in front of it by jamming and spoofing its sensors with radio interference. And that's just the tip of the iceberg. Clearly, security measures are needed more than ever to stop such things from happening, and we still have a long way to go before riding in a computer is 100% safe.
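As a toy illustration of the kind of defense I mean, here is a short sketch of a plausibility check on distance readings. This is not how Tesla actually guards its sensors (I'm assuming a single radar stream and made-up thresholds); the idea is just that a real obstacle cannot physically vanish between two readings, so a sudden "all clear" right after a close contact looks like jamming and should trigger caution rather than acceleration.

```python
# A hedged sketch of a sensor plausibility check, not a real Tesla safeguard.
# Assumption: one distance reading (in meters) per tick from a forward radar;
# float("inf") means "no obstacle detected".

MAX_DISTANCE_JUMP_M = 5.0   # made-up physical limit per sensor tick
FAR_RANGE_M = 50.0          # made-up range beyond which objects may drop out

def plausible(prev_distance: float, new_distance: float) -> bool:
    """Reject readings where a nearby obstacle 'teleports' away.

    A real object ahead of the car can only change its distance by a
    bounded amount between consecutive readings. A jump from, say, 10 m
    to "nothing there" is physically impossible and is treated here as
    suspected jamming or spoofing.
    """
    if new_distance == float("inf"):
        # An obstacle may legitimately leave detection range, but only
        # if it was already far away.
        return prev_distance >= FAR_RANGE_M
    if prev_distance == float("inf"):
        return True  # a new obstacle can appear at any range
    return abs(new_distance - prev_distance) <= MAX_DISTANCE_JUMP_M

# Example: a spoofed "all clear" right after an obstacle at 12 m should
# make the car brake or hand control back to the driver, not speed up.
print(plausible(12.0, float("inf")))  # False -> treat as a possible attack
```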

I'll leave you with something to think about. If an autonomous car has the ability to take action and make choices, what happens when those choices involve huge moral dilemmas? Would you prefer your car to avoid a pedestrian but kill you, or would you rather live and sacrifice the pedestrian? Considering that the car should serve its owner, the latter seems more logical for a car maker. But still, it becomes a bit scary when that decision has to be made by a robot.

For a few more drops of the future, check out: