
If you’ve been keeping up with the latest advancements in technology, you’ve probably heard the news that self-driving, or autonomous, cars are taking over the roads. While major companies like Uber, Google, Tesla and Nissan are jumping headfirst into developing cars capable of driving themselves, the public remains hesitant.

The uncertainty that many people feel about autonomous vehicles isn’t unwarranted. From fear of losing jobs to safety concerns, many people are wondering whether self-driving cars are really the right way to go. It’s one reason ride-hailing services like Lyft still rely on human drivers to keep the people who use their service safe.

With the invention of the car came the invention of the car accident. Soon after the Model T took over America’s roads, car-related fatalities soared to nearly 20,000 a year, and annual death tolls have stayed in five figures ever since. When drivers are distracted, accidents are bound to happen. Handing the steering wheel over to a computer, according to tech companies, may be a way to finally bring those numbers down.

Not even the companies selling self-driving vehicles argue that driving deaths will be eliminated entirely. Mistakes happen with humans and computers alike, and it’s fair to wonder how autonomous cars will respond to unexpected scenarios.

The problem here is that when you get in your car and need to avoid an accident, you react to the situation at hand. In a completely self-driving car, the car itself has to make that decision, and a decision requires forethought. This leads to the ethical dilemma surrounding self-driving cars, a version of the classic Trolley Problem: a car could soon have to choose between hitting a bystander, taking the brunt of the blow (potentially risking the lives of its passengers) or sacrificing the few in favor of saving the majority.

Autonomous cars also raise the question of who would be responsible for an accident. Will the owner of the vehicle be to blame if something goes wrong, or does the manufacturer take the fault? The answer will differ depending on the degree of automation.

The National Highway Traffic Safety Administration recently released a set of guidelines that self-driving vehicles must adhere to. These rules and regulations differ depending on where a vehicle falls on the automated driving scale, with Level 0 being a standard car where a human is entirely in control and Level 5 indicating a vehicle that is completely driverless, even in extreme conditions.
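For the curious, the scale being referenced is the six-level SAE J3016 taxonomy that NHTSA’s guidance adopts. Here is a minimal sketch of it as a Python enum; the class name is my own, the level names follow SAE’s standard labels, and the comments paraphrase rather than quote the definitions:

```python
from enum import IntEnum

class DrivingAutomationLevel(IntEnum):
    """The six SAE J3016 driving-automation levels referenced by NHTSA."""
    NO_AUTOMATION = 0           # human performs the entire driving task
    DRIVER_ASSISTANCE = 1       # one assist at a time, e.g. cruise control
    PARTIAL_AUTOMATION = 2      # steering and speed together; driver supervises
    CONDITIONAL_AUTOMATION = 3  # car drives itself but may demand a handoff
    HIGH_AUTOMATION = 4         # no driver needed within a limited domain
    FULL_AUTOMATION = 5         # no driver needed anywhere, in any conditions
```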

With minimal automation features like cruise control or lane-keeping assistance, most of our driving experiences stay around a Level 1 or a Level 2. At these levels, the driver is still in charge of the majority of the vehicle’s functions. The new self-driving vehicles being developed, however, are at a Level 3 or a Level 4. Unfortunately, many people already operate their Level 1 or 2 cars as if they were Level 3 or 4; one need only turn to YouTube to find footage of drivers ignoring the road while behind the wheel.

Human drivers aren’t perfect, but neither are the programmers and engineers behind the software. Automating something as complex as driving will never be flawless, so thinking through the ethics and accountability of self-driving cars should keep pace with the technology itself. Are we really ready to put our safety in the hands of companies that struggle to create cell phones, apps and programs that don’t malfunction? With so much hesitation around self-driving vehicles, we may be better off if they never roll out at all. However, we know that won’t be the case.

Many people, though, remain suspicious about how helpful self-driving cars will really be. “The handoff” from the computer to a human driver can be a dangerous few seconds. If you’re traveling down the highway at 55 mph and your car alerts you, how quickly can you take the wheel in an emergency situation?

As you read that last sentence, did you think of a response? In that time, you’ve lost precious seconds, and if the same thing happens while driving, the problem compounds itself. It may seem like no big deal, but remember: at 55 miles an hour, it takes only about 4.6 seconds to travel the length of a football field. In a regular car, your reaction would have been immediate. While you might think this could spell the demise of the autonomous car, it will more likely create a push for fully driverless Level 5 cars.
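That football-field figure is easy to sanity-check. Here’s a quick back-of-the-envelope calculation in Python, assuming a 120-yard field with both end zones included (the cited 4.6 seconds presumably rounds a similar estimate):

```python
# How long does it take to cross a football field at highway speed?
# Assumed figures: 55 mph, 120-yard field (end zones included).
MPH_TO_FPS = 5280 / 3600              # feet per second in one mph
speed_fps = 55 * MPH_TO_FPS           # ~80.7 ft/s at 55 mph
field_ft = 120 * 3                    # 120 yards = 360 feet
print(f"{field_ft / speed_fps:.1f} seconds")  # ~4.5 s, close to the 4.6 s cited
```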

As most of America warms to the idea of letting algorithms take the wheel, there remains a sizeable portion of Americans who have good reason to resist autonomous vehicles. People who drive for a living, from taxis to semi-trucks, are worried about what this could mean for their jobs. With the introduction of self-driving trucks that can sidestep the restrictions placed on human drivers, such as hours-of-service limits, truck drivers fear their jobs may be replaced.

Cab drivers and Uber drivers likewise worry they will find themselves out of a job if autonomous vehicles really begin taking off. Uber is already expanding its self-driving fleet to other cities. It’s only a matter of time before unemployment becomes the unfortunate reality for taxi and truck drivers.

The fact of the matter is, humans aren’t perfect. The irony is that some people expect an autonomous car, programmed by a human, to perform correctly 100 percent of the time. There will always be a margin of error. With the ethical dilemmas and job losses surrounding these vehicles, are they really the road to the future we want to travel?

Megan Ray Nichols is a freelance science writer. She’s a regular contributor to Datafloq and The Energy Collective. Megan also writes weekly on her personal blog, Schooled By Science, where she discusses the latest news in science and technology. Subscribe to her blog for the latest news and follow her on Twitter, @nicholsrmegan, to join the discussion.
