The Lowdown On Self-Driving Cars

You know that self-driving cars are coming. You keep hearing a quiet buzz in the background, but what exactly is going on? How do they work? Are they safe?

Let's take a look at the landscape of self-driving cars so we know what's already happening and what to expect within the next few years.

How Do Self-Driving Cars Work?

The concept of a car driving itself around the road can feel a little scary, right? Just how do they know where the other cars are on the road? How do they know when to turn? Excellent questions, really, so let's take a look:

A self-driving car is able to sense its surroundings and navigate accordingly. In most cases, that means GPS, an inertial navigation system, and more sensors than you can shake a stick at. The sensors take in an enormous amount of information (more than you take in with your own senses), and the car combines that with the GPS and inertial readings to build a 3D map of the environment.

From there, it relies on its programming to interpret that 3D map and determine the best course to get you to your destination, taking into account pedestrians, other cars, roadblocks, and everything else a driver would normally factor in.

In short, a self-driving car functions much the same way a car with a human driver does. The driver uses his or her senses to map the environment and his or her brain to interpret that map and decide where to go. A self-driving car simply has "eyes" in every direction and a computer for a brain.
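
If you're curious what that sense-map-decide loop looks like in software terms, here is a deliberately tiny sketch in Python. None of this is real autonomous-vehicle code; every class and function name is invented for illustration. It simply mirrors the three steps described above: gather sensor data, build a map, and pick an action.

```python
# A toy version of the sense -> map -> decide loop described above.
# Every name here is invented for illustration; real autonomy stacks
# are vastly more complex and run this loop many times per second.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Obstacle:
    position: Tuple[float, float]   # x, y in meters, relative to the car
    velocity: Tuple[float, float]   # estimated motion of the obstacle


@dataclass
class WorldMap:
    car_position: Tuple[float, float]
    obstacles: List[Obstacle]


def build_world_map(gps_fix, imu_reading, sensor_returns) -> WorldMap:
    """Fuse GPS, inertial, and sensor data into a map of the surroundings
    (flattened to 2D here to keep the example short)."""
    obstacles = [Obstacle(position=p, velocity=v) for p, v in sensor_returns]
    return WorldMap(car_position=gps_fix, obstacles=obstacles)


def plan_next_move(world: WorldMap, destination: Tuple[float, float]) -> str:
    """Pick a simple action: brake if something is close ahead in our lane,
    otherwise keep steering toward the destination."""
    for obstacle in world.obstacles:
        dx = obstacle.position[0] - world.car_position[0]
        dy = obstacle.position[1] - world.car_position[1]
        if 0 < dx < 10 and abs(dy) < 2:   # within 10 m ahead, roughly in lane
            return "brake"
    return "steer_toward_destination"


# One tick of the loop: sense, map, decide.
world = build_world_map(
    gps_fix=(0.0, 0.0),
    imu_reading=None,                           # unused in this toy example
    sensor_returns=[((8.0, 0.5), (0.0, 0.0))],  # a stopped object 8 m ahead
)
print(plan_next_move(world, destination=(100.0, 0.0)))   # prints "brake"
```

A production system runs a far richer version of that loop many times per second, weighing pedestrians, traffic laws, road conditions, and everything else a human driver would.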

Who Are The Key Players?

In a sense, everyone. Countless car manufacturers are bullish on self-driving cars, and until fully autonomous vehicles are mass-produced and on the road, any of them could be the first to get there. With that said, here are the players that appear to have the best shot at beating the others to the finish line:

●      Tesla: Elon Musk's company has already introduced automated elements to its newer models and has been open about its goal of making fully autonomous cars in the future.

●      Google: The search engine giant has been testing self-driving cars more than any other company, completing more than 1.8 million miles on the road.

●      Uber: The taxi disruptor has also invested millions of dollars to bring self-driving cars to market.

Are They Dangerous?

This is the question that's on everyone's mind. Right now, there's a knee-jerk reaction that says, "Whoa, that feels too futuristic to be safe." A recent Kelley Blue Book survey of American drivers revealed that:

●      64 percent said they wanted to be in "full control" of their vehicles.

●      When it came to self-driving cars, 51 percent said their biggest concern was personal control; 49 percent said it was safety.

●      80 percent wanted an option to take manual control of the car.

Those results paint a picture of concern. But what about reality?

Google has been testing its self-driving cars on the road for a while now. Spokeswoman Jacquelyn Miller framed it this way: "We just got rear-ended again yesterday while stopped at a stoplight in Mountain View. That's two incidents just in the last week where a driver rear-ended us while we were completely stopped at a light! So that brings the tally to 13 minor fender-benders in more than 1.8 million miles of autonomous and manual driving — and still, not once was the self-driving car the cause of the accident."

Let's work out the numbers. The average person drives about 15,000 miles per year and files, on average, a collision claim every 17.9 years. That's one collision, your fault or not, every 268,500 miles. Google's numbers work out to 138,461 miles per collision. On the surface, that makes Google's cars sound less safe, but it's not really a perfect comparison. People don't file claims for every fender bender, so unreported incidents are missing from the human figure, and it's important to note that none of Google's collisions was the self-driving car's fault.
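
If you want to check the math yourself, here is the arithmetic behind those two figures, using only the numbers already quoted above:

```python
# The arithmetic behind the miles-per-collision comparison above.

miles_per_year = 15_000          # average annual mileage
years_per_claim = 17.9           # average years between collision claims
human_miles_per_collision = miles_per_year * years_per_claim
print(human_miles_per_collision)            # 268,500 miles per claim

google_miles = 1_800_000         # autonomous and manual test miles
google_collisions = 13           # minor fender-benders reported
print(google_miles / google_collisions)     # about 138,461 miles per collision
```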

There's also a different vulnerability: hackers. Chinese researchers successfully hacked a Tesla Model S and were able to interfere with the car's locks, brakes, and other electronics. Tesla, which is making a huge push into automated driving, quickly released a security update.

Beyond that, though, Tesla came under fire when Joshua Brown was killed in a crash while his Tesla was in Autopilot mode. The Model S's Autopilot sensors failed to distinguish a white semi truck from a bright sky. Even with all the data available, it's difficult to draw conclusions. How many people die in crashes because they fail to see something against a bright sky? Is it more or less frequent than what happened with the Model S?

Until we have more data and more widespread testing, answers will be tough to come by. Self-driving cars are a fascinating development. The prospect of using commute time more productively, and of computers working together to keep us safe, is alluring. We aren't there yet, but with continued investment, development, and acceptance, we will be.