Can we trust self-driving cars? | The Tylt
After an Arizona pedestrian was struck and killed by one of Uber's self-driving cars, the company halted all autonomous vehicle tests. Deloitte reports that three in four Americans don't trust self-driving cars—they'd rather have a human in control than software. Self-driving car enthusiasts, however, are quick to point out that most accidents stem from human error and that, in the long run, self-driving cars will save lives. Would you trust a self-driving car? 🚦 🚗

Many people are completely unsurprised that self-driving cars have resulted in a fatality.
Wait, who the hell thought self-driving #Uber cars were a good idea? https://t.co/8dGuMFSaTm
— Edan Clay 🌊 (@EdanClay) March 19, 2018
A self-driving Uber killed a woman in Arizona. Not only should we consider the safety of self-driving cars, but also the cost in lost jobs as well.
— David Edward Burke (@DavidEBurke) March 19, 2018
Just because a technology is new doesn't mean it's good for us as a whole.
Statistics show human error is far more dangerous than any self-driving car.
Some ninety percent of motor vehicle crashes are caused at least in part by human error.
And while the idea of a self-driving car is a little freaky, the numbers don't lie: humans are the most dangerous element on the roads.
You are apparently speaking from a position of fear and ignorance, not knowledge. Statistically speaking, they are safer than human-operated cars... and if you want to ban them, you should logically be calling for a ban on all motor vehicles. https://t.co/NtGilSihkg
— Reuben Woodruff (@reubenwoodruff) March 19, 2018