In recent weeks there have been two car crashes in which Tesla's Autopilot feature may have been involved. The National Transportation Safety Board and the National Highway Traffic Safety Administration are investigating both incidents.

Would you trust a self-driving car?

Objectively, a self-driving car is safer than a human driver: nearly 90% of all accidents are caused by, or at least involve, human error. And in an ideal world, self-driving cars would be safer still, because they don't get distracted, fatigued, or drunk. But none of this matters if people don't trust them.

Doubters say the crashes reveal blind spots in Tesla's Autopilot, and in self-driving cars in general. Mobileye, the company Tesla partners with for Autopilot, stated that its braking technology was not designed for the scenario that led to the fatal accident. Incidents like these are eroding some of the public's faith in self-driving cars.

Would you #TrustAutopilot or do you #DontTrustAutopilot? It's about reducing human error, not being perfect. For some it's just a personal thing.