
Obama Administration Set to Enact Plan to Help Get Self-Driving Cars off the Ground

Sadly, not in a literal flying car sense.


In case you had any doubt that self-driving cars are a real, near-future technology instead of just a lofty, ridiculous concept, the government will soon step in with some regulations on them. That may not sound exciting, but it’s a good way to give the convenient, potentially extremely safe transportation method a solid path to acceptance and widespread adoption.

Reuters reports that Mark Rosekind, head of the National Highway Traffic Safety Administration, said that Transportation Secretary Anthony Foxx will speak in Detroit tomorrow about what the current administration plans to do to help self-driving cars along. Driverless car makers would much rather work under a single set of federal rules than a patchwork of different regulations across the country, so this comes as good news to them.

Federal rules could also help them with the legal issues that arise when something goes wrong during testing, a big concern for a technology that really needs to be put into practice in the wild to be perfected. Reuters also mentions that a Google spokesperson will be involved in tomorrow’s announcement, and other Detroit car makers are likely to participate.

Meanwhile, a new report from Google shows that its human test drivers have had to take control of “driverless” vehicles 13 times to avoid collisions. In testing from September 2015 to November 2015, test drivers had 341 “disengagements,” or instances where they took manual control—like when you’re learning to drive and your parents super helpfully dive for the wheel, only with robots. However, 272 of those instances were triggered by the software itself detecting possible sensor trouble or other small computer problems and alerting the human driver to take control in case something went wrong.

The remaining 69 disengagements were ones in which the test driver chose to take manual control, whether because of an unsafe situation, erratic or unsafe behavior from other drivers, or some other reason. Out of that number, only 13 would have resulted in contact with another vehicle—or, in two cases, a traffic cone. Google says its cars were to blame in ten of those incidents, with other drivers at fault for the other three.

Google’s cars have also been involved in 11 actual collisions, but they were all the fault of other drivers. The autonomous cars are still crashing at a higher rate than human drivers, according to The Verge, but they improve with every failure, and some day they will hopefully be much safer drivers than us faulty meat sacks.

(image via Google)




Dan Van Winkle


