So you've probably heard by now that an autonomous Uber killed a woman in Arizona. There was a human operator behind the wheel, but the car was in autonomous mode.
Of course we don't yet know what really happened (EDIT: a preliminary police statement indicates it was likely not the fault of the self-driving car | UPDATE: the video is pretty damning, and suggests a huge problem for Uber).
The promise of autonomous mobility is fewer deaths. I'm also excited about self-driving cars for other reasons. I love the vision of living further outside of cities, cities less full of cars, with fewer traffic jams and less time wasted commuting.
But what I don't understand, on the road to self-driving (sorry), is why we're letting proprietary tech take over such an important part of our lives yet again. Or, as a comment on Hacker News put it, "Why can a private, for-profit company 'test' their systems on the public roads?"
Socialize the losses, privatize the profits? That doesn't seem right. And so, as with other software that has such enormous potential, as long as it's using public infrastructure we should probably insist that it be free software (as in freedom, not beer). You should be free to inspect, to modify, and to share both the training algorithms and the machine learning models that result, at least as long as you're purchasing or leasing the vehicle. Government licensing bodies, above all, should have the same freedoms.
Otherwise we have no idea what that software is meant to do. Which is crazy if you think about it.