The implications of Hotz, a “lone wolf” programmer, being able to create an AI powerful enough to drive a car really bring home what Nick Bostrom hypothesizes in Superintelligence. What if a genius like Hotz creates a powerful AI without realizing the need for safeguards? Hotz, w…
Lucas van Lierop
This is a great point, and given the ease of access to weapons in the US, it wouldn’t surprise me if someone were able to create a really dangerous tool at some point in the future.
The same thing has been true in the gun control debate for a long time, though, and I think we need to maintain the mindset that the government will always have enough of a resource advantage over dangerous individuals to keep us safe. Check out Elon Musk’s and Stephen Hawking’s statement on military AI!