The idea of a superintelligent AI taking over the world and wiping out the human race in the process has long occupied humanity’s imagination. And, up until now, imagination is where the concept has remained confined. Doomsday scenarios have never actually come to pass, and we’re still firmly in control. But that hasn’t stopped people from continuing to sound the alarm about the dangers of AI. Is there any truth to what they’re saying? Are we really close to developing an AI capable of taking over the world?