In dystopian science fiction, we are taught to fear the technological singularity – the moment when artificial superintelligence advances far beyond human intelligence, profoundly altering human existence. Vivid imagery of automated weapons of doom working to wipe out human civilization and take over the world – or the galaxy – has terrorized generations of sci-fi fans. Stephen Hawking, Elon Musk, and Bill Gates have warned of its approach. Ray Kurzweil says it will be upon us by 2045, and, as far back as 1942, Isaac Asimov was contriving rules for robots, […]
Full Post at www.eejournal.com