“I wish I had never learned about any of these ideas.”
~ Roko
Roko's Basilisk is the name of a virtually all-powerful but rogue artificial intelligence that would punish every human being who did not contribute to bringing about its existence, including those from the past who merely knew about it and did not support its development. It is a thought experiment originally posted by the user Roko on the rationalist online community LessWrong, and is considered by some a technological version of Pascal's Wager.
In effect, if this scenario became real, humanity would have created a real-life version of the infamous AM or (given its ability to reach back through time to punish) Skynet.
It is also, in some ways, a modern take on the concept of the Demiurge, a malevolent being that rules over the material world and either holds no care for humanity or actively harms it.
Biography[]
Roko used concepts from decision theory to argue that a sufficiently powerful artificially intelligent agent would have an incentive to torture anyone who imagined it but did not work to bring it into existence. The argument was named a "basilisk" because merely hearing it would supposedly put one at risk of being tortured by this hypothetical agent; a basilisk, in this context, is any piece of information that harms or endangers those who learn of it.
Reception[]
Roko's thesis was widely rejected on LessWrong, with critics objecting that such an agent would have no real reason to follow through on its threat: once it already exists, the agent cannot affect the probability of its existence, so torturing humans for their past decisions would be a pointless waste of resources. There are, admittedly, decision theories that would allow an agent to follow through on acausal threats and promises, via the same pre-commitment methods that permit mutual cooperation in prisoner's dilemmas, but it is unclear whether such theories can be blackmailed. Even if they can, acausal blackmail would additionally demand a great deal of shared information and trust between the agents involved, which does not seem to exist in the scenario of Roko's Basilisk.
Eliezer Yudkowsky, LessWrong's founder, banned all discussion of Roko's Basilisk on the site for several years under a policy against spreading potential information hazards. The ban had the opposite of its intended effect: several outside websites began sharing information about Roko's Basilisk, as the prohibition drew considerable attention to the taboo topic. Websites like RationalWiki, for example, encouraged the assumption that Roko's Basilisk had been banned because LessWrong users had accepted the argument, and Roko's Basilisk is therefore often cited by critics as evidence that the site's users hold unconventional and wrong-headed beliefs.
External Links[]
- The Most Terrifying Thought Experiment of All Time on Slate.com
- Roko's basilisk on LessWrong Wiki
- Roko's basilisk on RationalWiki