Singularitarianism
From Wikipedia, the free encyclopedia
Singularitarianism is a moral philosophy based on the belief that a technological singularity — the technological creation of smarter-than-human intelligence — is possible, and that deliberate action should be taken both to bring it about and to ensure its safety. While many futurists and transhumanists speculate on the possibility and nature of this kind of singularity (often referred to simply as the Singularity), Singularitarians believe it is not only possible but desirable if and only if it is guided safely. Accordingly, they may "dedicate their lives" to acting in ways they believe will contribute to its safe implementation.
The term singularitarian was originally defined in 1991 by the Extropian Mark Plus to mean "one who believes the concept of a Singularity". It has since been redefined to mean "Singularity activist" or "friend of the Singularity"; that is, one who acts so as to bring about the Singularity.[1]
Beliefs
In his essay "Singularitarian Principles" (2000), Eliezer Yudkowsky writes that there are four qualities that define a Singularitarian:
- A Singularitarian believes that the Singularity is possible and desirable.
- A Singularitarian actually works to bring about the Singularity.
- A Singularitarian views the Singularity as an entirely secular, non-mystical process — not the culmination of any form of religious prophecy or destiny.
- A Singularitarian believes the Singularity should benefit the entire world, and should not be a means to benefit any specific individual or group.
In July 2000, Eliezer Yudkowsky, Brian Atkins, and Sabine Atkins founded the Singularity Institute for Artificial Intelligence to work towards the creation of self-improving Friendly AI of roughly human-level intelligence. The Singularity Institute's writings argue that an AI of human-level intelligence with the ability to improve upon its own design would rapidly lead to superintelligence. Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.
Many believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although the exact numbers are hard to quantify, Singularitarianism is presently a small movement. Other prominent Singularitarians include Ray Kurzweil and Nick Bostrom.
See also
- Seed AI — a theory closely associated with Singularitarianism
- Simulated reality — the hypothesis that reality could be a technologically based simulation
- Extropianism
External links
- Why Work Towards the Singularity? by Eliezer Yudkowsky
- Ethical Issues in Advanced Artificial Intelligence by Nick Bostrom

