Singularitarianism

Singularitarianism is a moral philosophy based upon the belief that a technological singularity, the technological creation of smarter-than-human intelligence, is possible; it advocates deliberate action to bring the singularity about and to ensure its safety. While many futurists and transhumanists speculate on the possibility and nature of this type of singularity (often referred to as the Singularity), Singularitarians believe it is not only possible, but also desirable if guided safely. Accordingly, they may sometimes "dedicate their lives" to acting in ways they believe will contribute to its safe implementation.

Although the term singularitarian was originally defined in 1991 by the Extropian Mark Plus to mean "one who believes the concept of a Singularity", it has since been redefined to mean "Singularity activist" or "friend of the Singularity"; that is, one who acts so as to bring about the Singularity.[1]

Beliefs

In his essay "Singularitarian Principles" (2000), Eliezer Yudkowsky writes that there are four qualities that define a Singularitarian:

  • A Singularitarian believes that the Singularity is possible and desirable.
  • A Singularitarian actually works to bring about the Singularity.
  • A Singularitarian views the Singularity as an entirely secular, non-mystical process — not the culmination of any form of religious prophecy or destiny.
  • A Singularitarian believes the Singularity should benefit the entire world, and should not be a means to benefit any specific individual or group.

In July 2000, Eliezer Yudkowsky, Brian Atkins, and Sabine Atkins founded the Singularity Institute for Artificial Intelligence to work towards the creation of self-improving Friendly AI of human-similar intelligence. The Singularity Institute's writings argue that an AI of human-similar intelligence with the ability to improve upon its own design would rapidly lead to superintelligence. Singularitarians believe that reaching the Singularity swiftly and safely is the best way to minimize net existential risk.

Many believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although exact numbers are hard to quantify, Singularitarianism remains a small movement. Other prominent Singularitarians include Ray Kurzweil and Nick Bostrom.
