Category Archives: Singularitarianism

Singularitarianism | Prometheism.net | Futurist Transhuman …

Posted: January 14, 2017 at 8:56 pm

Ray Kurzweil is a genius. One of the greatest hucksters of the age. That's the only way I can explain how his nonsense gets so much press and has such a following. Now he has the cover of Time magazine, and an article called "2045: The Year Man Becomes Immortal." It certainly couldn't be taken seriously anywhere else; once again, Kurzweil wiggles his fingers and mumbles a few catchphrases and upchucks a remarkable prediction: that in 35 years (a number dredged out of his compendium of biased estimates), Man (one, a few, many? How? He doesn't know) will finally achieve immortality (seems to me you'd need to wait a few years beyond that goal to know if it was true). Now we've even got a name for the Kurzweil delusion: Singularitarianism.

There's room inside Singularitarianism for considerable diversity of opinion about what the Singularity means and when and how it will or won't happen. But Singularitarians share a worldview. They think in terms of deep time, they believe in the power of technology to shape history, they have little interest in the conventional wisdom about anything, and they cannot believe you're walking around living your life and watching TV as if the artificial-intelligence revolution were not about to erupt and change absolutely everything. They have no fear of sounding ridiculous; your ordinary citizen's distaste for apparently absurd ideas is just an example of irrational bias, and Singularitarians have no truck with irrationality. When you enter their mind-space you pass through an extreme gradient in worldview, a hard ontological shear that separates Singularitarians from the common run of humanity. Expect turbulence.

Wow. Sounds just like the Raelians, or Hercolubians, or Scientologists, or any of the modern New Age pseudosciences that appropriate a bit of jargon and blow it up into a huge mythology. Nice hyperbole there, though. Too bad the whole movement is empty of evidence.

One of the things I really do despise about the Kurzweil approach is its dishonest management of critics, and Kurzweil is the master. He loves to tell everyone what's wrong with his critics, but he doesn't actually address the criticisms.

Take the question of whether computers can replicate the biochemical complexity of an organic brain. Kurzweil yields no ground there whatsoever. He does not see any fundamental difference between flesh and silicon that would prevent the latter from thinking. He defies biologists to come up with a neurological mechanism that could not be modeled or at least matched in power and flexibility by software running on a computer. He refuses to fall on his knees before the mystery of the human brain. "Generally speaking," he says, "the core of a disagreement I'll have with a critic is, they'll say, 'Oh, Kurzweil is underestimating the complexity of reverse-engineering of the human brain or the complexity of biology.' But I don't believe I'm underestimating the challenge. I think they're underestimating the power of exponential growth."

This is wrong. For instance, I think reverse-engineering the general principles of a human brain might well be doable in a few or several decades, and I do suspect that we'll be able to do things in ten years, 20 years, a century that I can't even imagine. I don't find Kurzweil silly because I'm blind to the power of exponential growth, but because:

Kurzweil hasn't demonstrated that there is exponential growth at play here. I've read his absurd book, and his data is phony and fudged to fit his conclusion. He cheerfully makes stuff up or drops data that goes against his desires to invent these ridiculous charts.

I'm not claiming he underestimates the complexity of the brain; I'm saying he doesn't understand biology, period. Handwaving is not enough: if he's going to make fairly specific claims of immortality in 35 years, there had better be some understanding of the path that will be taken.

There is a vast difference between grasping a principle and implementing the specifics. Even if we understand how the brain works, even if we can create a computer simulation that replicates and improves upon the function of our brain, that does not in any way imply that my identity and experiences can be translated into the digital realm. Again, Kurzweil doesn't have even a hint of a path that can be taken to do that, so he has no basis for making the prediction.

Smooth curves that climb upward into infinity can exist in mathematics (although Kurzweil's predictions don't live in a state of rigor that would justify calling them mathematical), but they don't work in the real world. There are limits. We've been building better and more powerful power plants for aircraft for a century, but they haven't gotten to a size and efficiency that would let me fly off with a personal jetpack. I have no reason to expect that they will, either.
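The point about limits can be made concrete with a toy comparison (a sketch of my own, not taken from the post or from Kurzweil): a logistic curve, which has a hard ceiling, is nearly indistinguishable from a pure exponential in its early stages, so extrapolating a smooth upward trend from early data tells you nothing about whether the curve keeps climbing or saturates.

```python
import math

def exponential(t, r=0.5):
    """Pure exponential growth: climbs without bound."""
    return math.exp(r * t)

def logistic(t, r=0.5, k=100.0):
    """Logistic growth: starts out looking exponential, then
    saturates at the carrying capacity k (a real-world limit)."""
    return k / (1.0 + (k - 1.0) * math.exp(-r * t))

# Early on, the two curves are nearly indistinguishable
# (within 10% of each other for the first few time steps):
for t in range(4):
    assert abs(exponential(t) - logistic(t)) / exponential(t) < 0.1

# Much later, the exponential has exploded while the logistic
# curve is pinned just below its ceiling of 100:
assert exponential(30) > 1_000_000
assert logistic(30) < 100.0
```

Fitting either curve to the early data points would look equally good; only the limiting mechanism, which the data alone doesn't reveal, decides what happens next.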

While I don't doubt that science will advance rapidly, I also expect that the directions it takes will be unpredictable. Kurzweil confuses engineering, where you build something to fit a predetermined set of specifications, with science, in which you follow the evidence wherever it leads. Look at the so-called war on cancer: it isn't won, and no one expects that it will be, but what it has accomplished is limited success in improving health and quality of life, extending survival times, and developing new tools for earlier diagnosis. That's reality, and understanding reality is achieved incrementally, not by sudden surges in technology independent of human effort. It also generates unexpected spinoffs in deeper knowledge about cell cycles, signaling, gene regulation, etc. The problems get more interesting and diverse, and it's awfully silly of one non-biologist in 2011 to try to predict what surprises will pop out.

Kurzweil is a typical technocrat with limited breadth of knowledge. Imagine what happens IF we actually converge on some kind of immortality. Who gets it? If it's restricted, what makes Kurzweil think he, and not Senator Dumbbum who controls federal spending on health, or Tycoon Greedo the trillionaire, gets it? How would the world react if such a capability were available and they (or their dying mother, or their sick child) didn't have access? What if it's cheap and easy, and everyone gets it? Kurzweil is talking about a technology that would almost certainly destroy every human society on the planet, and he treats it as blithely as the prospect of getting new options for his cell phone. In case he hadn't noticed, human sociology and politics show no sign of being on an exponential trend towards greater wisdom. Yeah, expect turbulence.

He's guilty of a very weird form of reductionism that considers that a human life can be reduced to patterns in a computer. I have no stock in spiritualism or dualism, but we are very much a product of our crude and messy biology: we perceive the world through imprecise chemical reactions, our brains send signals by shuffling ions in salt water, our attitudes and reactions are shaped by chemicals secreted by glands in our guts. Replicating the lightning while ignoring the clouds and rain and pressure changes will not give you a copy of the storm. It will give you something different, which would still be interesting, but it's not the same.

Kurzweil shows other signs of kookery. Two hundred pills a day? Weekly intravenous transfusions? Drinking alkalized water because he's afraid of acidosis? The man is an intelligent engineer, but he's also an obsessive crackpot.

Oh, well. I'll make my own predictions. Magazines will continue to praise Kurzweil's techno-religion in sporadic bursts, and followers will continue to gullibly accept what he says because it is what they wish would happen. Kurzweil will die while brain-uploading and immortality are still vague dreams; he will be frozen in liquid nitrogen, which will so thoroughly disrupt his cells that even if we discover how to cure whatever kills him, there will be no hope of recovering the mind and personality of Kurzweil from the scrambled chaos of his dead brain. 2045 will come, and those of us who are alive to see it will look back and realize it is very, very different from what life was like in 2011, and also very different from what we expected life to be like. At some point, I expect artificial intelligences to be part of our culture, if we persist; they'll work in radically different ways than human brains, and they will revolutionize society, but I have no way of guessing how. Ray Kurzweil will be forgotten, mostly, but records of the existence of a strange shaman of the circuitry from the late 20th and early 21st century will be tucked away in whatever the future databases are like, and people and machines will sometimes stumble across them and laugh or zotigrate and say, "How quaint and amusing!", or whatever the equivalent in the frangitwidian language of the trans-entity circumsolar ansible network might be.

And that'll be kinda cool. I wish I could live to see it.

Go here to read the rest: Singularitarianism? Pharyngula


Posted in Singularitarianism | Comments Off on Singularitarianism | Prometheism.net | Futurist Transhuman …

Singularitarianism Wikipedia – euvolution.com

Posted: December 14, 2016 at 11:54 pm

Singularitarianism is a movement[1] defined by the belief that a technological singularity (the creation of superintelligence) will likely happen in the medium future, and that deliberate action ought to be taken to ensure that the Singularity benefits humans.

Singularitarians are distinguished from other futurists who speculate on a technological singularity by their belief that the Singularity is not only possible, but desirable if guided prudently. Accordingly, they might sometimes dedicate their lives to acting in ways they believe will contribute to its rapid yet safe realization.[2]

Time magazine describes the worldview of Singularitarians by saying that "they think in terms of deep time, they believe in the power of technology to shape history, they have little interest in the conventional wisdom about anything, and they cannot believe you're walking around living your life and watching TV as if the artificial-intelligence revolution were not about to erupt and change absolutely everything."[1]

Inventor and futurist Ray Kurzweil, author of the 2005 book The Singularity Is Near: When Humans Transcend Biology, defines a Singularitarian as "someone who understands the Singularity and who has reflected on its implications for his or her own life"; he estimates the Singularity will occur around 2045.[2]

Singularitarianism coalesced into a coherent ideology in 2000 when artificial intelligence (AI) researcher Eliezer Yudkowsky wrote The Singularitarian Principles,[2][3] in which he stated that a Singularitarian believes that the singularity is a secular, non-mystical event which is possible and beneficial to the world and is worked towards by its adherents.[3]

In June 2000 Yudkowsky, with the support of Internet entrepreneurs Brian Atkins and Sabine Atkins, founded the Machine Intelligence Research Institute to work towards the creation of self-improving Friendly AI. MIRI's writings argue for the idea that an AI with the ability to improve upon its own design (Seed AI) would rapidly lead to superintelligence. These Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.

Many people believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although the exact numbers are hard to quantify, Singularitarianism is a small movement, which includes transhumanist philosopher Nick Bostrom. Inventor and futurist Ray Kurzweil, who predicts that the Singularity will occur circa 2045, greatly contributed to popularizing Singularitarianism with his 2005 book The Singularity Is Near: When Humans Transcend Biology.[2]

What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts that we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future. To truly understand it inherently changes one's view of life in general and one's particular life. I regard someone who understands the Singularity and who has reflected on its implications for his or her own life as a singularitarian.[2]

With the support of NASA, Google and a broad range of technology forecasters and technocapitalists, the Singularity University opened in June 2009 at the NASA Research Park in Silicon Valley with the goal of preparing the next generation of leaders to address the challenges of accelerating change.

In July 2009, many prominent Singularitarians participated in a conference organized by the Association for the Advancement of Artificial Intelligence (AAAI) to discuss the potential impact of robots and computers, and the hypothetical possibility that they could become self-sufficient and able to make their own decisions. They discussed the possibility and the extent to which computers and robots might be able to acquire any level of autonomy, and to what degree they could use such abilities to pose any threat or hazard (i.e., cybernetic revolt). They noted that some machines have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They warned that some computer viruses can evade elimination and have achieved "cockroach intelligence." They asserted that self-awareness as depicted in science fiction is probably unlikely, but that there were other potential hazards and pitfalls.[4] Some experts and academics have questioned the use of robots for military combat, especially when such robots are given some degree of autonomous functions.[5] The President of the AAAI has commissioned a study to look at this issue.[6]

Science journalist John Horgan has likened singularitarianism to a religion:

Let's face it. The singularity is a religious rather than a scientific vision. The science-fiction writer Ken MacLeod has dubbed it "the rapture for nerds," an allusion to the end-time, when Jesus whisks the faithful to heaven and leaves us sinners behind. Such yearning for transcendence, whether spiritual or technological, is all too understandable. Both as individuals and as a species, we face deadly serious problems, including terrorism, nuclear proliferation, overpopulation, poverty, famine, environmental degradation, climate change, resource depletion, and AIDS. Engineers and scientists should be helping us face the world's problems and find solutions to them, rather than indulging in escapist, pseudoscientific fantasies like the singularity.[7]

Kurzweil rejects this categorization, stating that his predictions about the singularity are driven by data showing that increases in computational technology have been exponential in the past.[8]

See original here: Singularitarianism Wikipedia



Singularitarianism – Lesswrongwiki

Posted: September 20, 2016 at 7:10 pm


Singularitarianism refers to attitudes or beliefs favoring a technological singularity.

The term was coined by Mark Plus, then given a more specific meaning by Eliezer Yudkowsky in his Singularitarian principles. "Singularitarianism", early on, referred to a principled activist stance aimed at creating a singularity for the benefit of humanity as a whole, and in particular to the movement surrounding the Machine Intelligence Research Institute.

The term has since sometimes been used differently, without it implying the specific principles listed by Yudkowsky. For example, Ray Kurzweil’s book “The Singularity Is Near” contains a chapter titled “Ich bin ein Singularitarian”, in which Kurzweil describes his own vision for technology improving the world. Others have used the term to refer to people with an impact on the Singularity and to “expanding one’s mental faculties by merging with technology”. Others have used “Singularitarian” to refer to anyone who predicts a technological singularity will happen.

Yudkowsky has (perhaps facetiously) suggested that those adhering to the original activist stance relabel themselves the “Elder Singularitarians”.

Visit link: Singularitarianism – Lesswrongwiki


Singularitarianism /r/Singularitarianism – reddit

Posted: July 29, 2016 at 3:10 am

Welcome to /r/Singularitarianism

A subreddit devoted to the social, political, and technological movement defined by the belief that deliberate action ought to be taken to ensure that an Intelligence Explosion benefits human civilization.

The theory of Singularitarianism is that our human species is an infant waiting to be born, an infant unaware of an outside world beyond the womb. The hope, purpose, and meaning in the creation of greater-than-human intelligence is our will to be born. The birth of humanity, the birth of the infant, is the evolution of the intelligence of our man-and-machine civilization.

Singularitarianism is a non-religious, decentralized futurist and transhumanist movement. Singularitarianism is faith in scientific skepticism and admiration for the biological phenomenon of human intelligence. From this biological intelligence comes the awe, responsibility, and capability of creating non-biological machine intelligence.

The Singularity places a horizon across humanity's understanding because we are still discovering the scientific nature of our own intelligence. Not until we understand and improve upon the biological heritage of our intelligence can we begin to understand the meaning of superintelligence. Ultimately, this reverence for universal forms of intelligence and sentience is our safeguard against mysticism, fanaticism, and ideology. Understanding and improving intelligence is simultaneously our greatest imperative and our guiding principle. This movement does not believe in God, but holds that man is a bridge and not an end, and that we can instead become the gods ourselves. The human future(s) are infinite.

See the rest here: Singularitarianism /r/Singularitarianism – reddit


Talk:Singularitarianism – Wikipedia, the free encyclopedia

Posted: June 27, 2016 at 6:24 am

Green Anarchist

The green anarchist line is identical in the lede and in the body. I’ve removed it from the article body but not the lede. While the lede should reference the content of the article, it should not be a verbatim copy. IRWolfie- (talk) 22:00, 25 April 2011 (UTC)

These inclusions still require third party sources to establish they are not a fringe view. IRWolfie- (talk) 13:19, 11 May 2011 (UTC)

The paragraph beginning “In July 2009, academics and technical experts, some of whom were Singularitarians …” appears a bit off topic, or at least a bit too much info on it not related to this Singularitarianism movement. Does anyone else agree? IRWolfie- (talk) 09:49, 12 May 2011 (UTC)

Wikilinking to new religious movement is inappropriate. Loremaster, do not revert my edits without some form of comment please. IRWolfie- (talk) 21:37, 12 May 2011 (UTC)

I have problems with this section of the lead –

"Desirability" is just one kind of Singularitarianism. A better definition is that a Singularitarian is a person who strongly believes in the likelihood of a technological singularity in the medium-term future, and that this raises issues and attitudes which often arise in theology and extreme forms of existentialism. The belief in near-term inevitability and its religious and existential aspects are what define the Singularitarian, who may not find it desirable, or who might want to guide it, but does not have faith in an ability to do so. There are also many other Singularitarian perspectives. Does anyone have any sources to correct the lead? PPdd (talk) 01:18, 14 March 2012 (UTC)

Other pages covering similar topics have had this same confusion between biological and technological singularity. The reference to “The Singularity is Near”, by Raymond Kurzweil seems out of place to me, since his book seems to cover biological singularity, while this article would seem to more be referencing technological singularity Dreamstohack (talk) 18:37, 4 February 2013 (UTC)

If we're going to include the claim in the lede that Singularitarianism is a religion, then we should also clearly state in that same paragraph that Singularitarians themselves do not agree with that claim, otherwise we are violating the "neutral point of view" rule. Either both points of view should be in that paragraph, or that paragraph should be removed completely and that comment left only in the "criticism" section. (Yosarian2 (talk) 17:00, 1 December 2013 (UTC))

Original post: Talk:Singularitarianism – Wikipedia, the free encyclopedia



Singularitarianism | Prometheism.net

Posted: March 26, 2016 at 8:44 am

Singularitarianism is a movement[1] defined by the belief that a technological singularitythe creation of superintelligencewill likely happen in the medium future, and that deliberate action ought to be taken to ensure that the Singularity benefits humans.

Singularitarians are distinguished from other futurists who speculate on a technological singularity by their belief that the Singularity is not only possible, but desirable if guided prudently. Accordingly, they might sometimes dedicate their lives to acting in ways they believe will contribute to its rapid yet safe realization.[2]

Time magazine describes the worldview of Singularitarians by saying that they think in terms of deep time, they believe in the power of technology to shape history, they have little interest in the conventional wisdom about anything, and they cannot believe youre walking around living your life and watching TV as if the artificial-intelligence revolution were not about to erupt and change absolutely everything.[1]

Inventor and futurist Ray Kurzweil, author of the 2005 book The Singularity Is Near: When Humans Transcend Biology, defines a Singularitarian as someone "who understands the Singularity and who has reflected on its implications for his or her own life"; he estimates the Singularity will occur around 2045.[2]

Singularitarianism coalesced into a coherent ideology in 2000 when artificial intelligence (AI) researcher Eliezer Yudkowsky wrote The Singularitarian Principles,[2][3] in which he stated that a Singularitarian believes that the singularity is a secular, non-mystical event which is possible and beneficial to the world and is worked towards by its adherents.[3]

In June 2000 Yudkowsky, with the support of Internet entrepreneurs Brian Atkins and Sabine Atkins, founded the Machine Intelligence Research Institute to work towards the creation of self-improving Friendly AI. MIRI's writings argue for the idea that an AI with the ability to improve upon its own design (Seed AI) would rapidly lead to superintelligence. These Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.
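The intuition behind the Seed AI argument is that improvement compounds: a system that gets better also gets better at getting better. The toy sketch below is purely illustrative (the starting rate and step count are arbitrary assumptions, and no few-line model captures software redesigning its own architecture); it only shows how letting the improvement rate improve itself produces super-exponential rather than merely exponential growth.

```python
# Toy model of the "recursive self-improvement" intuition behind Seed AI.
# Schematic only: the numbers are arbitrary, not drawn from any source.

def recursive_self_improvement(capability: float,
                               rate: float,
                               steps: int) -> float:
    """Each generation, capability grows by `rate`, and the improved
    system also improves its own improvement rate by the same factor."""
    for _ in range(steps):
        capability *= (1 + rate)
        rate *= (1 + rate)  # the improver improves the improver
    return capability

# With a 5% starting rate, 20 generations of self-improvement outgrow
# plain 5%-per-step exponential growth, because the rate itself compounds.
print(recursive_self_improvement(1.0, 0.05, 20))
print(1.05 ** 20)  # ordinary exponential growth, for comparison
```

The design point is the single extra line `rate *= (1 + rate)`: remove it and the loop is ordinary compound interest; keep it and growth accelerates without bound, which is the shape of the claim Singularitarians make.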

Many people believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although the exact numbers are hard to quantify, Singularitarianism is a small movement, which includes transhumanist philosopher Nick Bostrom. Inventor and futurist Ray Kurzweil, who predicts that the Singularity will occur circa 2045, greatly contributed to popularizing Singularitarianism with his 2005 book The Singularity Is Near: When Humans Transcend Biology.[2]

What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future. To truly understand it inherently changes one's view of life in general and one's particular life. I regard someone who understands the Singularity and who has reflected on its implications for his or her own life as a singularitarian.[2]

With the support of NASA, Google and a broad range of technology forecasters and technocapitalists, the Singularity University opened in June 2009 at the NASA Research Park in Silicon Valley with the goal of preparing the next generation of leaders to address the challenges of accelerating change.

In July 2009, many prominent Singularitarians participated in a conference organized by the Association for the Advancement of Artificial Intelligence (AAAI) to discuss the potential impact of robots and computers, and the hypothetical possibility that they could become self-sufficient and able to make their own decisions. They discussed the extent to which computers and robots might acquire autonomy, and to what degree they could use such abilities to pose a threat or hazard (i.e., cybernetic revolt). They noted that some machines have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They warned that some computer viruses can evade elimination and have achieved "cockroach intelligence." They asserted that self-awareness as depicted in science fiction is probably unlikely, but that there were other potential hazards and pitfalls.[4] Some experts and academics have questioned the use of robots for military combat, especially when such robots are given some degree of autonomous function.[5] The President of the AAAI has commissioned a study to look at this issue.[6]

Science journalist John Horgan has likened singularitarianism to a religion:

Let's face it. The singularity is a religious rather than a scientific vision. The science-fiction writer Ken MacLeod has dubbed it "the rapture for nerds," an allusion to the end-time, when Jesus whisks the faithful to heaven and leaves us sinners behind. Such yearning for transcendence, whether spiritual or technological, is all too understandable. Both as individuals and as a species, we face deadly serious problems, including terrorism, nuclear proliferation, overpopulation, poverty, famine, environmental degradation, climate change, resource depletion, and AIDS. Engineers and scientists should be helping us face the world's problems and find solutions to them, rather than indulging in escapist, pseudoscientific fantasies like the singularity.[7]

Kurzweil rejects this categorization, arguing that his predictions about the singularity are driven by data showing that increases in computational capacity have been exponential in the past.[8]
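Kurzweil's rejoinder is, at bottom, an exponential extrapolation: fit a historical doubling period and project it forward. As a purely arithmetical illustration (the baseline year, starting value, and two-year doubling period below are assumed round numbers, not Kurzweil's actual data), the projection looks like this:

```python
# Illustrative exponential extrapolation of the kind underlying
# Kurzweil's argument. Baseline and doubling period are hypothetical.

def extrapolate(base_year: int, base_value: float,
                doubling_years: float, target_year: int) -> float:
    """Project a quantity forward assuming it doubles every
    `doubling_years` years (pure exponential growth)."""
    doublings = (target_year - base_year) / doubling_years
    return base_value * 2 ** doublings

# A quantity that doubles every 2 years undergoes 17 doublings between
# 2011 and 2045, i.e. a 2**17 = 131,072-fold increase.
print(f"{extrapolate(2011, 1.0, 2.0, 2045):,.0f}x")  # 131,072x
```

The dispute between Kurzweil and his critics is not over this arithmetic, which is trivial, but over whether past exponential trends in hardware license any conclusion about minds, mortality, or a singularity.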

Go here to see the original:

Singularitarianism WOW.com


Posted in Singularitarianism | Comments Off on Singularitarianism | Prometheism.net

