
Singularitarian FAQ 1.0 (1.16)

 Original source: 

Introduction

The purpose of this FAQ is to provide short answers to common questions about the Singularity in general and about the people who call themselves Singularitarians. This document is intended for an audience already familiar with Transhumanist ideas, so you may want to read the Transhumanist FAQ before you continue.

The FAQ is edited by Gordon Worley; contact him if you have a question about the FAQ itself, but not with your own questions about the Singularity (unless you are suggesting that they become part of the FAQ). I get a lot of e-mail, and time constraints mean that even if I did respond, it would not be a very good response. See the follow-ups section for more information on what to do with your additional questions.

Below is the table of contents for this document.

  1. Introduction
  2. The Singularity
  3. Singularitarians
  4. Follow-ups
 

The Singularity

 

What is the Singularity?

Throughout history the rate of technological progress has been increasing. It took millions of years to develop language, a few more millennia to produce the printing press, a few centuries to get digital computers, and a couple of decades to produce the Internet. Extrapolated, this means a time will come when progress takes essentially no time at all. In essence, this is what the (technological) Singularity is: the time when technology progresses infinitely fast.

After reading that last paragraph, you're likely full of reasons to disprove me, so I'll elaborate. The technological Singularity (hereafter simply "the Singularity", since the technological path appears shorter than alternatives such as the biological one) is the result of a positive feedback loop created by artificial intelligence (AI). First, an AI is written that is generally intelligent and is then taught how to design computer processors. The AI designs a processor that runs twice as fast as the one ve is currently running on, and puts the new 2x processor in vis computer. Ve proceeds to design another processor twice as fast as the last one (4x), but in half the time, because ve is now running twice as quickly. With the assistance of nanotechnology, it's not long before the Singularity is reached by doubling processing power and halving the design time with each iteration.
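
To make the arithmetic concrete, here is a minimal sketch of that feedback loop in Python (the two-year first design cycle is a hypothetical figure, chosen only for illustration). Because each design cycle takes half the wall-clock time of the one before it, the total elapsed time is a geometric series that converges to twice the length of the first cycle, which is how the extrapolation reaches effectively infinite speed in finite calendar time.

    # Toy model of the self-improvement loop described above: each new
    # processor doubles the AI's speed, so each design cycle takes half
    # the wall-clock time of the previous one.
    first_cycle_years = 2.0  # hypothetical length of the first design cycle

    elapsed = 0.0
    cycle = first_cycle_years
    speedup = 1
    for generation in range(1, 21):
        elapsed += cycle
        speedup *= 2  # processing power doubles each generation
        cycle /= 2    # so the next design cycle takes half as long
        print(f"gen {generation:2d}: {speedup:7d}x speed after {elapsed:.5f} years")

    # elapsed approaches but never exceeds 2 * first_cycle_years = 4.0:
    # 2 + 1 + 0.5 + 0.25 + ... = 4, a convergent geometric series.

Note that in this toy model unbounded speedup arrives within a fixed, finite window of time; the mention of nanotechnology in the text is what makes the "halve the design time every generation" assumption even remotely plausible.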

The Singularity, though, is not quite so simple. It is also a turning point in the history of the universe, the time when things change in a way that makes everything different. It is like a horizon, except that this one can actually be reached: once the horizon is crossed, everything changes.

This is the basic idea behind the Singularity, but depending on how new the idea is to you, you may need more information than this FAQ is intended to supply. A good start is to read Vernor Vinge's paper that sparked thinking about the Singularity. A good introduction was written by Dani Eder. For more details than you can shake a stick at, try Eliezer Yudkowsky's The Low Beyond. Finally, reading a critical discussion of Vinge's conception of the Singularity is a good way to gain some perspective.

What will life be like after the Singularity?

In all honesty, no one is really sure, and this makes sense: just as we don't know what it's like inside the singularity of a black hole, we don't know what it will be like post-Singularity because we can't go there; it is an environment so different from our own that any attempt at simulation will be flawed and biased by our own sense of reality. We do, however, have some good ideas. In particular, we expect to see general enlightenment and an end to pain and suffering. People will be able to live forever, or die after a while if they feel like it. Basically, Singularitarians view the Singularity as coming about as close to Utopia as possible, maybe even reaching Apotheosis, which has a lot to do with why Singularitarians want the Singularity.

When will the Singularity happen?

This is the most asked question of Singularitarians. While Singularitarians vary in their answers, all fall between now and somewhere around 2030. Those who claim it could happen now hold that we currently have the hardware to produce the Singularity but not the software. At any rate, a near-present hard takeoff is not feasible given the current state of nanotechnology. The nearest-term predictions place a hard takeoff between 2005 and 2010, on the view that artificial intelligence and hardware will first be advanced enough then to result in the Singularity. Other predictions are more conservative, reaching out to 2030. Most Singularitarians agree that if the Singularity has not occurred by 2030, there has likely been an existential disaster that prevents the Singularity from ever happening, or at least delays it for a long, long time.

Despite the above statement, by 2040 at the latest it should be possible to brute force the Singularity: via a genetic algorithm or some other exhaustive search, try code until a Friendly AI that can initiate a hard takeoff and generate the Singularity is created. Note, though, that the brute force method is a last resort, only to be used if all else fails (and fail really does mean fail, such as discovering that humans are not smart enough to write an AI of the level needed).
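
As a toy illustration of why brute force is a last resort, here is a sketch of exhaustive program search in Python. Everything in it is hypothetical: the "programs" are short strings over a 27-symbol alphabet, and is_friendly_seed_ai is a stand-in oracle (in reality, recognizing a Friendly AI is itself an unsolved problem). The point is the count: candidates grow exponentially with program length, with roughly 5.6 * 10^15 strings of length 11 alone.

    from itertools import product

    ALPHABET = "abcdefghijklmnopqrstuvwxyz "  # toy 27-symbol "language"

    def is_friendly_seed_ai(program):
        # Hypothetical oracle standing in for "run the candidate and see
        # whether it safely initiates a hard takeoff". No such cheap,
        # safe test exists, which is another reason this is a last resort.
        return program == "ai"

    def brute_force(max_length):
        # Enumerate every string up to max_length, shortest first.
        # There are 27**n candidates of length n: exponential growth.
        for length in range(1, max_length + 1):
            for symbols in product(ALPHABET, repeat=length):
                candidate = "".join(symbols)
                if is_friendly_seed_ai(candidate):
                    return candidate
        return None

    print(brute_force(3))  # finds "ai" after at most 27 + 27**2 checks

A genetic algorithm replaces the exhaustive loop with mutation and selection over a population of candidates, but it still needs some scorable notion of "closer to a Friendly AI", which is exactly the hard part.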

How much computing power does the Singularity require?

That depends on whom you ask. Different people start from different assumptions and arrive at different numbers of operations per second, but the real answer is that no one is completely sure. It is the editor's opinion, though, that the key for AI to reach the Singularity lies not in hardware power but in software power. Once we have the right software, the only amount of computing power of interest will be the point at which a hard takeoff becomes possible.

If the Singularity goes wrong, how bad could it be?

The worst case scenario is basically the end of all computation in the universe. Yes, this is very unlikely, and it may turn out in hindsight not to have been a real threat, but for now it is about as bad as it gets. More likely and just about as bad is the end of all intelligent computation, though at least then the chance for intelligence to reemerge remains. These scenarios generally occur through something along the lines of grey goo: dumb nanotechnology consuming all of the matter/energy in the universe to reproduce. There is also the possibility of dumb AIs getting onto really good hardware and then doing something idiotic, like turning the universe into copies of their creator after being told to love thy creator. These are all minor concerns, though, since they are simple enough to prevent that any serious effort to reach the Singularity will avoid them with ease. The real issue is not dumb AI but evil AI, which would do something like convert all matter/energy in the universe for vis own personal use, including the matter being used by other intelligences. Currently, the best means of preventing this is Friendly AI and, post-Singularity, the Sysop.

The chances of the Singularity going wrong are currently seen as high, but they will decrease with time. As of today, if humanity produced an AI to take it to the Singularity and beyond, the chances of it being unFriendly, dumb, or evil would all be very high: we just don't have the experience with writing general intelligence, let alone Friendly AI, to do it safely enough to set it on the task of creating the Singularity. Within the next few years, though, these chances should decrease significantly with the realization of Friendly AI. In fact, as the chances of the Singularity occurring today go up, the chances of it ending in disaster go down, because the research being done into making the Singularity happen both brings it closer and makes it safer. This is not, however, a suggestion that delaying the Singularity is a good idea. We want to bring the Singularity about as soon as possible and as safely as possible, since it will save millions of lives from an empty death in the harsh reality humanity lives in.

Why should I care about the Singularity?

The Singularitarian answer is that you should care because it is what is going to happen. Singularitarians view the Singularity as inevitable, hence the emphasis on making it safe rather than merely making it happen. Of course, people who are more skeptical about the occurrence of the Singularity might not find any reason to worry about it, but if you don't care about the future now, you lose your right to complain if we end up in a dystopia because you didn't help promote a safe technological Singularity. As James Gunn stated in his book The Science of Science-Fiction Writing, "those who ignore the future are destined to be its victims".

What's with all these short 'v' words?

Those are sex-neutral pronouns, as proposed by Greg Egan: ve for the third-person subject, ver for the third-person object, vis for the third-person possessive, and verself for third-person self-reference. Generally they are used to refer to an AI of at least human level, but some folks also choose to use them for a human of unspecified sex.

 

Singularitarians

 

What/who is a Singularitarian?

A Singularitarian is a person who advocates and works to produce the Singularity. A person can think that the Singularity will occur without necessarily being a Singularitarian. While this is as official as any definition gets, there are also the Singularitarian Principles, which outline the qualities almost all Singularitarians have (or should, anyway ;-)).

How are Singularitarians related to Extropians/Transhumanists/etc.?

Like Extropians, Singularitarians are an outgrowth of the Transhumanist community, though on more technological than philosophical grounds. Unlike other groups of futurists, Singularitarians focus on bringing about the Singularity as quickly and safely as possible, working only on those technologies that will get us there (other advancements will either become moot or can wait for rapid development post-Singularity). Most questions about Singularitarians are really questions about Transhumanists, and only those questions unique to Singularitarians are addressed here.

Is there any philosophy/political view/etc. shared by Singularitarians?

Yes and no. Eliezer Yudkowsky has written the Singularitarian Principles, but these are just his ideas about how Singularitarians should be. In addition, Singularitarians are part of the larger group of Transhumanists and share traits with them, but there are some important differences, since the Singularity means that not everything need be done today; much can be better done post-Singularity. It is safe to say, though, that Singularitarians are generally intelligent and have spent time developing their personal philosophies, which, while complex, are fairly consistent and close to complete, from a human perspective.

How many Singularitarians does it take to screw in a light bulb?

13.

I'm amazed at these ideas. I want to be a Singularitarian. How can I help?

The best place to start is the To Do List, but you should educate yourself before you run out and make contributions. Singularitarians are not interested in spreading their ideas to everyone (though that will eventually happen naturally on its own), but in finding those who can help them achieve the Singularity. That is the purpose of literature like this: the hope is that it will be read by someone who will go on to help code the Seed AI rather than die a starving artist in some gutter. With all that in mind, look at the To Do List and see what you can do to help bring the Singularity closer, especially if you have the technical knowledge to be of assistance in actually getting things done.

How do Singularitarians respond to opposition?

A lot of people get really upset when their beliefs are questioned. Singularitarians, however, are different. Most do not attach much, if any, emotional value to what they believe, which is why they tend to avoid the word 'belief'. If you tell a Singularitarian that ve is wrong, ve will ask why and then go on to explain if ve can. If ve can't, then ve'll admit that there isn't a good answer to the point, or at least not one that ve knows of, and suggest that you either do more research on your own or wait until ve or someone else finds an answer. There may truly be holes in Singularitarian ideas, so shoot away, but it's best to do so only after reading more than just this FAQ on your own.

Are there any famous Singularitarians?

A common technique of political parties is to gain support by showing off their more famous supporters, which plays on the vulgar public's interest in celebrity. Singularitarians, though, place more emphasis on ideas than on people, thus our elite are ideas like the Singularity, Friendly AI, and the Sysop, not people like Eliezer Yudkowsky, Ben Goertzel, or even yours truly. But, to answer the question: yes, there are some pretty well known folks who are Singularitarians, but I'm not going to tell you who.

Why should I trust you Singularitarians?

When it comes to deciding whether an individual Singularitarian is trustworthy, you are left to your own heuristics, just as with any other being. When it comes to Singularitarian ideas, though, the way to establish trust is to read about the ideas and form an opinion based on their perceived validity. Singularitarians do their best to come up with solid ideas about the Singularity which they trust, and they hope that you will, too. If you are worried that you are entrusting your future to others, learn about the Singularity and do what you can to see that it comes about safely.

 

Follow-ups

 

I'm scared. What should I do?

There are several possibilities. One is that, even though you are a Transhumanist, the shock is just too great. Fear not, for you are only experiencing mild future shock, and it should pass. Another possibility is that the Singularity really does scare you and you are honestly concerned about letting it happen. In this case, I urge you to show restraint and refrain from trying to block our efforts to progress. Instead, work towards producing safe versions of the technologies needed to reach the Singularity, such as artificial intelligence and nanotechnology. That improves the chances of a safe Singularity, one that will allow you to continue to live Singularity-free, much as you do today, while others are free to transcend.

I have more questions. Who do I ask?

The best resources are the readings and papers listed on this site. If they still don't answer your question, you may want to ask the Extropian mailing list or the SL4 mailing list, but it is suggested that you read or at least search their archives first to see if your question has already been answered.

