General Question

aphilotus's avatar

What do people think of the Singularity?

Asked by aphilotus (2926 points) October 23rd, 2009
13 responses

Vernor Vinge thought up the idea. In short, the thought is that something will happen in the future (probably the invention of hard AI) after which all of our current models of prediction will not work. That is to say, there will be some shift in how things work that we pre-singularitarians cannot even grasp.


Answers

Jayne's avatar

Is there any reason to believe this is true?

Christian95's avatar

What’s that hard AI?

BhacSsylan's avatar

@Jayne Well, I just read up on it, and it’s an interesting concept. Not sure if aphilotus explained it entirely. In general, it means (according to Vinge) that once AI reaches the ‘superhuman’ state, we will no longer be capable of understanding it (because if we could, then we would be superhuman), and we wouldn’t be able to predict its behavior, for the same reason. The usual imagined outcome is that we then die out for being obsolete, or something of the sort.

So, I can’t say I’m convinced. Interesting argument, but you’ll have to show me evidence for AI first. We’re getting good at virtual AI, but it’s mostly just more and more complicated algorithms, as opposed to a program actually capable of thought and growth. He also bases this on Moore’s law, which has broken down at this point, since we are no longer capable of making processors any smaller due to electron tunneling effects. Quantum computing holds some promise, but that’s still in its infancy, so I’ll wait on that. Still don’t think we’ll hit it by 2023, which was Vinge’s prediction.

aphilotus's avatar

@BhacSsylan Sorry, I was testing the ask-by-Instant-Message-Flutherbot and it sort of cut me off, so yeah, thanks for the extra explanation.

@Christian95 Hard AI is artificial intelligence where certain mathematical and computational problems have been solved: it is AI in the futuristic “computers with human-like brains” sense, rather than the AI we have today, which is “smart” only to a point. Today’s systems only “understand” things to the extent that that understanding is coded in. They can, for example, manipulate language (translate from Spanish to English or whatever), but they do not yet understand what it is they are translating; it is just bits, not ideas or thoughts.
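Here’s a toy sketch of that “just bits” point, purely as an illustration (the little phrasebook and words are made up for the example, and real translation software is far fancier, but the symbol-shuffling idea is the same):

# Toy illustration of "manipulating language without understanding it":
# a lookup-table "translator" shuffles symbols around, with nothing in it
# representing what the words actually mean.

PHRASEBOOK = {"hola": "hello", "mundo": "world", "gato": "cat"}

def translate(sentence: str) -> str:
    """Swap each known Spanish word for its English entry; pass the rest through."""
    return " ".join(PHRASEBOOK.get(word, word) for word in sentence.lower().split())

print(translate("Hola mundo"))  # -> "hello world", with no notion of what a world is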

BhacSsylan's avatar

@aphilotus No problem. Glad to see I’ve apparently explained it well.

drdoombot's avatar

@BhacSsylan According to Wikipedia, Moore’s Law is still in effect and expected to remain so at least until 2015. That’s far enough away that new techniques might allow for Moore’s Law to continue beyond that point.

Also, we don’t know if we’re going to die out from being obsolete. As I understand it, the post-Singularity AI will be used to solve problems we haven’t been able to find solutions to, such as interstellar engines, a cure for cancer, extending human life, making solar panels more efficient, mapping the human brain, altering human genes, etc. To put it simply: the problems that could not be solved by the human brain will be tackled by a super-brain (which will continue to upgrade itself according to Moore’s law).
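Just to put the exponential part in perspective, here’s a quick back-of-the-envelope sketch (my own illustration, assuming the classic doubling every two years; the horizons picked below are arbitrary):

# Back-of-the-envelope: how a "doubles every two years" capability compounds.
# The two-year doubling period is the usual Moore's-law rule of thumb; the
# specific year horizons are just illustrative.

def capability_after(years, doubling_period_years=2.0):
    """Relative capability multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

for years in (2, 10, 20, 40):
    print(f"after {years:2d} years: ~{capability_after(years):,.0f}x")
# prints roughly 2x, 32x, 1,024x, 1,048,576x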

Some have called it the Rapture for Nerds, and I can see why. There’s a part of me that thinks much coolness will come after the Singularity, and I think there’s a good chance it will happen.

wundayatta's avatar

It makes me wonder if the growth in computing power will become so great that we will not be able to use it effectively. Or will our imaginations keep us always hungry for more computing power?

BhacSsylan's avatar

@drdoombot Well, I can’t say that I really have an effective source for why I say Moore’s law is broken; I’ve just noticed the stagnating trend in personal computers. We’ve about topped out at 4 gigs, and I have yet to see a faster processor, and it’s been a little while. But I’ll back off in deference to the guy with the source.

As far as the ‘die off from being obsolete’ comment, I was just making up an outcome, since it’s technically impossible to predict past the singularity (that’s the whole effect, after all). So I just picked something at random. It wasn’t meant to actually be my stance on the subject. Sorry.

jackm's avatar

Everyone interested in the singularity should read “The Singularity Is Near” by Raymond Kurzweil.

It’s a fascinating book where he puts forth all his reasoning for when and why the singularity will happen. If I recall correctly, he places it at about 2035.

Jayne's avatar

Ah, I see, thanks for the clarification, @BhacSsylan; the question, or I, glossed over the AI thing, and it sounded like he was talking about some metaphysical change in reality, which kind of seemed like ass-pullage. Well, I don’t think it’s unreasonable to imagine AI becoming sufficiently complex and self-directed that we can no longer understand or predict them. After all, our brains operate on the same laws of physics, and they produce behavior that we cannot account for quantitatively, so a computer can theoretically do the same.

EmpressPixie's avatar

It’s the nerd rapture. Seriously, Christian is to the rapture as nerd is to the singularity.

Obviously RK believes in it full throttle and is taking a million (slight exaggeration) pills a day to live to see it. Personally, it represents a bit of a disagreement between my fiancé, who is all about it, and me. I’m not convinced it’s going to be better than sliced bread, if you get my drift.

BBQsomeCows's avatar

FAIL

you cited wiki*dia

FAIL

wundayatta's avatar

[rolls eyes]
