[personal profile] andrewducker
Over here Vernor Vinge talks about The Singularity. His personal definition is:
Humans, using technology, will be able to create, or become, creatures of superhuman intelligence.
[Poll #1390606]

Date: 2009-04-27 09:53 am (UTC)
From: [identity profile] meihua.livejournal.com
But we need to be planning for it now.

Date: 2009-04-27 10:03 am (UTC)
From: [identity profile] autopope.livejournal.com
You missed out the option "some other answer which I will deliver in comments".

(I'm plumping for "within 50 years, but there are tech imponderables that we can't see past just yet -- notably on the neurobiology side". And I'm only giving that answer on Tuesdays.)

Date: 2009-04-27 10:08 am (UTC)
From: [identity profile] drjon.livejournal.com
Define "superhuman intelligence"--I'd argue we still don't have a good understanding of "intelligence", let alone "consciousness"...

Date: 2009-04-27 11:21 am (UTC)
From: [identity profile] drainboy.livejournal.com
I'm slightly gobsmacked by the (currently) five people who think it'll be impossible for us ever to create (or become) superhuman intelligences.

I wonder if that's because they think there's something innately special about our wetware that we can't mimic or improve upon, or something flawed enough that we (being that flawed wetware) aren't capable of improving on it, or that we'll all die in some imminent nuclear/swine-flu disaster.

That last one is the only reason I think we might not hit some sort of singularity (as described by Vinge) at least eventually.

Date: 2009-04-27 12:50 pm (UTC)
From: [identity profile] autopope.livejournal.com
We don't know what human intelligence is.

And even if we did -- seagulls fly. Boeing 737s fly. But Boeings (our engineered solution to the "let's go flying" problem) are radically different from seagulls, and you can't clearly extrapolate the divergent properties of the one from the other.

Date: 2009-04-27 12:59 pm (UTC)
From: [identity profile] princealbert.livejournal.com
Thank fuck.
Imagine a 737's crap hitting your car.

Date: 2009-04-27 07:02 pm (UTC)
From: [identity profile] autopope.livejournal.com
> It largely seems to be used to handwave a "Human Intelligence is Spoooooky - maybe it's beyond man's ability to meddle with!", which I assume isn't the kind of viewpoint you'd take.

Indeed not.

(In fact, that was my throw-book-at-wall moment with Roger Penrose -- when he glibly announced, in effect, that there was no reason to define consciousness in order to prove that a machine couldn't do it.)

My point is, we're talking about emulating or exceeding the performance of a trait we don't understand, which may be a whole lot different from what we think it is -- an emergent consequence of a bunch of different things.

Date: 2009-04-27 03:20 pm (UTC)
From: [identity profile] drainboy.livejournal.com
The way I see it, the only useful way to view human intelligence is by looking at human behaviour (including talking to humans about their cognitive processes, feelings, understandings and so on); we can then try to mimic that behaviour computationally. If you made a list of all the things a human seemed to do behaviourally, then made a machine that could do them all, you'd have a machine which exhibited human intelligence. Would that be enough? Would you need to define human intelligence other than by a list of behaviours?

I certainly think there's a lot going on under the hood in the human mind that we're miles from understanding. I was reading an article in The Economist the other day suggesting that, using the right equipment, you could pick up on someone being about to have a eureka moment up to 8 seconds before they consciously realise the solution to a problem.

As for how our brains might take a conscious quest for an answer, come to a conclusion unconsciously, and then throw a solution at our conscious minds when we don't really know we've been thinking about it, we've probably got centuries of understanding to unravel. However, that doesn't mean we can't observe the sort of behaviour humans exhibit and mimic it at an ever finer degree of granularity in the meantime.

I do think we're more likely to improve on our own intelligence before we make a machine that has human intelligence, for two reasons. Firstly, the human mind is so plastic that I'm fairly certain it can adapt to be the filling in any form of perception/action loop. I'm reasonably convinced that evolution has given us brains that could be born into any body and that, ignoring certain bits of specialisation (such as the visual cortex, which I believe is still usefully adapted by individuals born blind, and possibly by those made blind later in life), we could certainly adapt to hardware upgrades as and when they become practical -- in terms of installing them, connecting them to our brains without damage or infection, and finding hardware that would actually augment our intelligence.
Secondly, the concept of human intelligence is so wound up in the specific biology of humans that to describe a computer emulating human behaviour as "having human intelligence" is a huge misnomer. I believe human intelligence is so closely coupled to having human sensors and effectors, and so wrapped up in the experience of being human, that to build a machine with human intelligence in any meaningful way would require building a human being.

I guess Vinge's use of the word "superhuman" can easily imply that the intelligence just has to be measurably better than human intelligence in some reasonably objective way, not that it has to be associated with human intelligence in any other respect. But then don't we already have some systems which can do particular things far better than humans? To have superhuman intelligence, do you really need to be able to do everything better than every human? If you could do everything else better than every human but played chess worse, couldn't you still take the prize for being superhuman? Where does the cut-off line come? When you can do more than 50% of things better than humans?

I guess I just find a lot of the definitions and cut-off points (as far as we have concrete ones) somewhat arbitrary and unsatisfactory. Perhaps this is just because some of my ideas on the subject are somewhat half-formed :)

Date: 2009-04-27 12:51 pm (UTC)
From: [identity profile] princealbert.livejournal.com
Not in poll:

The HHGTTG answer. Already happened. Hiding it well. Leaving soon.

Date: 2009-04-27 01:25 pm (UTC)
From: [identity profile] drdoug.livejournal.com
A bit hit-and-run, which is annoying for a controversial answer, but I'd say we're already well past the Singularity.

With the aid of technology, we can already perform feats of superhuman intelligence, and we do so all the time.

Focusing on whether computers are as intelligent as (or more intelligent than) humans is silly. It's like trying to decide whether individual cell organelles are alive or not. It's a pointless question: the interesting phenomena in those terms are a level or two of description up from there.

To take one domain I happen to know well: no unaided human could conceivably sequence a single gene or elucidate a single protein structure. But nowadays any structural biology postgrad worth their salt could do that for you in fairly short order. And they could use that knowledge to invent something to address a particular biological challenge. With, of course, the aid of an astonishingly complex network of technology.

And at the other end of the experience/expertise spectrum for the human component of the system: my toddler can manipulate the ferromagnetic microstructure of a small piece of coated plastic, and the pattern of charge on many tens or hundreds of millions of tiny capacitors crammed together into a piece of doped silicon on the other side of the planet, without a second thought. He can even use this mechanism to transmit his image and voice to a grandparent in New Zealand in real time.

Date: 2009-04-27 03:33 pm (UTC)
From: [identity profile] meihua.livejournal.com
It sounds like you're using a different definition of "Singularity" from the article cited in the OP.

Date: 2009-04-27 02:24 pm (UTC)
From: [identity profile] anton-p-nym.livejournal.com
Arguably we hit "a" singularity when the Gutenberg press came out; now every human being on the planet can, with training and relatively minor expense, possess eidetic memories of events at which they were not personally present. That makes for a radically different world-view.

Another singularity would be the introduction of broadcast radio, which extends human hearing to encompass the world.

And the Internet could count, as it takes the printing press and radio combined and then squares and cubes the result by making it accessible to anyone of remarkably modest means.

-- Steve isn't certain that a Vingean "rapture of the geeks" singularity will strike, ever, but is certain that we're probably going to reinvent what it means to be human more than once in his remaining lifespan.

Date: 2009-04-27 03:33 pm (UTC)
From: [identity profile] meihua.livejournal.com
It sounds like you're using a different definition of "Singularity" from the article cited in the OP.

Date: 2009-04-27 04:53 pm (UTC)
From: [identity profile] anton-p-nym.livejournal.com
It's quite possible; alas, I can't reach the article on my work computer.

-- Steve was working from recollection of prior definitions of "singularity", and recollection is one of those pre-singularity soft spots.

Date: 2009-04-27 04:18 pm (UTC)
From: [identity profile] andrewhickey.livejournal.com
I think it entirely possible that the 'singularity' will happen within my lifetime. I also think it entirely possible that the human race will render the planet Earth uninhabitable before that happens. I suspect the chances of the latter are higher than the chances of the former, sadly, but I intend to do what little I can to tip things the other way...

Date: 2009-04-28 10:36 pm (UTC)
From: [identity profile] random-redhead.livejournal.com
I don't see how it will affect me. We live in a stratified society and I'm at the lower end. The people at the bottom will always work and live and seek pleasure and never have enough. It is the way of the world.
