Date: 2010-08-19 07:58 am (UTC)
From: [personal profile] simont
A contrary opinion: have you seen this before?

Date: 2010-08-19 08:26 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
Incidentally, if you want to know more about how hard it might actually be to duplicate the brain using a computer, have a look at the Brain Emulation Roadmap (also commentary on my blog) and the Blue Brain Project.

Date: 2010-08-19 08:35 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
I'm not entirely convinced by the scanning problems raised, but I'd like to get more expert eyes on them before trying to say anything more confident.

Date: 2010-08-19 09:43 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
I think anyone with even a schoolkid's knowledge of biology knows that simulation or emulation of the brain is a very, very hard problem, composed of many smaller very, very hard problems.
Unfortunately, the Singularity Summit types *don't* have a schoolkid's knowledge of biology. Ray Kurzweil is quite possibly the stupidest human being ever to have lived, making *exactly* the same mistake those people in the 1950s made when they graphed the fastest speeds ever achieved and said "FTL by the 1990s!". He makes quite good pianos though. Meanwhile Eliezer Yudkowsky is incredibly bright, but an incredibly bright autodidact high-school dropout who regards disagreement with himself as enmity to the human race.
If there ever *is* a 'Singularity', it'll be nothing to do with 'futurists', but with people who do basic scientific research or solve engineering problems, plodding along slowly and carefully in their dull day jobs, not people who go around saying they're the most important person in the history of the universe...

Date: 2010-08-19 09:50 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
Incidentally, despite my criticism of Yudkowsky there, his LessWrong site is one I read avidly and link from my blog. He's often wrong, but interestingly and thought-provokingly wrong. Kurzweil, on the other hand, is a cretin.

Date: 2010-08-19 09:57 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
Kurzweil is not stupid, but his approach to futurism is crazily overconfident. "regards disagreement with himself as enmity to the human race" is silly. If there is a Singularity that is not the result of the work of futurists, it's very unlikely to end well for us.

Date: 2010-08-19 10:03 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
Also, I'd be interested to know about any schoolboy biology errors - or any biology errors - made by anyone in SIAI, who host the Singularity Summit. For example, Eliezer Yudkowsky has written extensively about evolutionary biology, so I'd be interested to know about errors there; or the closely allied FHI produced the Whole Brain Emulation Roadmap linked above, so errors there would again be interesting. Thanks.

Date: 2010-08-19 10:23 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
"Kurzweil is not stupid, but his approach to futurism is crazily overconfident. "

It is, but it goes further than that. In his book, for example, he talks about being able to build replicators which, when programmed, could create replicas of literally any physical object down to the subatomic level, and then goes on to say that when such things are created we must look for ways of protecting the intellectual property of the people who come up with programs for them. That's a whole, special, unique kind of stupid...

""regards disagreement with himself as enmity to the human race" is silly."
It would be if it weren't true. See the recent events where someone posted a relatively innocuous post on LessWrong. Yudkowsky not only removed it, but forbade anyone from ever linking to it on the site, and gave strongly-worded warnings not to go looking for that information elsewhere, on the grounds that it would make it more likely that an unfriendly AI would be created and take over the world.

"If there is a Singularity that is not the result of the work of futurists, it's very unlikely to end well for us."
*If* there is a singularity (something for even the possibility of which there is no real evidence, though in my own view there are no basic physical laws forbidding it) that is the work of even moderately competent engineers, they will have built in most of the safeguards that someone like Yudkowsky talks about, because the threat of paperclipping the universe *is* a real (and obvious) one.

On the other hand, Yudkowsky and the SIAI are the only futurist group I've seen which have anything even remotely similar to a plan to actually *do* anything, rather than just sit around and talk about how great everything is going to be Real Soon Now, and their plan looks to me rather more like a pyramid-scheme-cum-cult than an actual plan, what with Yudkowsky having been talking it up for more than a decade and still not having got round to starting the basic work. (Plus he's planning on trying to write a greater-than-human AI in *JAVA* - that's like trying to build a space elevator out of matchsticks).

Real progress will come, as it always has, from the combination of people doing real basic scientific research and people trying to solve real-world engineering problems, not from people who just want to talk about how fantastic and whizz-bang awesome it'll be when they're immortal gods...

Date: 2010-08-19 10:28 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
It was removed by the original author after Yudkowsky pointed out what a potentially destructive post it was to make. There are other inaccuracies in this account, but it's not a good idea to go into them here.

No-one knows how to build the safeguards Yudkowsky talks about, and the opinion of mainstream AI is that there is no such danger.

SIAI are not coding an AI at all. The "cult" bogeyman is incredibly tiresome; it's addressed here.

Date: 2010-08-19 10:31 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
I have seen Yudkowsky repeat Kurzweil's claims that the total information needed for coding the brain's structure can be found in the human genome. This is roughly akin to stating that the total information found in Beethoven's Ninth Symphony can be encoded in the string 'playsound /home/andrew/Beethoven-9.mp3' because typing that into a computer will reproduce the symphony...
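(To put the same point as a sketch - the path is the hypothetical one from the analogy, not a real file - the length of a command that merely points at data says nothing about the size of the data itself:)

    import os

    # The "program" that reproduces the symphony is only ~38 bytes long:
    command = "playsound /home/andrew/Beethoven-9.mp3"
    print(len(command.encode("utf-8")))

    # But it only works because the data already exists somewhere else.
    # The information content lives in the file, not in the command:
    path = "/home/andrew/Beethoven-9.mp3"  # hypothetical path from the analogy
    if os.path.exists(path):
        print(os.path.getsize(path))  # tens of megabytes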

Date: 2010-08-19 10:32 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
You believe that the Kolmogorov complexity of the brain can be very much higher than 6Gb?
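(For reference, the 6Gb figure presumably comes from raw base-pair counting - a back-of-envelope sketch, figures approximate:)

    # Raw information content of the human genome, counting bases only
    # (this ignores the cellular machinery discussed later in the thread).
    base_pairs = 3.2e9            # approx. length of the haploid genome
    bits = base_pairs * 2         # 4 possible bases = 2 bits per base
    print(f"{bits / 1e9:.1f} Gbit = {bits / 8 / 1e6:.0f} MB uncompressed")
    # -> roughly 6.4 Gbit, i.e. about 800 MB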

Date: 2010-08-19 10:38 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
I've seen absolutely no evidence that it's lower. The human brain is, as far as I'm aware, the most complex known structure in the universe. Even with the best compression techniques available, I can't get my collection of Beach Boys MP3s down much below 10GB. Until I see actual evidence - not just speculation, or statements involving genomes but not taking into account the way those genomes interact with the wider environment - I'll continue to consider it likely that the most complex object in the universe is less complex than my Beach Boys MP3s, yes...

Date: 2010-08-19 10:39 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
"to consider it *UN*likely"...

Date: 2010-08-19 10:59 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
"It was removed by the original author"
I don't recall it that way, but that's rather a side issue if, as you say, "Yudkowsky pointed out what a potentially destructive post it was to make". That confirms my basic point - he believed a disagreement with himself to be 'destructive'.

"No-one knows how to build the safeguards Yudkowsky talks about"
No-one knows how to build an AI which would need such safeguards either.

"and the opinion of mainstream AI is that there is no such danger."
Who is this 'mainstream AI'? As far as I can tell, most mainstream AI researchers don't believe there's a danger because what they're doing is not going to lead to anything close to general intelligence for many, many decades. That's not a judgement on the danger of a malicious artificial intelligence, but on the likelihood of one being created at all in the near future.
Put it this way - I'm entirely convinced that a vampire hiding under my bed would be horribly dangerous, but I don't take a wooden stake and garlic to bed with me every night...

"SIAI are not coding an AI at all"
Kind of my point. But Yudkowsky has stated several times that that is their eventual aim, and that he thinks Java would be the best tool to do the work in.

"The "cult" bogeyman is incredibly tiresome; it's addressed here."
What that essentially boils down to is that Yudkowsky doesn't consider 'cult' a useful term. I do, and if someone states that he and only he has the ability to save the universe, that donating money to his organisation is the single most important thing anyone else in the world can do, and that minor disagreement endangers the whole human race, then 'cult' seems to me a reasonable descriptor.

None of which is to say that Yudkowsky doesn't have some good ideas - he can, when he wishes, explain a lot of rather complex scientific ideas in very easily-comprehensible terms. I also believe he very sincerely believes everything he writes and says. But that doesn't make his organisation any less of a dead end.

But I should probably continue this on my own blog tonight, rather than filling up Andrew's comments with this...

Date: 2010-08-19 11:12 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
It seems counterintuitive to say the least to assert that coding for the procedure that turns genomes into animals would require vastly more bits than the size of the genome itself. PZ points out that it would be very computationally expensive and that there's still a lot we don't know about it, but that's not the same as being very large in bits. Of course encoding a specific person's brain would take a lot more bits, because that contains a lot of information which comes from accidents of history.

Date: 2010-08-19 11:17 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
Firefox crashed and ate my original, longer reply to this :-/

Put simply, I'm not arguing that the brain *must* be more complex than that - it's quite possible that it isn't. But I think it more likely that it is.

Lots of things are irreducibly complex - I used part of my MP3 collection as an example. Other things - e.g. the complex patterns created by cellular automata - are not, and can be described in a tiny number of lines of code.

I don't know which category the human brain goes in, but my hunch, based on the facts that it is very complex, that it consists of multiple, semi-independent, interacting complex systems, and that nobody seems to have come up with any practicable suggestions as to how we would find the underlying simpler principles, is that it fits into the MP3 category rather than the cellular automaton category. If so, then I would be surprised if it used fewer bytes to describe than my Beach Boys MP3s, though more in the slightly-raised eyebrow way than the collapsing in shock way.
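(To illustrate the cellular-automaton category - a sketch only, implying nothing about which category the brain is in - here is Rule 30, whose complex-looking output is fully described by a few lines of code:)

    # Rule 30: a one-dimensional cellular automaton. The pattern it prints
    # looks complex, but its complete description is just this program.
    RULE = 30
    cells = [0] * 31 + [1] + [0] * 31      # a single live cell in the middle

    for _ in range(32):
        print("".join("#" if c else " " for c in cells))
        # Each cell's next state depends on its left/centre/right neighbours;
        # the triple (l, c, r) indexes one bit of the rule number.
        cells = [
            (RULE >> (4 * l + 2 * c + r)) & 1
            for l, c, r in zip([0] + cells[:-1], cells, cells[1:] + [0])
        ]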

Date: 2010-08-19 11:28 am (UTC)
From: [identity profile] andrewhickey.livejournal.com
"It seems counterintuitive to say the least to assert that coding for the procedure that turns genomes into animals would require more vastly more bits than the size of the genome itself"

Not to me it doesn't, given that that procedure involves an entire other animal to start with, plus that animal's external environment.

It's not a matter of 'turning genomes into animals' - the genome is useless without a cell (containing vastly more information than the genome) which in turn is useless without a mother (containing vastly more information than the cell).

If you want to think of the genome as being the 'program' that creates an organism (which is a huge oversimplification), then the environment in which that genome operates is analogous to the hardware and operating system that it runs on. I don't consider it counterintuitive at all that a full description of the hardware and OS on my desktop would take far more bits than the code for the web browser it's running. It might not, but I don't see any obvious reason why it shouldn't.

Date: 2010-08-19 12:20 pm (UTC)
From: [identity profile] andrewhickey.livejournal.com
Thanks, and I'm glad you think so. The 'tonight' part still applies though - this isn't what my employers are paying for ;)

Date: 2010-08-20 06:50 am (UTC)
From: [personal profile] djm4
Have any of the cryonics experts you were going to point at my objections been able to explain why I'm wrong? I haven't seen any comment at all, and it's been a while now, so I'm genuinely interested.

Date: 2010-08-20 06:54 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
At the H+UK 2010 conference in April, Anders Sandberg promised me that he had that very web page open on his desktop awaiting a reply. I've mailed him since, but still no joy.

Date: 2010-08-20 09:48 am (UTC)
From: [personal profile] simont
I have now :-) It's quite well done, though I wasn't completely convinced by the idea that they still had to painstakingly explain concepts like "making meat sounds" to each other when not only had they both been sitting in a room for a while with other people doing it but they were also doing it themselves!
