
no subject
Date: 2010-08-19 07:53 am (UTC)
Mostly I just love the line "Thinking and meat are best friends."
no subject
Date: 2010-08-19 07:58 am (UTC)
no subject
Date: 2010-08-19 08:00 am (UTC)
I even have the t-shirt:
no subject
Date: 2010-08-19 08:19 am (UTC)
no subject
Date: 2010-08-19 08:26 am (UTC)
no subject
Date: 2010-08-19 08:30 am (UTC)
Any simulation we create will suffer from abstractions - and I can see us creating one based on reasonably high-level abstractions that produces something worthwhile; it just won't react exactly the same way as a physical brain would. Growing it from scratch would allow it to "get used" to itself - the traits it learned would fit it, if that makes sense.
no subject
Date: 2010-08-19 08:35 am (UTC)
no subject
Date: 2010-08-19 09:43 am (UTC)
Unfortunately, the Singularity Summit types *don't* have a schoolkid's knowledge of biology. Ray Kurzweil is quite possibly the stupidest human being ever to have lived, making *exactly* the same mistake those people in the 1950s made when they graphed the fastest speeds ever achieved and said "FTL by the 1990s!". He makes quite good pianos, though. Meanwhile Eliezer Yudkowsky is incredibly bright, but an incredibly bright autodidact high-school dropout who regards disagreement with himself as enmity to the human race.
If there ever *is* a 'Singularity', it'll be nothing to do with 'futurists', but with people who do basic scientific research or solve engineering problems, plodding along slowly and carefully in their dull day jobs, not people who go around saying they're the most important person in the history of the universe...
no subject
Date: 2010-08-19 09:50 am (UTC)
no subject
Date: 2010-08-19 09:57 am (UTC)
no subject
Date: 2010-08-19 10:03 am (UTC)
no subject
Date: 2010-08-19 10:23 am (UTC)
It is, but it goes further than that. In his book, for example, he talks about being able to build replicators which, when programmed, could create replicas of literally any physical object down to the subatomic level, and then goes on to say that when such things are created we must look for ways of protecting the intellectual property of the people who come up with programs for them. That's a whole, special, unique kind of stupid...
"'regards disagreement with himself as enmity to the human race' is silly."
It would be if it weren't true. See the recent events where someone posted a relatively innocuous post on LessWrong. Yudkowsky not only removed it, but forbade anyone from ever linking to it on the site, and gave strongly-worded warnings not to go looking for that information elsewhere, on the grounds that it would make it more likely that an unfriendly AI would be created and take over the world.
"If there is a Singularity that is not the result of the work of futurists, it's very unlikely to end well for us."
*If* there is a singularity (something for even the possibility of which there is no real evidence, though in my own view there are no basic physical laws forbidding it) that is the work of even moderately competent engineers, they will have built in most of the safeguards that someone like Yudkowsky talks about, because the threat of paperclipping the universe *is* a real (and obvious) one.
On the other hand, Yudkowsky and the SIAI are the only futurist group I've seen which have anything even remotely similar to a plan to actually *do* anything, rather than just sit around and talk about how great everything is going to be Real Soon Now, and their plan looks to me rather more like a pyramid-scheme-cum-cult than an actual plan, what with Yudkowsky having been talking it up for more than a decade and still not having got round to starting the basic work. (Plus he's planning on trying to write a greater-than-human AI in *JAVA* - that's like trying to build a space elevator out of matchsticks).
Real progress will come, as it always has, from the combination of people doing real basic scientific research and people trying to solve real-world engineering problems, not from people who just want to talk about how fantastic and whizz-bang awesome it'll be when they're immortal gods...
no subject
Date: 2010-08-19 10:28 am (UTC)
No-one knows how to build the safeguards Yudkowsky talks about, and the opinion of mainstream AI is that there is no such danger.
SIAI are not coding an AI at all. The "cult" bogeyman is incredibly tiresome; it's addressed here.
no subject
Date: 2010-08-19 10:31 am (UTC)
no subject
Date: 2010-08-19 10:32 am (UTC)
no subject
Date: 2010-08-19 10:38 am (UTC)
no subject
Date: 2010-08-19 10:39 am (UTC)
no subject
Date: 2010-08-19 10:55 am (UTC)
A processing system can be vastly simpler than the thing it produces (or experiences).
I'm not saying that the human brain is, or isn't, simpler than the Beach Boys - I'm saying that comparing them seems pretty silly to me.
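A toy illustration of that first point (a sketch of my own, not anything posted in the thread): a pseudo-random generator a few lines long can emit an output stream with no obvious pattern, so the processing system is vastly simpler than any literal description of what it produces. The constants here are the widely used Numerical Recipes LCG parameters.

```python
def lcg(seed, n):
    """Linear congruential generator: one line of state update
    yields an arbitrarily long, patternless-looking stream."""
    state = seed
    out = []
    for _ in range(n):
        # Numerical Recipes constants, modulus 2**32
        state = (1664525 * state + 1013904223) % 2**32
        out.append(state)
    return out

# A five-line "system" producing as much output as you care to ask for.
print(lcg(42, 5))
```

The point isn't that brains are LCGs, just that "complexity of the output" puts no useful lower bound on "complexity of the generator".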
no subject
Date: 2010-08-19 10:59 am (UTC)
I don't recall it that way, but that's rather a side issue if, as you say, "Yudkowsky pointed out what a potentially destructive post it was to make". That confirms my basic point - he believed a disagreement with himself to be 'destructive'.
"No-one knows how to build the safeguards Yudkowsky talks about"
No-one knows how to build an AI which would need such safeguards either.
"and the opinion of mainstream AI is that there is no such danger."
Who is this 'mainstream AI'? As far as I can tell, most mainstream AI researchers don't believe there's a danger because what they're doing is not going to lead to anything close to general intelligence for many, many decades. That's not a judgement on the danger of a malicious artificial intelligence, but on the likelihood of one being created at all in the near future.
Put it this way - I'm entirely convinced that a vampire hiding under my bed would be horribly dangerous, but I don't take a wooden stake and garlic to bed with me every night...
"SIAI are not coding an AI at all"
Kind of my point. But Yudkowsky has stated several times that that is their eventual aim, and that he thinks Java would be the best tool to do the work in.
"The "cult" bogeyman is incredibly tiresome; it's addressed here."
What that essentially boils down to is that Yudkowsky doesn't consider 'cult' a useful term. I do, and if someone states that he and only he has the ability to save the universe, that donating money to his organisation is the single most important thing anyone else in the world can do, and that minor disagreement endangers the whole human race, then 'cult' seems to me a reasonable descriptor.
None of which is to say that Yudkowsky doesn't have some good ideas - he can, when he wishes, explain a lot of rather complex scientific ideas in very easily-comprehensible terms. I also believe he very sincerely believes everything he writes and says. But that doesn't make his organisation any less of a dead end.
But I should probably continue this on my own blog tonight, rather than filling up Andrew's comments with this...
no subject
Date: 2010-08-19 11:12 am (UTC)
no subject
Date: 2010-08-19 11:17 am (UTC)
Put simply, I'm not arguing that the brain *must* be more complex than that - I think it quite likely that it isn't. But I think it more likely that it is.
Lots of things are irreducibly complex - I used part of my MP3 collection as an example. Other things, e.g. the complex patterns created by cellular automata - are not and can be described in a tiny number of lines of code.
I don't know which category the human brain goes in, but my hunch, based on the facts that it is very complex, that it consists of multiple, semi-independent, interacting complex systems, and that nobody seems to have come up with any practicable suggestions as to how we would find the underlying simpler principles, is that it fits into the MP3 category rather than the cellular automaton category. If so, then I would be surprised if it used fewer bytes to describe than my Beach Boys MP3s, though more in the slightly-raised eyebrow way than the collapsing in shock way.
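As a concrete illustration of the cellular-automaton category (my own sketch, not anything from the thread): Wolfram's Rule 30 generates an intricate, apparently random triangle of cells, yet its complete description fits in a few lines.

```python
def rule30(width=31, steps=15):
    """Elementary cellular automaton Rule 30: each new cell is
    left XOR (centre OR right). Tiny code, complex-looking output."""
    row = [0] * width
    row[width // 2] = 1  # single live cell in the middle
    rows = [row]
    for _ in range(steps):
        prev = rows[-1]
        new = []
        for i in range(width):
            left = prev[i - 1] if i > 0 else 0
            right = prev[i + 1] if i < width - 1 else 0
            new.append(left ^ (prev[i] | right))
        rows.append(new)
    return rows

for r in rule30():
    print("".join("#" if c else "." for c in r))
```

An MP3 of actual music has no such short generating rule; the open question is which of those two situations the brain is closer to.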
no subject
Date: 2010-08-19 11:28 am (UTC)
Not to me it doesn't, given that that procedure involves an entire other animal to start with, plus that animal's external environment.
It's not a matter of 'turning genomes into animals' - the genome is useless without a cell (containing vastly more information than the genome) which in turn is useless without a mother (containing vastly more information than the cell).
If you want to think of the genome as being the 'program' that creates an organism (which is a huge oversimplification), then the environment in which that genome operates is analogous to the hardware and operating system that it runs on. I don't consider it counterintuitive at all that a full description of the hardware and OS on my desktop would take far more bits than the code for the web browser it's running. It might not, but I don't see any obvious reason why it shouldn't.
no subject
Date: 2010-08-19 11:34 am (UTC)
no subject
Date: 2010-08-19 12:20 pm (UTC)
no subject
Date: 2010-08-20 06:50 am (UTC)
no subject
Date: 2010-08-20 06:54 am (UTC)
no subject
Date: 2010-08-20 09:48 am (UTC)