Date: 2022-08-08 12:03 pm (UTC)
nancylebov: (green leaves)
From: [personal profile] nancylebov
#2. Seems fairly sound, though it fails to mention pushback against longtermism from within rationalism and EA.

I would think that one of the strongest arguments against longtermism is that we don't know that much about the future, so building a sounder present rests on much firmer knowledge.

Maybe there's a parallel between getting hypnotized by the future potential of the human or post-human race and getting hypnotized by "luxury space communism".

I've posted that link and my comment at astralcodexten.

https://astralcodexten.substack.com/p/open-thread-236/comment/8242094

Date: 2022-08-08 04:34 pm (UTC)
danieldwilliam: (Default)
From: [personal profile] danieldwilliam
I was reading the article about longtermism, thinking to myself: 1) discount rates, and 2) the future digital posthumans are going to be awfully white if we let all the poor people from the Global South die of climate change in the 21st century.

Date: 2022-08-09 08:56 am (UTC)
nancylebov: (green leaves)
From: [personal profile] nancylebov
I've seen a claim that most EA money goes to current problems. Longtermism exists, but it's not that influential; this is something to check on.

I'm more concerned about what happens to people than about demographic outcomes. One thing about the future: Africans are having children at a much higher rate than anyone else, so who knows?

Date: 2022-08-09 02:52 pm (UTC)
nancylebov: (green leaves)
From: [personal profile] nancylebov
If Africans are the last to have widespread education (which I'm in favor of them having), then, since widespread education tends to lower birth rates, the proportion of Africans will increase relative to everyone else.

I think that eventually, people will need to find a way to increase the reproductive rate a bit while having widespread education.

Date: 2022-08-09 10:23 am (UTC)
bens_dad: (Default)
From: [personal profile] bens_dad
In https://nickbostrom.com/astronomical/waste (one of the papers cited in the longtermist article), Nick Bostrom claims:
"So long as the evaluation function is aggregative (does not count one person's welfare for less just because there are many other persons in existence who also enjoy happy lives) and is not relativized to a particular point in time (no time-discounting), the conclusion will hold."

So we aren't allowed to have a discount rate, nor to weight by proportion, i.e. to say that half of all beings now counts the same as half of all beings in a future where there are 10^54 of them (that would mean one out of 10^54 is worth less than one out of 10^10).
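To see why Bostrom has to forbid discounting, here is a minimal sketch of the arithmetic (my own illustration, not from his paper; the 10,000-year horizon and the 1% rate are hypothetical choices):

    # Compare Bostrom's 10^54 potential future lives against ~10^10
    # present lives under a conventional exponential discount rate.
    # The horizon and rate below are hypothetical, for illustration only.
    from math import log10

    present_lives = 1e10    # rough current human population
    future_lives = 1e54     # Bostrom's upper estimate of potential lives
    horizon_years = 10_000  # hypothetical delay before those lives exist
    rate = 0.01             # hypothetical 1% annual discount rate

    # Discounted value = future_lives / (1 + rate)^horizon_years.
    # Work in log10 to avoid floating-point overflow.
    log_discounted = log10(future_lives) - horizon_years * log10(1 + rate)
    print(f"discounted future ~ 10^{log_discounted:.1f} lives")  # ~10^10.8
    print(f"present lives ~ 10^{log10(present_lives):.0f}")

Under these assumptions, even a 1% annual rate over ten millennia shrinks 10^54 lives down to roughly the size of today's population, so the astronomical-waste conclusion only holds if time-discounting is ruled out from the start.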

The Boltzmann Brain theory says that the number of mental beings spontaneously coming into existence (quantum effects?) vastly exceeds the number of mental beings created by the cosmology and biology that we currently "believe". Whilst that doesn't mean that the loss of one in 10^54 does not matter, it might mean that the 10^54 is not significant compared with the number of Boltzmann Brains.

2) Definitely. Stress and diversity are likely to be significant factors in the speed at which we reach "singularity", so longtermism has to worry about being too white.

Bostrom seems to be trying to maximize the number of happy beings without regard to the number of unhappy beings. The ratio matters more to me than the absolute numbers; I'm tempted to say the ratio at each given time, not just the ratio over the life of the universe (or all universes).
