Suspect Insight Forums

The Ellimist
Level Five

On the importance of the far future

on February 3rd 2020, 3:40 am
Basically, the argument is this:

The expected number* of conscious beings who will exist in the future outnumbers those who exist right now by a stupendous amount. Like, even extremely conservative estimates put the expected number in the quadrillions (think about how long we have before the heat death). Run the numbers and something that improves the lives of everyone alive now by 50% can be outweighed by something that gives just a 1% chance of improving the lives of future beings by 0.1% - and under less conservative future estimates, by interventions with far smaller probabilities and effects than that.
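To make that concrete, here's a quick back-of-the-envelope calculation in Python. The population figures are illustrative assumptions (roughly today's population, plus the conservative quadrillion-scale future estimate above), not hard claims:

# Rough expected-value comparison between helping people now and
# taking a small chance of helping a much larger future population.
# All numbers are illustrative assumptions.

current_beings = 8e9   # roughly everyone alive today
future_beings = 1e15   # "quadrillions" -- the conservative estimate above

# Option A: improve every current life by 50%.
value_now = current_beings * 0.50

# Option B: a 1% chance of improving every future life by 0.1%.
value_future = future_beings * 0.01 * 0.001

print(f"now:    {value_now:.1e}")     # 4.0e+09
print(f"future: {value_future:.1e}")  # 1.0e+10

Even with the conservative quadrillion-scale estimate, option B comes out a few times ahead, and with larger future estimates the gap becomes astronomical.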

You don't even have to take a pure utilitarian stance for this point to apply - you just have to think that future lives hold some value. Intuitively, we all think this, unless we don't care at all about issues like climate change and make no arrangements for future kids, etc. On a more philosophical level there are various ways to justify why future lives should matter; one way to put it is that there's no reason the arbitrary moment in time when you happen to be making a moral decision should be the only one that matters.

The most obvious manifestation of this calculus is in mitigating existential risk; reducing existential risk by even 1% would have a stupendous impact on expected future lives (see the sketch under the footnote below). There are other implications too, like optimizing for future technologies.

Now of course, trying to over-analyze this can be problematic; what happens right now can have butterfly effects on the future, and it's very difficult to plan that far out, so the best strategy is often to make life better right now while keeping a reasonable eye on how to make things better long term. I think that heuristic works well, but more long-term thinking does need to happen for very specific cases like climate change and AI.

Thoughts?

* This being (as an imprecise but sufficient definition) the number of potential lives weighted by their probability of coming to exist. So even if you say "there's no guarantee civilization will survive that long", changing the chance that it does has a huge impact on the expected number. We all operate by probabilities on some level; you don't know almost anything for sure.
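A tiny illustration of why nudging the probability matters so much (again, both numbers are made-up assumptions):

# Expected future beings = P(civilization survives) * beings if it survives.
# Even a small shift in the survival probability moves the expected value enormously.

beings_if_survive = 1e15  # conservative future population, as above
p_survive = 0.50          # assumed baseline survival probability

expected_baseline = p_survive * beings_if_survive         # 5.0e+14
expected_after = (p_survive + 0.01) * beings_if_survive   # 5.1e+14

print(f"gain from a 1% absolute risk reduction: {expected_after - expected_baseline:.1e}")
# 1.0e+13 expected lives -- vastly more than everyone alive today

So under these assumptions, a 1% reduction in existential risk is "worth" on the order of ten trillion expected lives, which is why the x-risk point above carries so much weight.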

tl;dr: far, far more beings could exist in the future than exist now, so the future matters a lot more than the present. Sometimes, though, it's better to just optimize for now, because you often can't predict very far into the future anyway.
BigMouthPrick

Re: On the importance of the far future

on July 7th 2020, 9:34 pm
I live in the real world, where human growth is only exponential in theory. The planet is nearing a breaking point in terms of resource limits and carrying capacity. Utilitarianism is flawed in that it assumes all people are of equal value. As a young person, I acknowledge that I am more disposable and more irrelevant in the overall scheme of society than a person in my position 100 years ago, and continued population growth will ensure a worse outcome for my peer 100 years from now. So optimize for now and roll with the punches.