Suspect Insight Forums

The Ellimist
Moderator

On the importance of the far future

on Mon Feb 03, 2020 3:40 am
Basically, the argument is this:

The expected number* of conscious beings who will exist in the future outnumbers those who exist right now by a stupendous amount. Even extremely conservative estimates put the expected number in the quadrillions (given how long there is before the heat death of the universe), and less conservative estimates go far higher. With numbers like that, something that improves the lives of everyone alive today by 50% can be trivial compared to something that has only a tiny chance of improving future lives by a tiny fraction of a percent.
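To make that comparison concrete, here's a rough back-of-the-envelope sketch. Every number in it (population sizes, probabilities, improvement percentages) is a purely illustrative assumption, not an estimate anyone has defended:

```python
# Back-of-the-envelope expected-value comparison. All numbers are illustrative.
N_NOW = 8e9       # conscious beings alive today (rough)
N_FUTURE = 1e15   # expected future beings (a "quadrillions"-scale estimate)

# Intervention A: a guaranteed 50% improvement to every life that exists now.
ev_present_focused = N_NOW * 1.00 * 0.50

# Intervention B: a 1% chance of a 0.1% improvement to every future life.
ev_future_focused = N_FUTURE * 0.01 * 0.001

print(f"Present-focused expected value: {ev_present_focused:.1e}")  # ~4.0e+09
print(f"Future-focused expected value:  {ev_future_focused:.1e}")   # ~1.0e+10
```

Even with a merely "quadrillions" future, the long-shot intervention comes out ahead in expectation; with the much larger population estimates some people use, it isn't close.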

You don't even have to take a pure utilitarian stance for this point to apply - you just have to think that future lives hold some value. Intuitively, we all think this, unless we genuinely don't care about issues like climate change and make no provisions for future generations. On a more philosophical level there are various ways to justify why future lives should matter; one way to put it is that there's no reason the arbitrary moment in time at which you happen to be making a moral decision should be the only one that counts.

The most obvious manifestation of this calculus is in mitigating existential risk; reducing existential risk by even 1% would have a stupendous expected impact on future lives. There are other implications too, such as how we prioritize and shape future technologies.

Now of course, trying to over-analyze this can be problematic; what happens right now can have butterfly effects on the future, and it's very difficult to plan that far out. So it's often actually the best strategy to make life better right now while keeping a reasonable eye on how to make things better long term. I think that heuristic works well, but more long-term thinking is still needed for specific cases like climate change and AI.

Thoughts?

* This being (as an imprecise but sufficient definition) the number of potential lives weighted by the probability that they come to exist. So even if you say "there's no guarantee humanity will survive that long", changing the chance that it does has a huge impact. We all operate on probabilities at some level; you don't know almost anything for sure.
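As a toy illustration of that footnote, here's how a probability-weighted expectation works. The scenarios, probabilities, and population counts below are made up for the example:

```python
# Probability-weighted expected number of future beings (all values made up).
scenarios = {
    "extinction this century":       (0.20, 0.0),   # (probability, future beings)
    "civilization lasts ~10k years": (0.50, 1e13),
    "long spacefaring future":       (0.30, 1e25),
}

expected_future_beings = sum(p * n for p, n in scenarios.values())
print(f"{expected_future_beings:.2e}")  # ~3.00e+24, dominated by the largest scenario
```

Note how even a modest shift in the probability of the biggest scenario moves the expectation enormously, which is the point the footnote is making.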

tl;dr: far, far more beings could exist in the future than exist now, so the future matters a lot more than the present - though it's sometimes better to just optimize for now, because you often can't predict very far into the future anyway.