All moral decisions in life are on a heavy-tailed distribution

(epistemic status: Written in the spirit of “strong opinions, weakly held” and in a way that is accessible for people who are new to effective altruism. Also, I am pretty sure this idea isn’t original, but I haven’t seen it written up. So please point me there in case I missed something.)

Heavy tails explained

I think of recognizing heavy-tailed distributions in many places as one of the key insights of the effective altruism mindset. 

In a heavy-tailed distribution, a large part of the probability mass lies in the “tail”: it goes to zero more slowly than the tail of an exponential distribution. As a result, extreme values are likelier and outlier data points occur more often.
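To make this concrete, here is a small simulation sketch using only Python's standard library. It compares an exponential distribution (light tail) with a Pareto distribution (heavy tail; the shape parameter 1.2 is just an illustrative choice) by asking: what share of the total do the top 1% of samples account for?

```python
import random

random.seed(0)
n = 100_000

# Light-tailed: exponential with rate 1.
exp_samples = sorted((random.expovariate(1.0) for _ in range(n)), reverse=True)
# Heavy-tailed: Pareto with shape alpha = 1.2 (an illustrative choice).
par_samples = sorted((random.paretovariate(1.2) for _ in range(n)), reverse=True)

def top_share(samples, frac=0.01):
    """Fraction of the total accounted for by the top `frac` of samples."""
    k = int(len(samples) * frac)
    return sum(samples[:k]) / sum(samples)

print(f"Exponential: top 1% holds {top_share(exp_samples):.0%} of the total")
print(f"Pareto:      top 1% holds {top_share(par_samples):.0%} of the total")
```

Under the exponential, the top 1% holds only a few percent of the total; under the heavy-tailed Pareto, it holds a large chunk. That concentration is the pattern the examples below all share.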


Here are a few real-life examples that point to underlying heavy-tailed distributions, as each exhibits significant skewness:

The most effective health interventions are multiple orders of magnitude more cost-effective than the typical intervention.  

As most people know, global income is distributed highly unevenly. 

Some problems we can choose to work on are hundreds of times more important than others. For example, depending on your worldview (!), preventing pandemics might be 100x more important than global health, and mitigating risks from AI might again be 100x more important.

If you want to emit fewer tons of CO2, the lifestyle choice that makes by far the biggest difference is having one fewer child. But then, donating to certain climate charities is again many times more impactful than any lifestyle choice. 


Now, here is what I mean by “all moral decisions in life are on a heavy-tailed distribution”:

The intuition is that a few decisions in life carry vast moral importance, while most others matter far less.

Now, what could those crucial decisions be? 80,000 Hours argues that career choice is among them. Donating a significant amount of your lifetime earnings to effective charities could also be high up there.

I often find it helpful to think about this. If this hypothesis is true and I want to live a moral life, I need to get the big things right. This insight should lead me to put a lot of time into thinking about what the massively important decisions are and how to make better choices about them.

Most everyday decisions probably matter less, and I ought not to worry much about, e.g., whether to take the train or a plane in a specific instance. Of course, these smaller things can add up over a lifetime, and making moral choices on them is better than doing nothing. Nonetheless, I think many people should think less about these and take seriously the possibility that a few of their judgment calls matter 1000x more than the rest.

Figure out what the big, morally important decisions are and try to get them right. 

Thanks to Simon Grimm, Moritz Hanke, and Katja Michlbauer for their helpful comments on this post. 


  1. Anders Sandberg has a nice video explainer of heavy-tailed distributions.
  2. You may have come across related concepts that point at a very similar idea: the Pareto principle, the 80-20 rule, power laws, fat-tailed distributions, tail risk, and black swan events.

First published on the EA forum.