This week we will consider some of the ethical positions which inspire effective altruism, how a history of changing ethical norms might affect how we want to do good, and how our own values line up with the tools EAs use.
EAs often advocate for impartiality when doing good. For example, many EAs would claim that there's no intrinsic moral reason why people who are 1,000 miles away are less worth helping than people right next to you (though other considerations may still favor helping one group over another, for example if you have better opportunities available to help one of them). There are many dimensions along which you might want to be impartial, such as across space, across time, and between species. Deciding which dimensions you think you should be impartial over might drastically change what you would prefer to work on, so this question is worth a lot of attention. For example, the priority you place on improving the conditions of animals in factory farms depends heavily on how much moral consideration you believe animals deserve.
We know our ethical beliefs have a big effect on how we do good, but perhaps we can't trust that our current ethical beliefs are correct, or that they will line up with our future ethical beliefs. Societies' moral beliefs have changed drastically over the course of history, and there is reason to believe they'll change again. Should we act in line with the moral beliefs we hold right now? Or should we try to figure out where our moral beliefs might be mistaken, or might change in the future, and let that inform our actions?
<aside> 💡 Please note that there’s an exercise in Week 3 that requires you to start immediately after Week 2’s discussion session is over.
</aside>
In this exercise, we will explore the idea of impartiality and where we might want to apply it. You will imagine you’re in a court which decides how much moral standing to give various groups, i.e., how much an altruistic person should want to help them.