Unflattering Reasons Why I'm Attracted to EA - EA Forum

@tags:: #lit✍/📰️article/highlights
@links::
@ref:: Unflattering Reasons Why I'm Attracted to EA - EA Forum
@author:: forum.effectivealtruism.org

2022-06-28 forum.effectivealtruism.org - Unflattering Reasons Why I'm Attracted to EA - EA Forum


Reference

Notes

Quote

I feel guilty about my privilege in the world and I can use EA as a tool to relieve my guilt (and maintain my privilege)

Quote

Affiliation with EA aligns me with high-status people and elite institutions, which makes me feel part of something special, important and exclusive (even if it's not meant to be)

Quote

It is a way to feel morally superior to other people, to craft a moral dominance hierarchy where I am higher than other people

Quote

EA lets me signal my values to like-minded people, and feel part of an in-group

Quote

I don't have to get my hands dirty helping people, yet I can still feel as or more legitimate than someone who is actually on the front line

Quote

Most of my personal and professional successes are due to EA

Quote

Having a "noble" central purpose in my life makes the individual failures in (the rest of) my life feel more bearable
- effective altruism (ea),
- [note::Can definitely relate to this - I think my involvement in EA has provided an excuse for cutting corners/not reaching my full potential in other areas of my life.]

Quote

Non-EA liberal Western society feels increasingly identity-driven, and I like to feel appreciated for my intellectual and community contributions, regardless of how I look

Quote

We shouldn't beat ourselves up about these motivations, IMHO. There's no shame in them. We're hyper-social primates, evolved to gain social, sexual, reproductive, and tribal success through all kinds of moralistic beliefs, values, signals, and behaviors. If we can harness those instincts a little more effectively in the direction of helping other current and future sentient beings, that's a huge win. We don't need pristine motivations. Don't buy into the Kantian nonsense that only disinterested or purely 'altruistic' reasons for altruism are legitimate. There is no naturally evolved species that would be capable of pure Kantian altruism. It's not an evolutionarily stable strategy, in game theory terms. We just have to do the best we can with the motivations that evolution gave us. I think Effective Altruism is doing the best we can.

The only trouble comes if we try to pretend that none of these motivations should have any legitimacy in EA. If we shame each other for using our EA activities to make friends, find mates, raise status, make a living, or feel good about ourselves, we undermine EA. And if we undermine the payoffs for any of these incentives through some misguided puritanism about what motives we can expect EAs to have, we might undermine EA.

Quote

The first one is tricky, as affiliation with high-status people and organizations can be instrumentally quite useful for achieving impact--indeed, in some contexts it's essential--and for that reason we shouldn't reject it on principle. And just as I think it's okay to enjoy money, I think it's okay to enjoy the feeling of doing something special and important! The danger is in letting the status become its own reward, replacing the drive for impact. This is something we need to be constantly vigilant about, as it's easy to mistake social signals of importance for actual importance (aka LARPing at impact).