That earlier conversation cleared up my own feelings about EA a bit, and they are as follows:
I give to charity – and to “effective” charities specifically – not because of any abstract argument I have been given, nor out of a feeling of moral guilt, but simply because once it was specifically pointed out to me that I could, it seemed like something I wanted to do. It is consistent with the values I already have.
I do it because it is something I “should” do, but it isn’t the sort of “should” that comes from some ethical theory bearing down mercilessly on me. It’s the kind of “should” that makes me, during everyday life, actively want to help people. Which is an impulse I have – not an infinitely compelling one, not one to which I will sacrifice all comfort and beauty and all other good things, but still a natural impulse. Not something alien bolted on from the outside.
And the reason I want EA to spread is not that I think I have my hands on the correct ethical theory and want people to follow its dictates, their own values be damned. My guess is that EA-like giving is – as it was with me – already consistent with many people’s values. In which case telling them about EA would be less like guilting them into doing good, and more like making them aware that an action, of the sort they like to do, is available to them.
Imagine that it is discovered that whenever anyone says the word “bagel,” there is a 1/1000 chance that some randomly chosen person with cancer will be completely cured. What would happen? Certainly, this would precipitate some guilt crises – some people would feel bad about doing anything except saying the word bagel over and over again, and some people would argue that this feeling was right. But the main reaction, I think, would be joy, and not just for cancer sufferers and those who know them. People would think: my God, I can help someone so much, just by doing something so easy! They would say “bagel” again and again with glee – not to the point of destroying their lives, but often, when they get a spare moment. Such a complicated world, and such a simple way to help people so much! Bagel! Such a shining piece of unalloyed good in a very alloyed world! And what a way to make even our own little lives brighten the world!
We would not be quite so happy if, instead, saying “bagel” merely had a 1/1000 chance of curing a less serious chronic illness – say, Tourette Syndrome (which I have, so I am allowed to say this). In EA, this natural difference in feelings leads to cause prioritization, and puts the “E” in “EA.” It is not some strange alien philosophical thing. It is a normal aspect of our sentiments.
I find myself frustrated both by a lot of the EA movement and by a lot of anti-EA sentiment. The EA movement tends to focus on the more philosophical, controversial, guilt-inducing aspects of the issue – the equivalent of telling people about the bagel thing by haranguing them about how awful it is to do anything but devote your life to maximal bagel-saying. Anti-EA writing tends to focus entirely on opposition to those philosophical claims, which is fine in itself but pushes people away from discovering how wonderful a thing it can be to donate to effective charities. It’s the equivalent of writing an article about these weird people who want you to do nothing all day but say “bagel,” and the problematic moral axioms involved, without ever mentioning the fact that saying “bagel” is magic.
The reason I want the EA movement to exist is that I want more people to discover this new action that is consistent with their values. I don’t literally mean that people don’t know donation is possible, but in my experience it takes some initial push to make them realize that they could be doing it right now. Many non-religious people, including me, grow up never thinking about it as a possibility, because no one around them talks about it. Having a community of people doing it, as in religion, also helps – which is something the EA movement has the potential to do.
But the EA movement is certainly not ideal at selling itself for this purpose. I wish it had nothing to do with Peter Singer, with his “really we should all just be ascetics” extreme utilitarianism and his ill-informed views about disabled people. I wish it were less about “really we should all just be ascetics” in general. Forget about asceticism and just focus on communicating the very simple fact that because of the declining marginal utility of wealth (among other things), you can help other people in marvelous ways at extremely low cost to yourself. This should be a cause for joy. Bagel!
Since I only really care about the EA movement as a vehicle for making people aware that they can give and that they probably already want to, arguments over EA often look strange to me. Little or no attention is paid to whether the critics themselves donate, or whether EA caused them to donate. Notice, in the article I linked this morning, this astonishing buried lede:
The utilitarian was no longer a theoretical construction to do dialectical battle with; he was knocking at the door armed with pamphlets, asking me to sign away 10 percent of my income (I was happy to oblige) and, in the seminar room, claiming authority over how I was to live (which I respectfully declined to concede to him).
Reading on, it is clear that the author is not on board with the notion of effectiveness. But it sounds like the EAs got him to donate.
This is a very strange situation: the fact that the author was awakened to a new way of doing good in the world is relegated to a sidenote (literally in parentheses), while he writes many paragraphs about the problems with the people who thus awakened him.
A man came to the door and told me about the bagel trick. What is the bagel trick, you ask? Unimportant, although I will note that I now practice it regularly. What is important is that this man’s philosophy is wrong and you should not listen to him.
What the hell is going on here?
You may have noticed that I don’t actually seem to be a utilitarian. If I were, I would be much less concerned with the absolute numbers of people donating and more concerned with the total quantity of donations. I might be much more interested in strategies like “earning to give,” which are weird and scary in ways that push people away from EA, but which may have the potential to create more donations, overall, even when you take that pushing-away into account.
I’m not sure if I’m a utilitarian, but I think a utilitarian could still bear with me here. What I don’t want is for the EA movement to wither away into a strange, nerdy footnote that leaves most people cold. What I want is for it to flourish into an overall secular culture of giving that can engage a wide range of people. (Peter Singer wants this too, although I’m not sure he is effective at achieving it.) In the short term, yes, a few high-paid “earners-to-give” can numerically outweigh a bunch of well-meaning but lower-earning donors. In the long run, if we actually create a culture of giving, the number of EAs who just happen to work high-paying jobs will far numerically outweigh current earners-to-give, without anyone even having to adjust their career path.
EA, my EA, is a very normal sort of thing, one with broad appeal, and I hope it will be adopted very widely. I want a world where ordinary high-paid managers follow GiveWell recommendations because this is a normal thing for people to do. If we normalize giving – even those of us who can’t give much – the sheer masses of well-off normal people who give will far exceed anything that a small number of utilitarian nerds can do alone.
fwiw, although i have a number of issues with the EA subculture, i can’t find much to disagree with here.
(they still need to get the fuck rid of MIRI though. what the fuck. that shit’s indefensible when it’s literally that or the mosquito nets.)
There’s an AI at the end of the universe that will give you malaria if you don’t donate enough to mosquito nets.
Honestly, Effective Altruism is probably the number one reason why this neo-reactionary nonsense needs to be deconstructed and more importantly, have a good alternative offered to it for the type of people who want it. Effective Altruism is scarily influential as a movement in Silicon Valley, determining where some extremely rich people are donating millions of dollars. (I particularly like this article from Vox on it: http://www.vox.com/2015/8/10/9124145/effective-altruism-global-ai This was the first time I had heard of Roko’s Basilisk, although he doesn’t use that name.) While I suppose spending money to minimize the chance of some AI apocalypse is less bad than spending it on yachts, it’s a hell of a lot worse than spending it on charities that actually make a difference in the here and now. As someone who is heavily personally invested in both climate change and poverty alleviation (locally and internationally), the fact that money is going towards this nonsense instead of clean energy, local economic development, and social services is really depressing.
Effective altruism has a whole lot of other problems - namely that young, rich, white mostly boys are determining what is “effective” for people vastly different from them. But the neo-reactionary stuff is really bad.
MIRI aren’t neo-reactionaries. I don’t think there are any neo-reactionary charities. (There may be reactionary charities, but “neo-reaction” is a small, specific movement that mostly exists on the internet.)
But yes, it bothers me how many EAs give to MIRI. The kind of broad social movement I want is probably incompatible with this association. (This doesn’t mean that “kick MIRI out” is the right action in the short term – I don’t know the right action in the short term – but I can’t imagine the long term going the way I want without EA losing the association with godlike AIs it currently has in the public mind.)
(via storitellerb)
