
theworldgate asked: To be honest on the stormingtheivory thing it's, like, classic internet communist "I am feel uncomfortable when we are not about m(y views)?". Also, I found zir comment about 'Drax the destroyer logic' hilariously terrible. Ze's reblogged stuff about how Drax is coded autistic. Ze was basically using 'autism logic' as an insult. In 2014. Like, if I was still following zir at that point I'd be upset (I'm autistic and have issues with hyperbole myself), but now it's just plain funny to me. (1/2)

(2/2) (or, rather, the comment in isolation is. like, is that the best retort you can manage? ooh big scary autisms? (literally big and scary since we’re talking Drax here). Once I remember that stormingtheivory actually did hurt hot-gay-rationalist I do feel bad though.).

Yeah, I felt similarly about it.

I’m not autistic AFAIK (though I wonder sometimes), but I get really prickly about this kind of rhetoric because I’ve had it slung at me a lot in the past.

Also I’m just really skeptical of the idea that more ambiguous or less literal forms of communication are somehow aligned with leftist politics.  This is an idea I feel like I see all over the place — it has contributed to, say, an increasing distance between leftist academics and the people they claim to support.  Yet it doesn’t make any sense to me and mostly just seems to result in less communication.

Anonymous asked: Speaking as a veteran of the TvTropes/Something Awful wars, tone deafness can be nice to allow you to meet weird cool people who might be kicked out of other places, but it also lets in terrible, terrible people who are adept at using the "simple country lawyer" routine to stay around while being humongous hateful racists or actual literal pedophiles. So it's a double-edged sword and you kinda have to decide how much you want the first group.

I think this is an important concern and I defer to your experience here.

OTOH I guess I hold out the – possibly too idealistic – hope that once someone is revealed as this sort of person, it will be clear enough that they can be banned/whatever without violating community norms.  The kind of norm I’m thinking of isn’t “no one gets banned ever,” it’s more “get to know the person and why they’re saying these kinds of things before you consider banning.”

And of course that will create a community that is very much not for everyone, but that’s what I’ve been saying about LW, that I don’t think it needs to be for everyone.

Like, that recent argument between stormingtheivory and others was weird for me to read because it seemed to me like stormingtheivory was saying “you should be less tonedeaf so people like me have more access to [unspecified thing that is the point of the Less Wrong community].”

For me, though, the tonedeafness is the point!  It is very very comforting to know people who will calmly hear me out when I don’t understand the reasoning behind some consensus opinion, rather than brandishing the familiar Rob Doesn’t Get It (my brain fills in: “Because On Some Fundamental Level He Never Does”) response.  This is a good thing for some people.  I’ve heard Scott, Ozy and Esther (and probably others I’m forgetting right now) express appreciation for this thing in similar terms.

On the other hand, the actual belief content of Less Wrong seems to me mostly either obvious (“many philosophical debates are the results of disagreements over semantics” – yeah, we know), false (hardcore Bayesianism, singularity is near enough to be reasoned about, Friendly AI), or benign self-help advice one can find in many other places (if perhaps not in Japanese).  I don’t care about giving more people access to these things because they’re useless.

Indeed, all of the actual non-ironic interest I have in LW as a community comes from norms like what I’ve been calling “tonedeafness,” not from the core texts (what, if I were less averse to the Scientology comparison here, I would be tempted to call “the tech”).  For me at least, throwing out the tonedeafness would be throwing out the baby in the name of the bathwater.

This is amusing and maybe ironic: my opinion on the “is Less Wrong inclusive enough” issue is nudged toward lack of concern by the fact that I don’t think LW’s version of “rationality,” as a sort of “technology” for achieving results, has anything really unique or valuable to offer.

As far as I can tell, the main positive effect of LW on people is to provide a safe space for people who have anxiety-related issues about “what if I say Evil Things and everyone learns I’m Fundamentally and Irremediably Evil Forever.”  To these people, the LW community norm of just plodding cheerfully along in arguments against arbitrarily nasty opinions rather than dismissing them is deeply reassuring: it means that if they have Evil Beliefs they can be disabused of them in a pleasant way rather than one that feeds their anxiety about being Fundamentally Evil at the core.

This is a legitimately useful function and I’ve seen LW serve it for a number of people of varying backgrounds.  Meanwhile, the “reading the Sequences will give you Bayesian superpowers” bluster and the like is bullshit, and so there’s no need to worry whether superpowers will be unequally distributed (i.e. given only to those with a sufficient tolerance for calmly reasoning with Evil).

Memories of a drunken evening many years ago with my beloved classicist/bluesman friend Sam: I had described transhumanism and the concept of “the singularity.”  I said that there were people today who considered their work a kind of preparation for the singularity.  He was delighted.

I said, drunkenly, “but it’s like a worldview based around the brittle certainty you feel right after you’ve had a lot of coffee in the morning!”

He said, drunkenly, “exactly!  And I spend all my time reading deconstructionists and thinking about how uncertain everything is, and that sounds so refreshing!  I love the singularity!”

Scott’s last two main blog posts are really good IMO.

(I mention this because I seem to spend a lot of time arguing with him or talking about how he is too tolerant of neoreactionaries or whatever, and I feel like for balance I should give some examples of why I like his writing in general.)

dannythestreet:

nostalgebraist:

slatestarscratchpad:

nostalgebraist:

again, you’d think a group interested in the science of cognitive biases would talk more about the cognitive biases involved in, say, sexism, or racism, or homophobia, or (etc), what with those things being thoroughly documented in the literature…

I remember there being a big (and bizarre) argument about this way back on Overcoming Bias ca. 2008

I think the line being pushed at the time was “we only talk about biases in the most general sense and leave specific applications up to you.”  This may have been an attempt to make the site more apolitical than social science academia, at the cost of ignoring some of its work?  This idea may have fallen out of favor since then, I dunno.

(As I’ve mentioned before I think a lot of biases only go away if you think about them in the context of “specific applications,” so I think “leaving the applications up to you” kills a lot of the point of talking about bias.)

We’ve tried talking about racism and sexism a bunch of times. Just to start with, and limiting myself to top-level posts:

http://lesswrong.com/lw/533/manufacturing_prejudice/

http://lesswrong.com/lw/53/the_implicit_association_test/

http://lesswrong.com/lw/5d/fight_biases_or_route_around_them/

http://lesswrong.com/lw/4w/bogus_pipeline_bona_fide_pipeline/

http://lesswrong.com/lw/59i/offense_versus_harm_minimization/

http://lesswrong.com/lw/134/sayeth_the_girl/

http://lesswrong.com/lw/fmw/lw_women_entries_creepiness/

http://lesswrong.com/lw/efs/call_for_anonymous_narratives_by_lw_women_and/

http://lesswrong.com/lw/8e2/transhumanism_and_gender_relations/

http://lesswrong.com/lw/13j/of_exclusionary_speech_and_gender_politics/

http://lesswrong.com/lw/374/gender_identity_and_rationality/

Some points:

1. That is about ten times more top-level posts than have ever been posted about neoreaction.

2. So anyone who says there’s an “imbalance” between discussion of racism/sexism and some opposing view doesn’t want discussion split 50:50. They don’t even want it split 10:1. They want all discussion agreeing with them and no one else allowed to talk. Anything else is an “imbalance”. I don’t think this is dishonest. They’re just used to spaces where it works that way, and any other way seems weird and off-kilter and unfair.

3. There’s only so much you can say about something everyone else is already talking about everywhere.

4. Anyone continues to be able to post anything they want on Less Wrong. The “balance” of posts there reflects solely who chooses to write something up and press that “post” button. Instead of complaining how no one ever writes about your favorite bias on Less Wrong, why not write about your favorite bias for Less Wrong?

You’ve cast a wider net here than the one that I am casting or that (I think) twocubes is casting.  These all fall into the category of “about race and gender,” but not all into the category of “about racism and sexism” — e.g. there’s no way to read “Transhumanism and Gender Relations” as saying anything about present-day sexism.

Indeed, it’s exactly as (un)related to present-day sexism as the original EY post about catgirls; I’m not sure what makes “Transhumanism and Gender Relations” more worthy of inclusion than the catgirls post, except the fact that its futurist projection is more palatable to the average feminist than EY’s.  But I’m not asking for posts that feminists would like, or even posts by feminists — I’m asking for analyses of (in this case) sexism from a heuristics and biases perspective.

Several of these posts (four by my count) seem to be part of community-internal discussions of sexism within the LW community — one discussion from 2009 (you included EY’s and Alicorn’s posts from a larger series of posts) about possible sexism in LW as an online space, and a later conversation from 2012 about creepy male behaviors in LW meetups (you chose two posts resulting from this discussion).  I think it is good that this happens, but it’s not an “LW analysis of sexism,” it’s “analysis of sexism in LW.”  No turning the rationalist telescope out on the wider world here.

Then there are a bunch of posts by you.  It feels weird to raise that as an objection — it sounds like I’m just making up weird objections at this point.  However, I already like your writing, much more than I like “LW culture” in general; for instance, I read your blog regularly, but not the main site.  So these posts don’t convince me that any thinking of this sort is going on within LW except for the kind that is going on at SSC, and at the moment seems to have isolated itself to SSC.  Moreover, even these posts aren’t especially good examples of what I’m looking for: the IAT post for instance uses the IAT’s role in prejudice research only as a stepping stone to more “general rationality” applications when it’s precisely discussion of prejudice as prejudice that I think is important for the sake of reducing bias.

(There are two entries on the list I haven’t talked about.  One, “Manufacturing Prejudice,” is in fact about analyzing prejudice, and isn’t by you, but it is also not very interesting; modulo language choices, it’s the kind of post I would expect to see on tumblr and scroll by quickly because I already know and agree with the kind of thing it’s trying to say.  If this is what LW has to offer on these subjects, I’ll just stay on tumblr.  The other, “Gender Identity and Rationality,” is at least about analyzing gender from a uniquely “LW” perspective, so I’d say it’s the closest thing to a “hit” this list has by my criteria.)

That was a lot of words, but what it all comes down to is that I’m looking for serious investigation into sexism and racism, preferably with reference to the academic literature, and treated as topics of general interest (for people interested in understanding society) rather than things to be used as stepping stones to “general rationality” insights or as things to be reluctantly addressed when they arise as community problems and then ignored later.  If that’s a tall order, so be it.  It’s not the same as just “talking about racism and sexism,” though.

(The “equal time” thing is a bit complicated, but much of my objection follows from the above — equal time between “gender/race” and “neoreaction” is not a fair standard because “gender/race” as it’s defined above is a gigantic net that appears to exclude stuff like Eliezer’s catgirls post only because they are not leftist enough.  It’s as though I had said there weren’t enough posts about whether learning to code was a good idea and you had linked me to every post that talked about programming, or every such post that you perceived as being sufficiently “on my side.”  “Post fits my specific, perhaps restrictive interest criteria” != “post is far enough to my side of some ideological spectrum to balance out some other post I don’t like in some quota agreement.”  There may be plenty of posts on my side yet few I find interesting to read.)

Topics aside, “Neoreaction” vs “The left” is still an absurd framing, especially when Scott is suggesting that a 50:50 split would be reasonable. Imagine if the boot was on the other foot, and LessWrong had a significant Posadist minority (not unreasonable on a site devoted to existential risk and superhuman intelligence, and they are much less pro-apocalypse than Nick Land & co., at least qualitatively if not quantitatively) with many of the liberals on the site taking Posadist thought quite seriously and putting some effort into engaging with them.

Imagine this parallel universe Less Wrong has its own Scott Alexander, who has written an extremely long and well-researched anti-Posadist FAQ, and is often heard to remark that the superficial details of Posadism - the guff about dolphins and deformed workers’ states and so on - are mostly just a sort of labyrinth for deterring and/or detaining the stupid, like the “Adso admires a door” chapter from The Name of the Rose, and that the meta-level ideas, regarding class and ideology and the possibility of acausal communication with inhuman intelligences, are the really interesting bit.

And imagine this alternate universe Scott responding to a commenter worried about the site’s left-wing bias by listing all the topics on Less Wrong devoted to taxes and bureaucracy and the definition of marriage, pointing out that these far outnumber the topics devoted to the life and thought of Juan Posadas, whose supporters are only about as numerous on the site as conservatives or libertarians.

It’s almost as if, our goateed mirror universe Scott scoffs, they don’t want a 50:50 split between discussion of the size of government and some opposing view, they want all discussion agreeing with them and regard any mention of the positive benefits of an all consuming nuclear fire, a final and deadly confrontation between the proletariat and the bourgeoisie in which greater than half of mankind would perish, and a new and more noble humanity emerge from the ashes, forged in atomic fire, united by common purpose, and having obtained through unimaginable suffering a higher degree of consciousness, the illusions common to all previous ages of man shattered irrevocably by the intimate knowledge of the terrible power within their control, as being off-kilter and unreasonable.

I mean, it’s not really necessary to go that far; just imagine if Crooked Timber tried to claim they were a politically unbiased site because they have about as many conservative commenters as they do communist ones. Nobody would take that seriously, would they? We’d agree that the mere fact they framed it that way indicated how far to the left they were.

This is perfect.  (And thanks for making me aware of the Posadists!)

(via clawsofpropinquity)


(via slatestarscratchpad)

Today’s tumblr dash argument was extra fun because it was “Less Wrong vs. a person I coincidentally knew of, and disliked, thanks to the Homestuck fandom”

I am becoming a bitter old internet person

In light of the fact (if it is a fact) that some biases don’t change unless you think about them in context, maybe one could pitch explicit discussion of sexism and racism to Less Wrong as “effective bias reduction.”  You know, like “effective altruism.”

It would be, like, rationality squared, man!