
One nice thing about thinking you can solve every problem with your personal reasoning powers is that it can work like a built-in harm reduction mechanism when you run into a tough problem.  “Gee, this sure is complicated,” you’ll think, and spend 10 years diligently trying to figure it out – because we have to be able to solve it by plowing through and figuring it out, right? – and at least in those 10 years you aren’t harming anyone through your cogitation.

Not all hyper-zealous rationalism works like this, and some truly terrible things have been done by hyper-zealous rationalists whose response to “gee, this is complicated” was not “let’s spend 10 years thinking about it” but “let’s make it less complicated so we can understand” (James C. Scott’s “legibility”).  But a certain sort of hyper-zealous rationalist, at least, has the feature of automatically rendering themselves inert when confronted with something they might fuck up.

“This is a tough situation, but all situations are solvable by the diligent application of reason, let me seclude myself until I figure it out [secludes self indefinitely]” is a comforting response.  It reduces potential harm!  The world may be burning while you’re computing the fourth-order correction to the effects of potential action #15435, but at least you aren’t pouring fuel on the fire.

Much more frightening (to me personally) is “this is a tough situation, but one must act in tough situations, and who can really say what is or is not justified by reason in this fallen morass of a world, ultimately we must fall back on more basic dispositions and convictions which are essentially contested and thus not worth wasting breath on, and anyway what I’m saying is we should totally launch the Doomsday Machine right now.”

Her mother seemed not to notice. “And don’t you think we’ve thought about this?” she said. “I spend almost as much time talking to the people at MIRI as I do my own team!” Erica tried to picture her mother having long conversations with Eliezer or Luke from the Machine Intelligence Research Institute, the people who were always trying to get people to realize that superintelligent AI was a huge existential risk to humanity. She couldn’t picture those conversations. Her mother leaned forward and looked at her again. “The thing is, honey, this superintelligent AI, if that’s what it’s going to be, is not going to be some imaginary Yudkowskian nightmare who’ll convert the universe into paperclips or smiley faces. It’s going to be me. And I may not be wise or perfect, but I care about humanity. That’s why I’m doing this.” Now her voice caught. “I’m not doing this so I can become super-powerful, or rich, or famous. I’m doing it because we have an opportunity to change the world. Do you understand how much suffering there is in the world? We can end suffering, Erica. That’s what motivates me.”

She put her hand out and rested it on Erica’s hand. Erica didn’t take her hand, but she didn’t pull hers away either. She sighed heavily and looked out the window again. The sun was going down. Some teenagers were goofing around, playing ‘keep away’ with a laughing girl’s take-out bag on their way to their car. “I know that. I just hope you don’t end a lot of things we care about in the process.”

LW weirdness of the day: someone is writing a children’s/YA book about MIRI and the singularity

Anonymous asked: Who would win in a fight, you or Eliezer "Big Yud" Yudkowsky?

I think he’s bigger than me (that isn’t meant as a fat joke, I just straightforwardly mean that he is a larger man than I am) and I doubt either of us knows how to fight, so probably him?

On Arrogance

scientiststhesis:

scientiststhesis-at-pillowfort:

arrogant
adjective
having or revealing an exaggerated sense of one’s own importance or abilities.

A friend of mine once mentioned on a comment written in response to some post or another in a facebook debate group that he had knowledge of maths far above the Brazilian average. That is a simple factual sentence, a true statement (which isn’t exactly surprising given what the Brazilian average…


I think there is another wrinkle to this.  When I use the word “arrogant” (or say similar things without using the word itself), I’m usually talking less about the person violating social rules and more about them being overconfident.

I think a lot of people who are actually good at certain things sometimes lack appropriate “epistemic humility” because they overestimate how broadly their talent can be applied.  Even if “I’m good at math” is true, “I’m good at math so I’m likely to be right about X” may or may not be true, and in any case isn’t a substitute for an argument for X.  (One way of putting it is that some competent people get in the habit of making arguments from authority where they are the authority.  Sometimes this happens entirely inside their minds and the only externally observable consequence is overconfidence.)

You point out that openly expressing one’s skill can make it look like one is looking down on other people.  I think that’s true, but it’s not the only pitfall: the other is that it may suggest one is engaging in an “I’m good at X, so I’m qualified to judge Y” thought process and becoming overconfident (“poorly calibrated”) as a result.  I’ve seen this a lot, and perhaps especially often among people who really are as good as they say.

We need to pay attention to why things like “I’m good at X” are being said – even if the statement is true, what claim is it being used to justify?  Is the claim actually justified by the statement?  (In logic terminology: someone explicitly says “P” and implies “P → Q”; then when called arrogant they can fall back on “well, I just said P, and P is true,” when the real problem is with “P → Q.”)
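To make the logic of the aside explicit: the conclusion needs both premises, so defending the stated one doesn’t rescue the implied one.  A minimal sketch in Lean (the proposition names are just placeholders):

```lean
-- Modus ponens needs both P ("I'm good at math") and
-- P → Q ("…so I'm right about X") to conclude Q.
example (P Q : Prop) (hP : P) (hPQ : P → Q) : Q := hPQ hP

-- Defending only `hP` when challenged leaves the load-bearing
-- premise `hPQ` unproven; without it, Q doesn't follow from P.
```

The point being that the arrogance complaint usually targets the unstated `hPQ`, which is exactly the premise the speaker never has to defend.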

While I’m on the subject: does anyone know of a good introduction to MWI for non-scientists that can be recommended without reservation?

I think Yudkowsky’s quantum physics sequence is actually really good at providing the intuitive appeal of MWI – that an element of a superposition that doesn’t end up being observed can’t be thought of as a “path not taken,” it really seems like some real thing that may have exerted physical effects before the observation.

Of course Yudkowsky fails to mention MWI’s problems, and more generally fails to give the reader a sense of how the equivalence of different bases is a fundamental fact of QM reflected in the Born rule and confirmed by experiment, which makes MWI weirder, at the very least.  (He has a strange post about how the position basis is more fundamental because it’s local – but the Born rule works just the same if you measure something nonlocal.)

But the only other popular presentations I know of are in books by Deutsch and Tegmark, and while I haven’t read either of them, both of those authors are notorious for presenting established ideas and wild speculation in the same text.  Is there any “popular MWI with no far-out stuff” text?

bicyclicada:

bicyclicada:

Something about steelmanning in a discussion really rubs me the wrong way. I put effort into saying things. And someone responding to an argument I make by obviously trying to steelman it isn’t respecting that effort and, more importantly, isn’t treating me like their equal in the conversation.

Steelmanning as an intellectual exercise makes sense. But I am a person not an intellectual exercise!

I think people conflate steelmanning with the Principle of Charity when they’re really two different things, and it causes problems.

The Principle of Charity is “I’m going to act like you’re putting effort into saying things and choose the best / most sensible interpretation among the possible things I think you could have intended.”

Steelmanning is “I’m going to choose an interpretation of your words that is better than the best / most sensible interpretation I think you could have intended – you probably didn’t mean it, but perhaps you should have meant it.”

At least that’s what steelmanning seems to be when people explicitly say they’re doing it.

(via thededekindadafunction-deactiva)

Anonymous asked: Different person from last ask. I checked out that dude's blog and saw him going off on women liking Death Grips being race traitors. He's not super popular right? Please tell me he's not.

He (countersignal / nydwracu) is a relatively well-known neo-reactionary.  That means he is “popular” I guess among people who like neo-reactionaries, and unknown to most everyone else.  He seems to be integrated somewhat with the tumblr Less Wrong social group which is not really surprising given Less Wrong’s baffling interest in neo-reactionaries.

Like all neo-reactionaries his writing gives me a huge “you are just reversing leftist conventional wisdom to be edgy” reaction (it’s like reading a version of G. K. Chesterton who grew up on 4chan, and I can’t even stand more than a few pages at a time of the actual Chesterton) plus a lot of the reaction I talked about here.

(In fact, he is a lot more transparent than his peers about how he is basically making up social science technobabble on the fly to justify the claim that if he feels an immediate distaste for X then X is surely going to lead to an objective deterioration of society that would worry even John Rawls.  It just seems so pointless, you know?)

(Now that I wrote the phrase “making up social science technobabble on the fly” I can’t help but think of neo-reactionaries as sharing some spiritual kinship with Star Trek writers.)

su3su2u1:

nostalgebraist:

Okay I am going to have to stop reading The Name of the Wind now before I damage my eyes by rolling them too much, sorry Pat Rothfuss

(I just read the part where Kvothe takes a college entrance exam at age 15 and impresses the pants [figuratively] off of his examiners even though it’s been three years since he studied any of the material, and they decide to give him the first full scholarship in the history of the university because he is just. that. special.)

Whoa… sounds downright HPMORish.  The key to enjoying it might be to read every 3rd sentence or so, and skip all but the first and last lines of all dialogue. 

It is eerily HPMORish.  In the sequel there’s an episode in which the main character encounters a succubus-like magical seductress and literally uses his rationality techniques to resist her deadly charms.  (You know, actually, I don’t remember the gender stuff in HPMOR ever getting quite that clompingly obvious.)

I have resorted to a more extreme version of your suggested method, which is not reading the books at all and just reading a hateblog.  So the same thing I do for HPMOR, come to think of it … 

(via su3su2u1-deactivated20160226)

bartlebyshop replied to your post “bartlebyshop replied to your post: Okay I am going to have to stop…”

You could just skip to that section of WMF. You really don’t need any of the plot beforehand to amuse yourself with Felurian’s section.

anoteinpink replied to your post “bartlebyshop replied to your post: Okay I am going to have to stop…”

frankly em and i thought that was the most entertaining part of book 2, and bartlebyshop is right - it Needs No Context. (the context involved us repeatedly shouting BUT WHERE’S THE SEX FAIRY)

On you guys’ good advice I have downloaded WMF and started reading/skimming the Felurian chapters and

1.  oh my god

2.  I think this is actually worse than I was expecting

3.  I definitely was not expecting this to directly involve, uh … methods of rationality

I had a moment of perfect, clear lucidity that resembled coming up for air and quickly closed my eyes, trying to lower myself into the Heart of Stone.
It didn’t come.  For the first time in my life, that cool taciturn state escaped me. Behind my eyes, Felurian distracted me. The sweet breath. The soft breast. The urgent half-despairing sighs that slipped through hungry, petal-tender lips. …
Stone. I kept my eyes closed and wrapped the calm rationality of Heart of Stone around me like a mantle before I dared even think of her again.
oh wow? neat?…um? im…sorry???

Like the guy said some really nice things about it, and I’d feel like a dick responding to that with anything but positivity

At the same time I keep getting the feeling that the people who like my writing tend to be people who are into stuff I can’t stand, which is really weird

It’s like “thanks, podcast guy – sincerely – but also, can I take you aside for a moment and ask you if the HPMoR phenomenon is some sort of vast practical joke or something?”