Fiction recommendations:  Have you seen BBC’s Sherlock?  Imagine that he’s Lord Voldemort, John Watson is Harry Potter, and the two of them meet after time stops just before a nuclear war starts.  That’s How I Learned To Stop Worrying and Love Lord Voldemort, by cheryl bites.  And another word to the ever-growing MLP Loops, which contains by far the best confrontation I’ve ever seen between Fluttershy and 40K’s God-Emperor of Humankind, and is now up to 69 chapters and 638K words.

Remember what I said about Eliezer Yudkowsky giving the worst fanfic recs

On that topic, never forget:

spacedhamlet:

Trucks 16 part 2!

A bonus trucks episode in the middle of the week! How lucky you are! This one rounds off part one’s discussion with Rob about Less Wrong, and we also discuss the many nuances of Southland Tales, the worst movie ever made.

Follow Rob for a Tumblr experience you won’t regret.

Oh yeah, and if you’re just in it for the terrible driving, this one has the WORST DRIVING

Follow-up to the video I reblogged on Sunday – I haven’t listened to this yet but from what I remember it’s more of a back-and-forth discussion than the last one was, which is nice

You are personally responsible for becoming more ethical than the society you grew up in.

Eliezer Yudkowsky (via abundance-mine)

Said the guy who literally believes he’s smarter than the rest of humanity and wants to build an AI that will take over not just the world, but the *universe*, and fill it with whatever One True Way all of humanity *really* wants (which it will discover).

And thinks there’s a good case to be made that children under two are not “sentient,” but that this is too much of a shock to our morals and values and we are not rational enough to question those unless we’re a rare scintillating jewel of an intellect.

And thinks that in a morally more-enlightened world than our own, people might think of rape as basically just a pleasant surprise. 

And is a big fan of evolutionary psychology as an explanation for why men and women are literal aliens to each other, and for racial IQ differences and “achievement gaps”, and has said outright he thinks that rich people really are smarter, more engaged and more alive than others (and that he really hopes to fix that because gosh isn’t it unfair). 

And has routinely boosted his own “Machine Intelligence Research Institute” (formerly the “Singularity Institute” until someone there realized how that made them all sound) as the single most ethical source for all charitable donations.

And…you know what, I don’t even want to go *on* here. XD But this sort of quote that sounds really profound and insightful (and clashes horribly with the rest of what he says/does) is sort of his bag, I guess…

(via amaranth-mantis)

I was unfamiliar with the source, but that sounds…special. :/

(via clatterbane)

Yeah it’s pretty horrible.  I know someone who fell in with his cult of personality, and it’s changed them beyond recognition.  They fell in with him after researching how to best invest their money… and of course coming to the Singularity Institute as the best possible way to do it.  They seriously believe they’re saving the world.  They probably will be insulted if they read this and find that I no longer believe them, not even remotely, after doing the research.

Another fun thing his group has done is attempt to convince people that real actual disasters facing the planet right now are not actually important, compared to the possibility of the Singularity.  So everyone should forget about climate change and feeding the world and all this other important stuff, and instead focus on making sure that a super-intelligent computer won’t be able to do anything bad to us.  Also?  Those problems won’t be problems once the Singularity comes.  Because the Singularity is essentially so advanced as to be magic, and will be able to manipulate people into doing whatever it wants, and even create elements out of nowhere so that we will no longer have shortages of anything, and everything will be fine.

Honestly I think part of the reason that people flock to groups like this is that the state of the environment is horrible, we’re facing catastrophic events in the next couple hundred years, even possible extinction… and it’s better to think that the real threat is a computer that will probably never exist — and to also believe that this computer will be their savior if they can get it to be on their side instead of destroying humanity or something.

And the reason this doesn’t make sense to anyone with common sense?  Is apparently because we’re not logical enough.  Human brains, you see, weren’t built for understanding a situation like the Singularity, so we dismiss it out of hand.  Instead, we need to learn to think Logically And Rationally, and then the Singularity will make sense as this horrible threat, perhaps the worst threat facing the world today, and we will want to pour all our time, money, and energy into the Singularity Institute.

The last conversation I had with this person was harrowing and felt like having my mind shredded.  So does reading their website.  There’s, for lack of a better word (and they’d laugh at me), a really fucked-up feeling, almost an ‘energy’, around this, that I don’t trust and actually fear quite a bit.  It’s intense, it’s deliberately constructed, and it’s as foul as foul can get.  I badly miss the person in question, but I can’t talk to them as long as they still have traces of that feeling attached to them.  It feels like my mind is being shredded by cheese wire and I have to stay away from things that make me feel like that. 

Please avoid getting sucked in.  It looks laughable, but it’s also quite sinister.  Be aware, also, that they have a lot of PR aimed at getting upper-middle-class and wealthy geeks to give them a lot of money, time, and energy, and that they often succeed.  Don’t be sucked in.  Please.  Even if you can’t see the kind of mind-patterns I can see, this is bad news, it’s far more than something mildly disturbing to laugh at.

(via youneedacat)

The pattern that Amanda Baggs is talking about here— heroic responsibility + the PC/NPC thing + “the most important thing to do with your life is to give money to MIRI” + FAI + Roko’s Basilisk— scares the fuck out of me. 

“It is your job to save the world. People who really matter try to save the world; people who don’t are lesser beings. You save the world by giving us money.” That is scary and manipulative and probably abusive.

Please note that it’s a scary thing to say even if it’s true. You know how Scott keeps saying that you don’t get to dox people and you don’t get to lie with statistics and you don’t get to try to make people get fired for saying things you don’t agree with and so on, even if you are on the Side of Good? Because maybe you are not on the Side of Good, and then you will have hurt lots of people for no reason? You should also not manipulate people into giving you money, because maybe you are not on the Side of Good.

And you should not have that high a confidence about whether people giving you money will save the world, because holy shit motivated cognition much? There’s a reason we have GiveWell and don’t trust random charities claiming they’re the literally most effective thing.

(To be clear here: I am not saying MIRI shouldn’t fundraise. I am saying that there are plenty of ways you can lay out the good you’re doing without saying that people who don’t donate to you are evil, you are going to be remembered in ten thousand years, etc.) 

I am still part of this community because the meme hasn’t mutated into a more virulent form. Most LWers don’t donate to MIRI; I’m fairly open about being Singularity agnostic and no one punishes me; a lot of people say they think that meme is a load of crap; the all time most upvoted post is about how MIRI is a bad charity to donate to. 

And… I still think of myself as a rationalist, because a lot of things in this community matter a lot to me. I started reading Less Wrong my senior year of high school, and the Sequences has fundamentally shaped the way I think. And the community is important to me, because it’s a place where I am respected as a person and I can talk about things that interest me where people challenge me but don’t trigger me. 

But I’m worried that all the good things were invented to sell people on this poisonous meme, and I am worried that my participation in the community is adding credibility and helping more people to get infected with it, and I am worried that the meme will mutate to a more virulent form and I’ll be so far in I can’t get out. 

Non-rationalist friends: think about this before you get involved.

Rationalist friends: Beware the evaporative cooling of group beliefs. Remember that not everything someone who says they’re rational says is rational. Remember that you can be wrong.

I am not going to argue with people about this, but I am happy to talk with people who have doubts. 

(via ozymandias314)

The further I get into it, the more I realize that the latest Slate Star Codex post is a hot mess, but at least I hate it a lot less than the shithead he’s arguing against

Don’t get me wrong, I think Scott Alexander’s rhetoric is terrible and could be straightforwardly improved in so many ways, but those mostly center around him coming off as arrogant and spuriously identifying with socially unacceptable branches of nerddom for the sake of pointless defiance

It has nothing to do with him “using facts” or “taking other people seriously”

I love making fun of Mencius Moldbug’s name as much as everyone else, but the reason I like doing this is because he’s actually wrong about everything, and the Anti-Reactionary FAQ is a great explanation of why

It’s worth having a record of that and I can see why someone would view it as superfluous but if you view it as evil, you have lost me

There are impressionable people out there in the world – there are young people in the world, so many of them, and don’t you remember when you were young and you didn’t know all the stuff you know now? – and they are at least somewhat receptive to reason

no no no no no no no no no no no no no no

“he thinks some evil is actually the result of being misguided and is eliminable”

“his worldview incorporates the idea that some subset of the enemy might actually be human”

“he actually thinks the best, most effective rhetoric might involve sometimes treating the enemy as a real person”

“how naive”

“I don’t actually care about convincing people but I sure do care about making them feel bad about what they actually believe so that their actual opinions are not expressed in polite society but are sure as hell expressed in, say, the voting booth, where no one can see them or stop them”

I’m reading some recent debate about Less Wrong and it’s pushing my buttons so hard I can’t even come up with a coherent and vaguely socially acceptable tumblr post about it

I shouldn’t read rationalists, they make me too emotional

(I’m reading a person who is actually saying that the problem with Less Wrong is that they don’t get rhetoric, see, because some of them say loaded things like “I support eugenics” when they could use less loaded words for the same concepts)

(It’s almost as if some differences of opinions are about important things like whether certain people should be alive and no amount of polite phrasing can change that)

(Not all evil can be reduced to social awkwardness but my god are some people eager to convince themselves as much)

blurds:

spacedhamlet:

TRUCKS 16! …Part one!?

Our friend Rob joined us for an evening’s trucking to discuss some subjects of his fascination - namely the “outsider artist” Henry Darger and the online community known as Less Wrong.

Turns out this gives us quite a lot to talk about - so much that we recorded a second part to this episode, which will go up sometime in the middle of the week!

Check out this video if you want to see some really weird art?

A good trucks! Featuring our solid buddy nostalgebraist, who you should be following if you enjoy thoughtful commentary on the behavior of baffling humans. (snarp - I wish to tumblr matchmake the pair of you)

Hey I’m in this thing!  I was so long-winded it took more than one episode to subdue me!

Watch this if you want to hear me talk in my bizarre nerdvoice about some of the recurring topics here on nostalgebraist dot tumblr dot com

My friends in the video are tumblr users blurds and spacedhamlet and following them is a life choice I encourage

Edit: oh yeah, the thesis I was talking about was “The Transubstantiation of Henry Darger” by Faith Ann Shields, available for free at the PDF link at the bottom of this page

(via blurds)

johnwbh:

I feel uncomfortable with the Lesswrong meme of taking “heroic responsibility” and the idea that you, yes you have to save the world. 

Maybe that’s helpful and motivating for some people, but taking obsessive responsibility for things beyond your control is a pretty common and dangerous component of depression and anxiety. 

Thinking “I have to maximise utility or else I am an awful awful person gaaaahhhhh” tends to happen to me when I’m in deep guilt and anxiety cycles, and isn’t helpful. 

Maybe the advice is aimed at people who have the opposite problem, and have a chronically low sense of responsibility/altruism?

(via ozymandias314-deactivated201404)