
Anonymous asked: What is the basis problem?

su3su2u1-deactivated20160226:

So this isn’t entirely accurate, but I think it’ll help you get the big picture:

So in the usual formulation of non-relativistic quantum mechanics, you have an arena, the Hilbert space, and an object in the arena, the wavefunction.

Quantum mechanics, the mathematical theory, thinks of the Hilbert space as just a mathematical object, like a plane.  And you can put any coordinates you want on that plane (polar coordinates, putting the origin at different places, putting different lengths on the coordinates, etc.).  The wavefunction will look different in different coordinates, but it’s the same wavefunction.

But many worlds wants to treat the wavefunction as more than a mathematical object: they want to say the wavefunction is the reality.  Further, they point to different blobs of the wavefunction and say that this blob represents a world where X happened, this other blob represents a world where Y happened, etc.  But the “preferred basis” question just notices: “wait, if I change coordinates to some other basis, then I get a totally different set of blobs!  And now I have a totally different set of worlds.  That doesn’t make sense.”
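The “different blobs in different coordinates” point can be made concrete with a toy numerical sketch (my own illustration, not from the original post): a state that looks like two well-separated blobs in one basis acquires a completely different lump structure after a unitary change of basis, here a discrete Fourier transform.

```python
import numpy as np

def count_peaks(a, thresh=0.05):
    """Count interior local maxima of |a|, after normalizing, above a small threshold."""
    m = np.abs(a)
    m = m / m.max()
    interior = m[1:-1]
    return int(np.sum((interior > m[:-2]) & (interior > m[2:]) & (interior > thresh)))

n = 1024
x = np.linspace(-40, 40, n)

# "Position basis": two well-separated Gaussian blobs
psi = np.exp(-(x - 4) ** 2) + np.exp(-(x + 4) ** 2)
psi /= np.linalg.norm(psi)

# Unitary change of basis: normalized discrete Fourier transform
phi = np.fft.fft(psi) / np.sqrt(n)

# The state itself (its norm, all inner products) is unchanged...
assert np.isclose(np.linalg.norm(phi), 1.0)

# ...but the blob structure is not: two lumps become an interference
# fringe pattern (the transform of two displaced Gaussians is a
# Gaussian envelope times a cosine), i.e. a different set of "blobs".
print(count_peaks(psi), count_peaks(phi))
```

The same vector, two coordinate systems, two incompatible ways of carving it into “worlds” — which is exactly the worry.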

(I figured I’d reblog this since it’s a good explanation and I’ve been mentioning this without explaining it)

I spend a lot of time looking out through the eyes of superintelligence, a view that tries to eliminate everything that belongs to human minds or evolved minds rather than minds in general. Watching Buffy enables me to do a “hard sync” with normal life, run the brain under its design conditions for a while.

jollityfarm asked: Also also, it's interesting that someone's actions are thought more excusable if they're perceived as intelligent. Perhaps I am projecting (as someone who other people mistakenly believe to be intelligent, but whose actions are maladaptive by any measure and are not related to my job or scores on anything). I guess it goes back to having to continually justify one's existence, being more allowed to exist as a mentally ill person if you're 'smart.' But this could be conjecture.

I’ve definitely noticed that kind of thing in my own life and the lives of various people I know — if you can be read as “smart” then you have a chance at the “brilliant eccentric” box and often that is much, much better than any of the other boxes available to you.  (Cf. a large portion of the internet seems to simultaneously love “autistic genius” types and loathe autistic people who can’t be fit into that trope)

I think there’s an argument to be made that Yudkowsky is responding to this dynamic as much as anyone — he definitely has mental health issues (see here, especially 1.6 and on) and seems to have gone full speed ahead with the “I’m different because I’m A GENIUS” idea that many people toy with and then reject.  I admit I’m sympathetic to this, having felt things like “I’m different enough from other people that I must either be superior or inferior, but not equal” pretty often in my life.  (Usually my brain goes with “inferior” but I imagine the other side of the coin has its perks)

jollityfarm asked: Perhaps I am misinterpreting, but are they seriously insinuating that Big Yud's accomplishments and/or intellect are on par with Hawking?

I think their point is that Hawking also (I think?) believes in the Many Worlds Interpretation, and they’re trying to figure out how much of su3su2u1’s opposition to Yudkowsky’s Many Worlds writing is due to Many Worlds itself (which has smarter / more accomplished adherents), and how much is due to specific mistakes Yudkowsky makes that Hawking wouldn’t make.

IMO, Yudkowsky’s exposition of Many Worlds itself is actually pretty good, but he falls down at explaining where the theory has problems.  And that makes his sequence ultimately very misleading, because “preferred basis” issues are a huge deal for Many Worlds, and he sweeps them under the rug.  It amounts to not mentioning the main reason people don’t believe in Many Worlds.

Anonymous asked: The scariest thing about Big Yud is how such an intelligent man, who spent so much time studying cognitive biases and talking about corrupted hardware, in the end used his intelligence to rationalize leading a cult whose members gave him money and let him sleep with their wives and girlfriends, just like every other cult leader in history. Do you agree?

su3su2u1:

thinkingornot:

su3su2u1-deactivated20160226:

I’m glad I didn’t see this last night, or I would probably have written a much stronger response.  

First, perspective: I don’t think the rationalist community has the abusive side we normally associate with a cult.  If he starts a remote rationalist monastery and the people who join it drop out of contact with their friends and family, then we can start to worry.

That said, I do think that a lot of the writing is unfortunately tinged with spiritual language, and that the purposeful creation of ritual is problematic (whatever else the rituals are meant to accomplish, they will make it harder for participants to think rationally about “rationalism”) but that doesn’t mean he is running a cult.  

I do think the language around donating can be… extreme.  One friend of mine was donating so much of his rather small salary to MIRI that it put a lot of tension on his relationship.  If he is pushing polyamory on reluctant couples, I think this would also be problematic but I’ve not heard about it (also, “let him sleep with their wives and girlfriends?”  I don’t think the rationalist community is so patriarchal that women are being bequeathed against their will? (well, maybe the neoreactionaries))

That said, I think it’s a good lesson in the difficulty in escaping cognitive biases that someone who can spend so much time writing about, say, “map is not the territory” can write a whole sequence of posts about how the wavefunction of quantum mechanics is the territory, for instance.  

Question: if someone like Stephen Hawking (who also supports MWI) would have written the same sequence, would you have the same criticism?

Or is your criticism based on the fact that he isn’t an experienced physicist? 

I don’t live in all many worlds proponents’ heads, but certainly a lot of many worlds supporters are making this mistake.  I honestly don’t know what an intro quantum sequence from Hawking would look like.  I’d like to think he’d take some time to discuss why we should treat the wave function (not propagators, or the S-matrix, or field operators, but the specific map of wave functions) specifically as the territory, to demonstrate he has reasons and isn’t making a simple map-territory mistake.

Hawking would never have written Yudkowsky’s sequence. He would not have made the mistake of reifying one specific formulation of quantum mechanics, and he would not have written this post, which doesn’t understand that the position and momentum bases are just different coordinates on the same underlying structure (are polar coordinates more fundamental than x-y?).  Also, Schroedinger quantum mechanics is not local in configuration space: the potentials are functions of the coordinates of all the particles, so the reasoning in that piece is doubly bad.  (There are similar issues with almost all of his physics writing. Here he asserts you can prove the second law entirely from Liouville’s theorem, which is simply not true.  You need Liouville’s theorem plus asymmetric boundary conditions, or something else to break the symmetry.)

I had the same thought about the potentials when first reading that post, but after thinking further about it I think I can see what EY is going for there.  In principle the potentials could just be anything, and specifically they could be retarded potentials, or some other kind of potential that depends on all the particles but still doesn’t transmit information faster than light.

OTOH, the momentum basis is explicitly nonlocal (in a certain sense of the term) — the Fourier integral goes over all space at each given time.  (ETA: of course, this is assuming that the position wavefunction is the most fundamental quantity, from which other things are derived, which is sort of assuming the conclusion!)
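That sense in which the momentum basis is nonlocal is easy to demonstrate with a toy discrete sketch of my own (not from the original posts): perturb the position wavefunction at a single point, and every momentum amplitude changes, because each momentum component is an integral over all of space.

```python
import numpy as np

n = 256
rng = np.random.default_rng(0)
psi = rng.normal(size=n)          # stand-in position wavefunction
phi = np.fft.fft(psi)             # its momentum-basis amplitudes

# Bump the position wavefunction at a single point...
psi_bumped = psi.copy()
psi_bumped[10] += 1.0
phi_bumped = np.fft.fft(psi_bumped)

# ...and every momentum amplitude shifts: the DFT of a unit impulse
# has magnitude 1 at every frequency, so no momentum component is
# left untouched by a purely local change in position space.
delta = np.abs(phi_bumped - phi)
print(delta.min(), delta.max())   # both 1.0, up to rounding
```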

Then again, there are still two objections I can see to the post:

1) If you can do all the same computations in the two bases (which you can), then the idea that one respects the “physical principle” of locality isn’t in any way tied to predictions: either your predictions obey locality or they don’t, and it doesn’t matter which basis you use.  There’s no reason to toss out a useful description because it’s not “local” enough (that would be like stipulating that you can’t write down anything about a given particle until it enters the light cone of some imagined observer, rather than just writing down info about all the particles in the problem at once like the all-seeing physics-god you are).

2) As I understand it, the whole point of Many Worlds is that you should in principle be able to derive the Born Rule from statistics over the ensemble of worlds.  If the world “really” splits into different position basis states and not into different momentum basis states, this should be reflected somehow in the predictions.  Yet it’s not — the basis-independence in QM is borne out in all experimental tests.  For Many Worlds to work, it has to be formulated in a way that is compatible with that basis-independence, which has been a problem for Many Worlds people; Yudkowsky seems like he’s trying to side-step it but his solution is not satisfying (no basis is special in experimental results, but some basis is special in theory?).

(I’m sure there is a correct way to deal with this stuff in QFT, which I don’t know and EY doesn’t either.)
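The basis-independence appealed to in points 1 and 2 above is just the unitary invariance of inner products, which a toy finite-dimensional check makes explicit (illustration only: random finite matrices standing in for the real Hilbert space and observables).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64

# Random normalized state and a random Hermitian "observable"
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)
a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (a + a.conj().T) / 2

# An arbitrary unitary change of basis (Q from a QR decomposition)
Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

psi_new = Q @ psi             # same state, new coordinates
A_new = Q @ A @ Q.conj().T    # same observable, new coordinates

# Expectation values -- the predictions -- agree to machine precision,
# no matter which basis you compute them in.
e_old = np.vdot(psi, A @ psi).real
e_new = np.vdot(psi_new, A_new @ psi_new).real
assert np.isclose(e_old, e_new)
```

So any story about which basis the worlds “really” split in has to add structure beyond the predictions themselves.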

chroniclesofrettek.tumblr.com →

templesforwhores:

uncrediblehallq:

nostalgebraist:

uncrediblehallq:

So a few people I follow have brought up the “is LessWrong a cult” question again, with the “it’s nothing like Scientology” argument rearing its head.

I’ve made this argument in the past, but now it strikes me as naive. Sure, if…

I can’t speak to the cult-iness/lack thereof of the larger LW community, but (in my experience) Eliezer only pattern-matches to the Cult Leader archetype in fairly superficial ways - e.g. non-mainstream sexuality, loads of “followers.” FWIW, you don’t get the impression you’re hanging out with a living god/savior of humanity/end-times prophet when you’re hanging out at his place having pizza and watching cartoons. He’s a weird person. Very weird. I think some people read his weirdness as ‘creepy,’ and ‘creepy’ + ‘trying to save the world’ comes out looking very different from ‘normal and likeable’ + ‘trying to save the world.’

(via templesforwhores-deactivated201)

uncrediblehallq:

So a few people I follow have brought up the “is LessWrong a cult” question again, with the “it’s nothing like Scientology” argument rearing its head.

I’ve made this argument in the past, but now it strikes me as naive. Sure, if your standard for “cult” is “Scientology,” LessWrong definitely isn’t a cult. But if you’re willing to call Objectivism a cult–which Eliezer himself apparently is–it becomes more debatable.

(From what I’ve heard of Objectivism, it sounds like things got rather worse than anything I’ve seen in the LessWrong community, but I also see some of the same problems.)

This is kind of a complicated one – from what I’ve read about Ayn Rand’s inner circle (in the straightforwardly titled book The Ayn Rand Cult), it was very controlling, rigid, and inhumane in a way that isn’t matched by anything I’ve heard about in Less Wrong.  Admittedly, I don’t know what Yudkowsky’s own personal life is like, but if anything that bad was going on, you’d think (hope?) it’d get out – it would be orders of magnitude more damning than anything the “LW is a cult” crowd has managed to find so far.  Anyway, we haven’t heard any such things yet.

If by “Objectivism” you mean “everyone who calls themselves an Objectivist” rather than “Ayn Rand’s inner circle and the Ayn Rand Institute,” then the parallels are a lot closer, but by the same token, the “cultishness” is much less strong.  When I think of cults I don’t just think of “the founder speaks pure truths” – that’s a feature of many non-cult religions (and only barely present in LW).  I think of “certain characteristic structures of thought control, especially those that convince people to submit themselves to working conditions / etc. that most people would find repulsive while the leader lives in style.”  I don’t see that in LW.  Yudkowsky isn’t surrounded by worker bees.

Of all the criticisms of LW out there I think the “cult” angle is one of the worse ones – better to go for things like “Yudkowsky presents himself as supremely qualified to speak on many issues despite having no performance record to back it up,” or “members generally fail to demonstrate significant benefits in real-world performance despite obsession with ‘instrumental rationality’,” or “as far as anyone can tell your MIRI donations go to writing Harry Potter fanfic and proving mathematical logic results that no one reads with an extremely tenuous connection to actual AI,” or “the Harry Potter fanfic isn’t even good.”

Anonymous asked: The scariest thing about Big Yud is how such an intelligent man, who spent so much time studying cognitive biases and talking about corrupted hardware, in the end used his intelligence to rationalize leading a cult whose members gave him money and let him sleep with their wives and girlfriends, just like every other cult leader in history. Do you agree?

Not entirely — I think comparing Yudkowsky’s community to a traditional cult underestimates just how truly awful most well-known cults are.  I’ve read a fair amount about Scientology, the People’s Temple, Heaven’s Gate, etc. and the sheer level of psychological control, cruelty, rigid discipline, awful material conditions for followers, and so forth in those groups doesn’t have any parallel in anything I’ve seen out of Less Wrong.

It’s easy to find people giving detailed testimony about how the groups mentioned above ruined their lives.  Where are the equivalents for Less Wrong?  Has it ever ruined someone’s life?  Look at, say, Margery Wakefield’s account of inhumane working conditions and child abuse in a typical Scientology org — is there anything comparable in Less Wrong?  We need to keep some perspective here.

As far as I can tell, the worst thing Yudkowsky has done is getting some rich philanthropists to allocate some of their charitable donations to him rather than to some other charity.  Peter Singer or Peter Unger would probably say that that’s pretty bad in itself for utilitarian reasons, but they’d also say that about you or me buying a nice bottle of wine instead of giving the money to charity.  In any case, it’s not what comes to mind when I think of cult-level awfulness.

That said, yes, it is remarkable how little Yudkowsky seems to have applied his views about biases to his own pronouncements.  I’ve said this before, but it’s very strange to hear someone talk about the conjunction fallacy and then go on to live their life on the assumption not only that the singularity will occur, but that it will involve a variety of specific characteristics (FOOM, moral orthogonality, some sort of architecture that could support FAI).

Anonymous asked: my first thought upon hearing that name was 'the guy who did ' i mean it could have been a while ago and he's gotten better but i still don't trust it idk

clawsofpropinquity:

nostalgebraist:

IIRC he was involved in a flame war with the moneycat / maggotmaster crowd back when there was such a thing, and that’s where he got some of his e-notoriety?  That may be what you’re thinking of.

Also he went to a white supremacist conference once.  (I am sure he has some sort of story about how this choice had nothing to do with race or politics and was actually a fourth-level meta-counter-signaling gambit designed to send a code message to his double-secret esoteric intellectual kindred spirits in the service of disrupting the enemy’s phase deflector field while re-routing power to the warp nacelles, or something.)

What do you imagine his fourth level meta counter signaling explanations for “where are the pogroms” and “the wrong side won world war 2” are?

(Not that I’m inclined to attack random liberal strangers for being friendly with fascists, both for the reasons you suggest and because it’s not really my business, but I do find the whole Don’t Ask Don’t Tell thing kind of wearying. 

Like, do you remember when Derbyshire got canned? He started writing for various fash rags, and his early essays had two main themes: “thank god I’ve been exiled from the lamestream media and can express my views openly, Heil Hitler” and “how dare those communists call me a white supremacist.” It wasn’t that he disputed the characterization; he just felt it was terribly rude, like an unspoken agreement had been betrayed, wherein he’d agreed to bite his tongue and not voice his opinions plainly, and everyone else would agree not to notice them. Of course no such agreement existed there, but it seems to exist in LessWrong and its suburbs, rather than them being totally politically open-minded as such.

I don’t know, maybe I’m unfair, but I get the impression they wouldn’t be so friendly to, say, a working class EDL member. Like their position comes across as less “people can have evil politics and be otherwise good people” and more “our sort of people can’t be fascists”. It’s all very 1930s. And I know it’s uncharitable and kinda rude, but that’s the impression all this dancing around gives me)

On the one hand I see your point — it is definitely “our sort of people can’t be fascists.”  On the other hand, I guess I suspect that is kind of true in this case — sometimes it seems hard to tell whether the neo-reactionaries (or the subset of them that have crossover appeal) actually believe in far-right stuff themselves.

E.g. I had a truly surreal email conversation with Nyan Sandwich (a blogger at More Right) a while ago, in which he asked me what my disagreements with neo-reaction were, and I gave a standard laundry list (bad armchair ev-psych, reifying of statistical group differences with unclear causes into “Men do this and Women do that” type stuff, “formalism” can’t work long-term because generational succession and technological change mean actual power relations constantly move away from fixed signposts, etc.)

His response was that he actually agreed with all of these points in principle (!), but said he didn’t find these flaws in the neo-reaction bloggers he liked.  He then gave a list of these bloggers, which was fairly long and included Jim, of all people.

Elsewhere in this conversation he said that 95% of neo-reaction was an “edgy circlejerk” and he was only around for the remaining 5%.

My point in bringing up this anecdote is that I keep getting this sense that people who enjoy these blogs, and even people who write them, tend to keep this weird distance from the actual politics that doesn’t seem justifiable by saying they’re keeping up appearances (they’ll talk about it even in private).  The weird talk about “object level vs. meta level” is another example of this — it comes up not only in e.g. Scott’s writing about neo-reactionaries but in neo-reactionary writing itself (IIRC nydwracu likes to talk about how “politics” in itself is uninteresting and one should stay on the “meta level”).

Like, what the hell is going on here?  You wouldn’t normally expect to see this kind of thing among political people, even political intellectuals.  (Something like the opposite seems common among political intellectuals of both left and right: “I may be an airy intellectual but I’m not scared of grubby politics.”)  “These people aren’t really fascists, they’re just playing around with ideas” sounds defensive, but I’m not sure what else they could be.  Kids with too much time on their hands who get off on other people typing internet words about them?  (Look, here I am.)

It’s probably really telling that a number of these people (as well as Scott Alexander) are current or former Robert Anton Wilson fans.  Wilson was into the idea of reading lots of material that seems shocking or “obviously” wrong (such as really dumb conspiracy theories), not necessarily because any of it was right, but because encountering and wrestling with it would jolt your mind out of its habitual grooves (or something).  That provides one guess about what is going on: we have a bunch of people who are very much in favor of reading far-right material for sheer unfamiliarity’s sake but are unable to point to anything actually worthwhile about it (and don’t seem to view that as a problem).

“Yudkowsky University has a world champion cybersecurity team,” Alan blurted breathlessly from the Pinto’s passenger seat, bobbing his head back and forth between his father and the university enrollment pamphlet lying in his lap.