One nice thing about thinking you can solve every problem with your personal reasoning powers is that it can work like a built-in harm reduction mechanism when you run into a tough problem. "Gee, this sure is complicated," you'll think, and spend 10 years diligently trying to figure it out – because we have to be able to solve it by plowing through and figuring it out, right? – and at least in those 10 years you aren't harming anyone through your cogitation.
Not all hyper-zealous rationalism works like this, and some truly terrible things have been done by hyper-zealous rationalists whose response to "gee, this is complicated" was not "let's spend 10 years thinking about it" but "let's make it less complicated so we can understand it" (James C. Scott's "legibility"). But a certain sort of hyper-zealous rationalist, at least, has the feature of automatically rendering themselves inert when confronted with something they might fuck up.
"This is a tough situation, but all situations are solvable by the diligent application of reason, let me seclude myself until I figure it out [secludes self indefinitely]" is a comforting response. It reduces potential harm! The world may be burning while you're computing the fourth-order correction to the effects of potential action #15435, but at least you aren't pouring fuel on the fire.
Much more frightening (to me personally) is "this is a tough situation, but one must act in tough situations, and who can really say what is or is not justified by reason in this fallen morass of a world, ultimately we must fall back on more basic dispositions and convictions which are essentially contested and thus not worth wasting breath on, and anyway what I'm saying is we should totally launch the Doomsday Machine right now."
