The Cthulhu Shield: A Critique of Rationalism
Rationalists, effective altruism adherents, and utilitarians are incredibly annoying. This is almost universally understood.
I sometimes wonder how folks who consider themselves rationalists feel about this. I myself am a part of a few different 'groups,' and some of them have pretty negative reputations, but at least nobody ever turns to me and says, "Damn, you are part of such-and-such group? You fuckers are insufferable!" Or, as my friend likes to say, "Meta this, meta that, have you ever meta woman?"
To be clear, when I say 'Rationalists,' I'm not talking about people who strive to think in a rational manner, or about the philosophical movement from the Enlightenment that was championed in part by Spinoza, Leibniz, and other similar thinkers (Leibniz totally did it because he wanted to be better than Newton and make waves in both philosophy and science; okay, fine, that is probably not true).
I'm referring to a specific movement of people who can be defined by similar philosophical outlooks (utilitarianism, atheism, 'effective altruism,' typically libertarianism but not always), certain texts (most work by Eliezer Yudkowsky and Scott Alexander), and a culture that Scott Alexander describes as consisting of "libertarian political beliefs, Dawkins-style atheism, vague annoyance that the question of gay rights even comes up, eating paleo, drinking Soylent, calling in rides on Uber, reading lots of blogs, calling American football 'sportsball,' getting conspicuously upset about the War on Drugs and the NSA, and listening to filk."
Sam Kriss, who is most certainly not a rationalist, describes it as being defined by "living in the Bay Area, writing things like 'fark' or 'f@#k' instead of 'fuck', and having unappealing sex with your entire friend group." While I take issue with the insinuation that having sex with your entire friend group is unappealing, you probably know the philosophical movement I am referring to. They are sometimes called the TESCREALs, which stands for "Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalist ideology, Effective Altruism, and Longtermism." Rationalists love utilitarianism.
How to Be Ethical in Large Groups Using Math
Utilitarianism is an inherently modern philosophy in the sense that it makes the most sense when applied to large groups of humans existing within complex systems. If you have a small town of a few hundred people, you don't need complex logic or probability theory to determine what you should do; you just look around you. If people seem pleased with you, chances are you are doing a decent job.
The extent to which people are happy with you typically has less to do with whether you are actually creating a better world and more to do with whether you are functioning within an existing framework of social norms. This certainly isn't a perfect system, and communal norms are infamous for being repressive. In addition, they differ from culture to culture, which makes them inconvenient as a moral system in a modern world where people can easily talk with somebody who lives on the other side of the planet.
People are very good at figuring out communal norms. Well, I'm not, which will come as a surprise to you only if you have skipped ahead and this is the first sentence you are reading, but most people are. On the other hand, we are terrible at complex world modeling, which essentially means that we are not that good at utilitarianism. People have made attempts to create utilitarian frameworks, the most famous of which is probably Practical Ethics by Peter Singer, but it is very much an unsolved problem. This doesn't mean that we shouldn't even try; most people are terrible at linear algebra, but we still have computers. Making the world a better place is a worthwhile task.
The point I'm making is that morality on vast scales in complex systems is not something we are naturally good at, nor is it the way that we are designed to think about ethics. Nobody really knows why people 'developed morality,' and evolutionary speculation is extremely susceptible to people projecting their biases onto things. However, I think it is reasonably likely that at least a large part of the reason some people are 'altruistic' is that we are designed to live in groups. Utilitarianism is what happens when we try our best to create a morality system for large groups, which we were never designed to live within.
Utilitarianism certainly has its flaws, as does any philosophy when placed under scrutiny. This is why philosophy remains an unsolved problem and we can't just decide that morality is relative and we should use science for everything else, as much as many people may wish we could.
A classic example of the limits of utilitarianism is organ donation. It may be correct under utilitarian doctrine to murder a disliked individual and give their organs to a beloved world leader who has heart failure, but nobody would want to live in that world. Another example, posed by Nozick, is the 'utility monster,' a being that derives so much more happiness from resources (or from causing mayhem) than everyone else loses that utilitarianism says we should sacrifice everything to it. My favorite example of utilitarianism's limitations is the billionaire philanthropist. Using utilitarianism alone, Bill Gates is one of the greatest humans to ever live; that alone is a huge blow to the validity of the framework, because he's a fucking asshole. Sam Bankman-Fried was very openly utilitarian, and he was a gigantic asshole as well.
That being said, deontology, the counterpart to utilitarianism, has its flaws as well. Deontology dictates that the best way to behave is to establish a set of behaviors that are 'bad,' such as lying, cheating, killing, stealing, etc. (deciding on that set is typically the difficult part of deontology), and then proceed to not do them.
Instead, you should do things that you have established as 'good,' such as being honest, kind, etc. Kant, who was very much a deontologist, claimed that if a murderer came to your door and asked whether their intended victim was inside, you should not lie to them about the victim's whereabouts, because lying is wrong. This is a statement so absurd I was certain it was apocryphal until I went to look it up, and it turns out to be real, as far as I can tell. I can only hope that Kant just professed this opinion to back up his philosophy, but if I am ever running from a murderer, I certainly know whose house NOT to hide at.
Utilitarianism is certainly a philosophy with flaws, but I don't think anybody denies this, and only a deranged individual would follow a philosophical framework to the letter without understanding that any philosophical framework is incomplete. However, I think you can probably see why wealthier people, like SBF, are drawn to it. Under a utilitarian framework, you essentially have to be rich to be a good human being. The incarcerated man who donated $17.74 to Gaza is far less virtuous than the investors who backed lab-grown meat. The man who donated $17.74 had nearly zero real impact, and in fact people raised over $100K for him, which could have gone elsewhere. He was arguably a net negative in terms of contribution to society's resources. In contrast, the investors, who were probably looking to make money off their investments, are helping to create a system of meat production that is less cruel to both livestock and workers, as well as less carbon-intensive. Perhaps I am being overly optimistic about the future of lab-grown meat, but you hopefully get the point I am trying to make. Rationalists tend to be on the wealthier side, so it makes sense that they are utilitarian.
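If it helps to see that accounting spelled out, here is a minimal toy sketch in Python of the kind of ledger this reasoning implies. The investment amount and the "good per dollar" effectiveness factors are placeholders I invented purely for illustration; only the $17.74 figure comes from the example above.

```python
# Toy utilitarian ledger. The investment amount and both "good_per_dollar"
# effectiveness values below are made-up placeholders for illustration only;
# the $17.74 figure is the one real number from the text.
def naive_impact(dollars_moved: float, good_per_dollar: float) -> float:
    # Naive utilitarian accounting: impact = money moved * assumed effectiveness.
    return dollars_moved * good_per_dollar

donation = naive_impact(17.74, good_per_dollar=1.0)          # the prisoner's gift
investment = naive_impact(10_000_000, good_per_dollar=0.01)  # hypothetical investors

print(f"donation impact: {donation:.2f}, investment impact: {investment:.2f}")
# However you fiddle with the placeholder effectiveness values, the side with
# seven more zeros of money wins, which is exactly the accounting being criticized.
```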
The Universe is the Domain of Cthulhu
As a kid who grew up on Isaac Asimov, utilitarianism and effective altruism always reminded me of the fictional field of psychohistory described in the Asimov universe (he had an MCU-style universe that spanned many series and characters). For those of you who were not as lucky as I was and did not spend your adolescence reading the Foundation series, the basic concept behind psychohistory is that if you model large populations of people with enough statistical accuracy, you can predict the broad strokes of the future. Asimov had a PhD in chemistry and a strong math background, so he was able to convincingly create this fake math field without resorting to the 'technobabble' that you hear in Star Wars/Trek. Psychohistory was not an ethical framework, purely a predictive one, but the idea of modeling the world via math was very much at its core. Fun fact: the first sex scene in any book I ever read was in Asimov's Foundation series when I was 12 years old, and I have yet to recover.
Asimov's works are certainly still read by many, and they are considered classics of the genre. I suspect that many will enjoy his works far into the future, no matter how dated they become. He is a good author, despite being kind of a pig.
But people don't really write science fiction like that anymore. The 'Golden Age of Science Fiction,' characterized by optimism, very hard science, and misogyny, is mostly at an end and feels either dated or nostalgic depending on whether you like it or not. 'Hard science' is a bit of an understatement for Asimov, who considered his greatest short story to be 'The Last Question,' which is literally a story about the Second Law of Thermodynamics.
Science fiction nowadays tends to focus less on moonshot optimistic predictions about where science and engineering will take us in hundreds of years and more on the concerning implications of problematic political and social trends. For example, "All the Water in the World," published in 2025, is about what will happen when the glaciers melt. "We Lived on the Horizon," also published in 2025, is about surviving an AI apocalypse. Part of the shift in tone is likely due to the fact that people are far less optimistic about the future than they used to be.
But I think there is another factor. Asimov lived in a world where science made a lot more sense than it does now. His life and career coincided with the development of quantum weirdness, but it didn't get nearly as weird as it is today. Where science stands now, the universe seems less like the work of a brilliant creator who put the heavenly spheres into motion with a perfect orderly plan, in accordance with their great wisdom and knowledge, and more like something vomited up by Cthulhu after a particularly nasty fight with Azathoth.
The mathematician Dr. Paul Erdős used to claim that he wished he could see into "God's book of proofs," which makes sense because he died in 1996. Nowadays, I think that many scientists would suspect that "God's book of proofs" would probably leave a mathematician with bleeding eyes and an eel for a tongue, rendering them unable to speak in any voice but that of a shrieking hyena. We are entering the stage of scientific discovery that involves concepts that cannot really be understood by a three-pound lump of wet fat resting inside a cranial cavity. These are very interesting times.
Again, that doesn't mean that scientific progress isn't useful. I don't think anybody really gets relativity on a deep level, but confusion about the structure of an atom will not leave you any less dead in the aftermath of an atomic bomb. GPS still works. Just because we can't predict the world with 100% accuracy doesn't mean that getting most of the way there isn't useful, and just because we can't understand it on a gut level doesn't mean we cannot describe it using math.
However, I think it is becoming more and more clear that logic is the domain of humanity. It is how people interface with reality, not the language of reality itself. If the universe is run by Cthulhu, which I am using as a shorthand for 'that which we cannot comprehend on a gut level, only approximate mathematically if we are lucky,' then logic is the language we use to speak with Cthulhu and ask for things. Sometimes it obliges, and we get penicillin, Elden Ring, Golden Rice, the Apollo Program, and Lockheed Martin. But sometimes it tells us to go fuck ourselves, and there isn't anything anybody can do about it.
Rationalism doesn't deny this, but it kind of closes its eyes to this fact and considers it a minor issue. Sure, we can't intuitively grasp the behavior of the proton, but that doesn't invalidate the usefulness of logic. Just because we can't get to 100% accurate world modeling doesn't mean that getting much of the way is not helpful. Utilitarianism is still valid because humans who need access to deworming technology don't exist in the world of quantum uncertainty.
Okay, but why is everybody here a fucking MAN?
Now I am going to totally change gears, but I promise this is relevant. Why are most self-proclaimed 'rationalists' men?
Scott Alexander, the classic 'rationalist who is willing to consort with repugnant ideas for the sake of evenhandedness,' writes a bit about this in his article "Gender Imbalances are Mostly not due to Offensive Attitudes." He says this article has been mostly replaced by a later, better one, but I honestly like the original one much more because it references catholic.com, and I dislike organized religion.
It makes the argument that if there are more women in the Catholic Church, which is not known for its feminist views, than there are in the rationalist community, then rationalists being more feminist will probably not make more women join the rationalist community. He then goes on to say that rationalists should be more feminist for its own sake, so long as it is understood that it is not a flaw of libertarians and rationalists to be predominantly male communities, because that would give "fuzzy-empathizing-humanities types a giant hammer with which to beat all sciency-systematizing-utilitarian types forever."
I am so fucking torn about my thoughts on Scott Alexander. I don't think he is flat-out wrong on many things, but I am bewildered by his priorities. Do you really care about the whole math vs. humanities fight this much? Did you get screwed over by a D&D group that was more into roleplay than minmaxing? Or do you just have literally zero friends who are women? How on Earth is sticking it to English majors more important than not being a misogynist?
In any event, in this blog post he references this paper on "gender gaps in sociopolitical attitudes: a social psychological analysis" to make the claim that women are "more liberal than men in social compassion and more conservative in traditional morality," which is a pretty good description of "not libertarian." This may have some truth to it, but I do not think it is the full picture.
Another perspective on why most members of the rationalist community are men is that men are emotionally stunted due to either cultural or biological reasons and therefore turn to purely logical thinking to make up for their lack of emotional intelligence. There are probably parts of this that are true as well.
What I like about this argument is that it takes a somewhat similar argument, which is that men are inherently more logical and less emotional than women, and turns it on its head. "Okay, so men are better at logic than women? Well, who needs logic? Men are miserable due to lack of emotional intelligence and are less effective anyway! Where has your logic gotten you, huh? Did you logic your way into a loneliness epidemic and lower college enrollment? Fuck you!"
I would argue that the lack of emotional intelligence in men is mostly cultural, but that is a different topic entirely. My main issue with this argument is that it somewhat sells rationalists short. The average LessWrong poster may be shockingly emotionally immature, but Scott Alexander is a trained psychiatrist. That isn't to say that there are no terrible psychiatrists (that is probably the majority of the field), but I don't actually get that vibe from any of his writing. He seems reasonably thoughtful most of the time, just slightly wrong about things.
Eliezer Yudkowsky, a rationalist writer who has garnered so much respect within the rationalist community that they wrote a Chuck Norris-style list of Eliezer Yudkowsky facts, was for some reason possessed to write his rationalist manifesto in the form of a Harry Potter fanfic. It is called "Harry Potter and the Methods of Rationality" and you can read it here, but you probably shouldn't.
Why he chose this deeply unfortunate medium I couldn't tell you, although it is somewhat funny when he turns some of the absurdities of the Harry Potter universe on their heads. For example, Voldemort uses a gun. One quote from the first chapter is:
He was given anything reasonable that he wanted, except, maybe, the slightest shred of respect. A Doctor teaching biochemistry at Oxford could hardly be expected to listen to the advice of a little boy. You would listen to Show Interest, of course; that's what a Good Parent would do, and so, if you conceived of yourself as a Good Parent, you would do it. But take a ten-year-old seriously? Hardly.
Writing this passage doesn't necessarily make Yudkowsky the most emotionally intelligent person in the world, but it suggests that he has spoken to other humans before and has left his couch at least twice. While the lack of emotional maturity may apply to some rationalists, I don't think it fully answers the question of why this is such a male-dominated community.
I have a third perspective on why most rationalists are men, which is that men live in a more rational world than non-men.
The Cthulhu Shield
You know the story about the blind men who all touch different parts of an elephant and come away with different opinions about what the elephant is like? One guy touches the elephant's tail and thinks the elephant is like a rope, and one touches the leg and thinks the elephant is like a barrel? Here is a more accurate version: one man gets to ride the elephant from a saddle, steering it wherever he wants to go, crushing everything unfortunate enough to be in its path into pulp, and thinks that the elephant is amazing, safe, and perfectly rational. When he kicks behind its ears, it goes forward! When he presses backwards with his heels, it stops walking! All is well!
Everybody else is tied up in front of the elephant, left to be trampled to death by its feet. They are blind, so they have no way of knowing why they are being crushed or what is crushing them; all they get to know is that something beyond their understanding is stepping on their chests and killing them. It is unknowable, unstoppable, and makes no fucking sense.
Cthulhu may rule the universe, but we don't want it to. It is unpleasant when chaos asserts itself on our logical lives. When a person exclaims, "Fuck it, my life makes absolutely no sense! I can't make it make sense!" they are not having a good day. We want our lives to be logical, and if we have control over our lives, they likely will be reasonably easy to understand. I would argue rationalists are mostly men because being a man is one of many forms of privilege (rationalists are more likely to be wealthy as well). Rationalists tend to be men because Cthulhu asserts itself less strongly in the lives of most men, because most men have more control over their lives.
Disclaimer: if you are a man who is reading this thinking, "wtf, I have zero control over my own life, what are you talking about," that is because in a patriarchal society, most men are not the patriarch. So if you don't feel like you are in control of your life, that means the patriarch is not you; it's some other man.
Overall, I think part of the reason that so many rationalists are utilitarians is the amount of control they have over their lives. They have much more in common with the venture capitalists who invested in lab-grown meat than they do with the incarcerated man who donated to Gaza. They have the ability to sit back and look at things on a large scale without worrying about small-scale issues affecting them. Yudkowsky founded the Singularity Institute, which is attempting to create a 'friendly AI' that will not destroy us, because this minimizes the danger that a malicious AI will destroy humanity. He advises that to save humanity, the best course of action is to:
"Find whatever you're best at; if that thing that you're best at is inventing new math[s] of artificial intelligence, then come work for the Singularity Institute. If the thing that you're best at is investment banking, then work for Wall Street and transfer as much money as your mind and will permit to the Singularity Institute where [it] will be used by other people."
Ah, yes, the two jobs: mathematician and investment banker.
But on top of that, this is such a deeply detached way of looking at things. If you have no other real pressing issues, and nobody you know is in imminent danger due to the MANY SERIOUS CATASTROPHES OCCURRING RIGHT FUCKING NOW, then yes, a malicious AI may seem to be the biggest issue we face. But you can only really think like this if nothing is more pressing in your life. From where I stand, I don't know if humanity as we know it will survive long enough to create a singularity-like AI. With the way science funding is going, how are we going to train the new researchers?
Now I want to make a very important caveat here: I do not think that being privileged makes you evil. There are plenty of very wealthy people who are not rationalists, or extreme longtermists, or this particular form of detached. However, I think that privilege can make you evil if you don't try to stay grounded in reality, and I think that most rationalists have not attempted this. Yudkowsky clearly has not attempted this, because he thinks that the most pressing issue for anybody to be working on right now is preventing an evil AI from being created. This man needs to pull his head out of his ass and smell the fresh air.
I am going to be irrational and tribal and emotional and short-sighted, and you are going to fucking deal with it
The inspiration for this post was a Slate Star Codex article by Scott Alexander (I like SSC, okay? Everything I've read there is like 70% correct from my perspective, which makes it enjoyable reading), specifically "I Can Tolerate Anything Except the Outgroup," which made the argument that the political divide in the US is tribal rather than political, and that the 'outgroup' for the left is just the right.
He makes some good points. One of them is that the 'Blue Tribe,' which is what Scott Alexander called the group that passes for the left in America, doesn't actually care about minority groups but merely sees them as "allies of convenience who deserve to be rehabilitated with mildly condescending paeans to their virtue."
This is probably true, but Scott is far from the first person to make this claim. It is a fairly well-established concept that many people enjoy paying lip service to social justice but don't actually give a shit about minorities.
The main issue I have with what he wrote is that it implies that the left's hatred of the right is purely tribal hatred, and that we only feel like it's different because we are in the throes of a tribal war. We have reasons for hating the right, but the Hatfields probably had reasons for hating the McCoys. It's okay to eat pigs, even though they are smarter than dogs, because dog eating is not normalized the way pig eating is. Ear gauges and full-body tattoos are fine, but cranial deformation is barbaric. Everybody thinks that their in-group culture is fine, while the mean, evil out-group is terrible and wrong. If you want to be a good, logical, self-aware rationalist, you should be tolerant towards groups that make you think to yourself (this is a quote from the outgroup article), "Being tolerant makes me see red, makes me sweat blood, but darn it I am going to be tolerant anyway."
I am not going to do this. If I do this, my life will get worse. The reality of my life is that the 'Red Tribe' is in fact out to get me in particular, and if I deny this so that I can claim to be some high-minded rationalist sage, I will be fucked. Scott wrote the article in 2014 under Obama, so maybe it was easier to deny reality then, but regardless, I'm not going to be tolerant of the right.
It is a mark of maturity to be aware of your limitations. A child thinks that everybody else is biased due to mysterious reasons, but that they alone have a perfectly objective view. If you achieve maturity, which happens at different ages for everybody and never for many, you realize that you are biased as well. People from other places think you have an accent, people from the future will think you lived in a barbaric time, people richer than you will think you are poor, people poorer than you will think you are rich, and a portion of the population thinks you are going straight to Hell.
But it is a mark of having no skin in the game to be capable of maintaining total dispassion in a conflict. I may be a brain in a jar, the universe may be a simulation, and my whole life may be a delusion caused by my limited mammalian senses observing a complex world. That last statement is probably mostly correct. But I still need to do my job so I can get paid and thus afford food, because at the moment my body seems to need food rather than nutrients piped into the brain jar.
Disliking the outgroup is also not always a bad thing. The outgroup of the Union was the Confederacy; the outgroup of the Allied powers was the Axis powers. Sometimes conflicts are senseless, sometimes people are actually doing bad things and must be stopped, and usually the answer lies in a murky grey area between those two poles.
I will concede that there is an aspect of outgroup hating that is problematic even when the outgroup has, in fact, done something wrong. If you hate the American right because of what the Trump administration is doing, this is probably reasonable, at least from my perspective. If you hate them because they are a buncha dirty football watchin' redneck bastards... well, in that case I do agree with Scott Alexander: you are probably just engaging in the time-honored human tradition of hating the 'other,' and you need to do some self-reflection. It is likely that many people who think they are doing the former are doing a combination of both; that is just the nature of people. But this does not invalidate the fact that if I tolerate the American right, they will make my life worse. They already have.
The main point I'm trying to make here is that yes, it is possible that when people feel anger towards fascists, they may be activating the part of the psyche that hates people who are 'other.' But if you think that this is a reason to cease anger towards fascists, then you are clearly not deeply involved in what is going on right now. Of all the pressing issues going on right now, thinking about the poor 'othered' conservatives in the US is really not at the top of my list, but not only is it at the top of Scott Alexander's list, it currently sits at #3 among the ten suggested posts on his about page. As with the majority of Scott's writing, he is not necessarily wrong about anything in particular; he just has priorities that suggest he does not have much skin in the game, and that affects his viewpoint.
Scott told on himself in an interesting way in the outgroup article. He mentioned that he could not really be part of the Blue Tribe because of how easily he critiqued it. Therefore, he must be part of a 'Grey Tribe,' which is essentially the rationalist movement. While he could think of important criticisms of the Grey Tribe, even contemplating them made his blood boil, and he found them difficult to voice. This prompted many people in the comments to bash their own in-groups, which I think was supposed to seem brave and self-aware but really just came off as deeply sad and pathetic.
If you have never had to critique your in-group for the sake of making a concession in an argument while trying to defend your in-group's right to exist, then you have had a very interesting and unique life. One that probably contained enough rational phenomena to turn you into a rationalist. Scott Alexander is indeed a rationalist, so this, at least, makes perfect sense.
When Rationalism is Rational to Use
This may seem surprising given what I've just said, but in general, I think utilitarianism and rationalism make a lot of sense in most circumstances. The universe may be run by Cthulhu, but logic is a decent approximation of reality most of the time, and it's all we've got. We may not be designed to view the world through a utilitarian lens, but the reality is that we do not live in small communities anymore and we do not have much of a choice.
In addition, the difference between the optimal moral choice and a slightly suboptimal moral choice isn't always huge. If somebody spends their life raising as much money as possible to eradicate malaria, when in fact the slightly better choice would have been raising as much money as possible for education, that isn't the end of the world. To be clear, I'm not actually making the claim that education is the more pressing issue; I'm just using it as an example.
One of the primary theses of Practical Ethics is that we should donate more to charity. Scott Alexander advocates donating 10% of your yearly income. Even if you are annoyed by rationalists, they are not some evil force bringing destruction.
However, I do think it is important to be aware that the ability to sit back and look at the world from a bird's-eye view almost always coincides with a very thick anti-Cthulhu shield, which can lead to a disconnect from how most people have to live. This leaves such people with nothing but utilitarianism to guide them, which can work sometimes if you are very, very good at this type of thinking. But while some people are, the vast majority are not.
This in turn can lead to situations like the Singularity Institute, in which a large number of very wealthy people pour unreasonable amounts of money and effort into something that is only an issue if you have literally zero problems in your life. The fact that some people think that utilitarianism dictates spending all your effort fighting malicious AI is a testament to how bad most humans are at utilitarian thinking. If you want to make a real difference in the world, do not spend your life supporting the Singularity Institute's research; almost any other option would be better.