Contra Bentham's Defamation Of My Character
On an idiot
This isn’t the article I wanted to post this morning. I’d planned to post something I’ve been working on about the difference between how fights work in the real world and how they are popularly conceived to work, particularly in film and television.
Bentham of Bentham’s Bulldog has accused me of doing and saying a number of things I didn’t do or say, including (but not limited to) slander and holding positions I don’t hold. For reputational reasons, I think it’s worth my time to rebut their worthless article point by point.
(Substack does not permit me to nest quotes. As such, I have bolded statements made by Bentham.)
Jay Rollins has a truly bizarre specimen of an article. This article alleges, among other things, that EA is a cult; it does this based on flimsy, wholly unjustified assumptions combined with a startling and profound range of confusions. This is a new and rather odd breed of criticism of EA. It seems the old breed of effective altruism criticism was to generally just point at random things EAs have written and sputter—how dare they write that thing that sounds bad out of context? The outrage!
First of all, when I link an article I'm discussing in any depth, I use the feature Substack added a few months ago that allows the reader to see the title, the ‘stack it’s on, and a preview of the content. If you look at any of my Rollins’s Review columns, you’ll see what I mean. This is a courtesy to both the reader and the author of that stack; it tells both that I take them seriously by making it easy for the reader to judge the content and showcasing the author’s brand. I have not linked Bentham’s ‘stack, and I have deleted their comment on my comments section in which they advertised this calculated insult to my integrity. The reader may meditate on why that might be at his or her leisure.
This breed was interesting, albeit totally bankrupt—much like criticizing the Democratic party by digging up random bad-sounding, out-of-context statements by obscure Democratic politicians. But this old strategy required a sort of art—one had to be somewhat informed of a lot of things EAs had said to be able to slander them effectively. This new breed seems even stranger and in some ways worse—more perverse. Instead of quote mining for things that sound bad, the new breed will merely poorly summarize EA, before lying about it repeatedly, claiming it supports things that it it diametrically opposed to. I’ve already replied to one of these articles—given their rapid proliferation, someone needs to do the unpleasant work of refuting them systematically.
Criticizing the Democratic party by digging up bad-sounding, out-of-context statements by obscure Democratic politicians is a context-dependent exercise, but in many cases a valuable one. Regardless of their obscurity, Democratic politicians represent their party, and pointing out bad behavior on the part of its representatives is both a fair and an effective means of pointing out the failings of the party.
The implication that I have slandered EA is baseless and factually untrue. I did not summarize EA in the article, nor did I claim to be the source of the summary, which was composed by Zvi Mowshowitz (as I stated throughout the article). Nor at any point did I lie about EA (a necessary condition for slander to take place), or about Zvi’s summary.
Jay Collins follows the new breed of EA criticisms based on misrepresenting it and alleging it’s horrible in various ways. Given that Collins’ entire article is one grand, dramatic exercise in erecting straw-men, before declaring them scientologists, or similar cult members, let’s get clear on what EA is. I’ll quote Richard’s excellent summary of the topic.
My last name is spelled with an “R,” Bentham. I suggest you take your anti-seizure medication before sitting down at your keyboard.
So, EA is a community of people trying to act on the idea of effective altruism—doing good as effectively as possible. That seems good, especially when the movement has saved over a hundred thousand lives. However, Collins seems to think that EA is a cult.
Bentham doesn't write as well as they read, but I suppose it’s something that they have correctly ascertained that I think EA is a cult. Whether it has saved lives is orthogonal to whether it’s a cult, and is also orthogonal to the moral value of the organization and its followers. The Catholic Church and the Boy Scouts have both done much good for many more people than EA has; they have also both caused many innocent victims to suffer.
How does he argue it’s a cult. Well, he goes through a very confused list by someone else of lots of things that correlate a bit with EA, before declaring them necessary for EA, before declaring them indicative of cult-ness. The argument is sufficiently full of holes that an elephant could pass through it unscathed. Collins quotes 21 points from Zvi Mowshowitz. I’ll go through each of them and explain why they’re either not required for EA or not objectionable.
Zvi’s list is quite clear. It was composed as part of an article criticizing EA for the EA criticism contest. I do not recall whether it won, but the contest’s sponsor referred to it and quoted from it in one of the posts relating to the contest.
Utilitarianism. Alternatives are considered at best to be mistakes.
Henderson’s take on utilitarianism is that the Venn diagram of college utilitarians and psychopaths is basically a circle. I’d go further. Utilitarianism is a philosophy with a specific use case: leadership decisions. If you are not representing your tribe, you have no business deploying utilitarian ethics, which consist of moral calculations about maximizing utility for groups. People who engage in utilitarianism for personal reasons have taken academic philosophy classes, and thus bear watching; they have formal education in how to make a special case of themselves.
One might expect that in a case where one is arguing that effective altruism is a cult because it’s too utilitarian, they would argue both that EA is objectionably utilitarian and that being objectionably utilitarian makes one a cult. Now, they provided no argument for the first claim—beyond pointing out that Zvi Mowshowitz hinted at it once, and the second claim is ridiculous and supported by nothing.
Bentham has incorrectly summarized my argument in the first clause. I do not believe EA is a cult based on any one of the points Zvi raises, I believe it is a cult due to the combined weight of my assessment of more than six of those twenty-one points. And Bentham should get their eyes checked; I didn’t say Zvi “hinted at” the first claim, I said he said it, which he did. It’s a direct quote.
As to the second claim, which is based on my paraphrase of Henderson, it most certainly isn’t “based on nothing.” Utilitarian consequentialism, which forms the basis of the personal philosophy of both EA mastermind Will MacAskill and disgraced financier and former EA wunderkind Sam Bankman-Fried, is not something that one arrives at through intuition. It’s an ethical system whose application requires study, and most people learn it through formal instruction. MacAskill, for example, learned it at Cambridge, while Bankman-Fried learned it from his parents, both professors at Stanford (which amounts to aristocratic tutoring).
Now, seeing as our author has, rather bizarrely, just linked generically to Henderson’s newsletter, I have no idea which article they are referring to. When one does not cite their sources properly, it is hard to know which sources to address. But the utilitarians that are EAs—which is clearly the relevant reference class—are obviously not psychopaths; psychopaths don’t dedicate significant portions of their life to helping others. Psychopaths are not big donors to curing malaria overseas, for example.
The quote, from this article by Henderson, reads as follows:
Results revealed that social comparison orientation was significantly correlated with psychopathy. In other words, people who habitually compare themselves with others are more likely to have psychopathic traits (selfishness, callousness, cynicism).
And psychopathy, in turn, was associated with more comfort with sacrificing a few to save many.
Social comparison is also associated with narcissism. People prone to comparing themselves with others agree more strongly with statements such as “I am great” and “Other people are worth nothing.”
Some years ago I was sitting in a philosophy seminar and the professor asked the 16 students how many of us were utilitarians (that is, how many of us would sacrifice a smaller number of people to save a larger number, or generally believe that the ends justify the means). Fourteen out of 16 raised their hands. Later, I learned that psychopaths are overrepresented among college students by a factor of four. Roughly two percent of the general population are psychopaths, compared with 8 percent of college students.
To be clear, I do not believe that all college utilitarians are literally psychopaths. The literary device I employ in the quote Bentham is yapping about is hyperbole, which I used because it is funny. A friend of mine refers to EAs as "hyper-left-brain moral 'spergs," which is his characteristically bombastic way of insinuating that they have no sense of humor. I would call this a case in point.
I will let Bentham's line about psychopaths and philanthropy pass without comment.
One might additionally expect that the author would provide a reason to think that utilitarianism—an ethical view that’s been around hundreds of years with many adherents—is a bad theory; at least, bad for things beyond leadership decisions. However, they’d be wrong. Rollins is apparently above such mundane and frivolous tasks as arguing against moral theories—he can just smear its adherents as cultlike, while providing no reasons to judge his assessment to be accurate.
I’m pretty sure my entire article is an argument against a moral theory, at least indirectly, but okay, I’ll bite.
First of all, utilitarian consequentialism requires moral math. Its appropriate use case is limited to extreme circumstances and leadership decisions, because the only time anyone should be putting a dollar value on human life is when shit has gone sideways to a degree with which you, Bentham, are clearly unfamiliar, and the person doing the math is in charge of a lot of people.
Trolley problems do not normally occur in nature; they’re thought experiments invented by philosophers to illustrate the limits of moral rules of thumb. We do not require ordinary, morally healthy humans to make the choice between killing one person or a number of others on the regular, because an ordinary person put in such a situation will most likely be wracked with guilt and traumatized for life. All human societies give that authority to tribal leaders because normal people don’t want the responsibility.
Additionally, there’s no reason to think EA is intimately utilitarian. As Richard notes, one only has to think that it’s good to make the world a better place and that more good is better than less to be an effective altruist. They don’t have to think any of the controversial things that utilitarians tend to think.
My assertion is based on the weight of a number of arguments. And since I believe EA is a cult, and that cult leaders and adherents are inherently disingenuous about the nature of the cult, I'm not going to assume good faith on the part of EA reps who assert that EA isn't definitionally utilitarian. Other people have made good arguments that it is.
Importance of Suffering. Suffering is The Bad. Happiness/pleasure is The Good.
Suffering builds character. If you want to bubble-wrap the world, there is a fundamental gap between your world-view and mine that probably isn’t getting bridged.
This is a standard objection to—or more accurately described, confusion about—utilitarianism generally given by uninformed freshman undergraduates. Suffering is bad intrinsically but it may beget a good thing. It is good instrumentally in that it sometimes produces other good things, but it is, by itself, bad.
I didn’t know that “the end justifies the means” was such a complicated statement. Thanks for clearing that up, Bentham.
Once again, this is irrelevant to basically all that EA does, unless you literally hold the view that factory farms and death from malaria are good because people have productive suffering. That view, however, would be crazy.
This is the kind of totalizing, black-and-white thinking I associate with people with Cluster B diagnoses. “Oh, you don’t like EA? Guess you must like malaria!”
Ascribing an inability to distinguish between types and degrees of suffering to me is evidence that you don’t actually read my blog, and it’s a bad-faith argument besides.
Quantification. Emphasis on that which can be seen and measured.
I’m a fan of quantification. I also sincerely believe that there’s a wolf-god that incarnates within his followers from time to time, and that there are other gods who do the same. There’s a time and a place for each.
I’m confused as to what’s being said—perhaps it’s some witty remark. Given that I don’t know what the objection is supposed to be, if there is one, I’ll just leave a link to my article replying to the objection to utilitarianism that says calculation is impossible.
I appreciate the confirmation that Bentham has neither a sense of humor nor the time to read enough of my blog to determine that I do have a sense of humor. I’m not getting into the technical argument Bentham is making about utilitarian calculation; it’s not relevant to this article.
Altruism. The best way to do good yourself is to act selflessly to do good.
I am suspicious of anyone who says or implies they’re selfless. I think the overwhelming majority of the things people do are ultimately for their own benefit, even if that benefit is not obvious to observers, and I’m fine with that. I think you should not be intentionally antisocial, but if you don’t want to sing “Kumbaya,” more power to you.
Being altruistic doesn’t require saying you’re selfless. Indeed, I’m certainly not totally selfless, nor is anyone else on earth (plausibly). Nevertheless, I think one should try to do good things. Only on planet Rollins does saying people should do good mean “I’m perfectly selfless.”
Don’t put words in my mouth. Nowhere in the section in which I am quoted does the word “perfectly” appear. The word “selfless” does appear in the list from which I’m quoting, but it’s Zvi's list; I didn’t say it.
Obligation. We owe the future quite a lot, arguably everything.
One of my favorite lines from a Best of Craigslist post entitled Advice to Young Men From an Old Man is “You don’t owe the vast majority of people shit.” Don’t tell me I owe anyone anything. I know who I owe, and how much I owe them. I also know what I’m owed.
This is another case where I think Zvi went badly awry. EA doesn’t require saying we owe the future a lot—just that it’s very important to help the future. I’m something like a scalar utilitarian, so I don’t really believe in obligations—though it’s more complicated than that; I sort of believe in them—and I’m still an EA. Rollins gives no argument against this beyond quoting a line asserting that you don’t owe anyone anything, so there’s nothing to respond to.
It’s telling that Bentham is capable of distinguishing between things I said and things Zvi said when it suits them.
I imagine Rollins walking past a drowning child who he could save at no personal cost. He declares “I don’t owe you shit!” Then he walks away and the child drowns.
You may want to peruse my archive with this line in mind if you’re unsure as to why I’m not being characteristically polite, Bentham.
Self-Recommending. Belief in the movement and methods themselves.
This is starting to sound explicitly culty. I don’t do blind belief in an organization, regardless of the organization. And I evaluate method as part of my day job. If your methods are good, fine. I don’t know that I think EA’s heuristics are good, though. They don’t reflect reality as I understand it in all cases.
Apparently if you think your movement has good methods for figuring things out, it’s a cult. Thus, if you, as a catholic, trust mainstream catholic doctrine on issues you haven’t explicitly investigated, then you’re in a cult. But I also don’t think you need to generally defer to EA to be part of it—though if you don’t, then you’ll probably be less sympathetic to some things EA is doing. For example, if you don’t think that much of EA is very good, you should be a bit less sympathetic to EA movement promotion.
Again, Bentham is putting words in my mouth. My belief is well-founded: requiring blind trust in an organization as a condition of membership in good standing is a sign of a cult. “Do you trust me or your lyin’ eyes?” is a question known to be asked regularly by those who manipulate others for a living; i.e., cult leaders.
Evangelicalism. Belief that it is good to convert others and add resources to EA.
That is a cult, by any definition with which I’m familiar.
If a Democrat wants there to be more Democrats, that’s a cult.
If a Republican wants there to be more Republicans, that’s a cult.
If a Christian wants there to be more Christians, that’s a cult.
If a chess player wants more people to play chess, that’s a cult.
The second clause in Zvi’s sentence, “and add resources to EA” is distinct from 1, 2, and 4 as they are framed. The goal of converting others to a political party or to a hobby is to optimize one’s own goals in the first case and one’s own happiness in the second. Political parties definitionally represent a selfish constituency; giving that constituency what it wants in exchange for votes is a political party's raison d'être. Chess is a hobby, and the more players there are, the more chess is played.
Christianity, on the other hand, is an established religion, which absolutely started as a cult.
Reputation. EA should optimize largely for EA’s reputation.
I assume by the first “EA” The Zvi means “Effective Altruists,” not “Effective Altruism.” I don’t optimize for anyone’s reputation but my own. That’s not how it works. I stand by my word and my actions. I can’t stand by anyone else’s; that’s not a reasonable expectation of anyone.
I don’t think this is a core part of EA at all. Lots of EAs seem to be willing to say controversial things without caring much about hurting movement optics. But while Rollins may not optimize for anyone else’s reputation, this says nothing at all about anything important. If you know that some controversial statement will start a scandal that will be bad for the world and turn people off to an important social movement, you shouldn’t say it probably.
“Maybe you shouldn’t give your life savings to an autistic crypto trader who’s a major public face of a group that defines itself by its bloodlessly mechanical system for calculating moral decisions that is completely coincidentally known to be the method used to simulate morality by a disproportionate number of highly-educated psychopaths.”
Is that the kind of controversial statement you mean, Bentham?
Modesty. Non-neglected topics can be safely ignored, often consensus trusted.
“The collective decides what its members can discuss.”
I remember reading a very bizarre article a while ago by one Amanda Marcotte. She was trying to smear Scott Aaronson after he opened up about various troubles he’d had—see here for the complete story. To do this, what she would do was quote a very reasonable sounding statement and then provide an outlandish translation. For example
This is sort of how I feel about the Rollins example. The original claim is that if there are lots of people working on a problem, then working on it is probably not very effective, at the margins. Rollins translates this to essentially “the EA thought police tells you what to think.”
The term for what Bentham is describing is reductio ad absurdum. Given their evident lack of a sense of humor, I’m not surprised they’re unfamiliar with the basis of parody, but whatever. And yes, that’s how cults work; they enforce groupthink. It’s a defining feature of cults.
Judgment. Not living up to this list is morally bad. Also sort of like murder.
If your social group has precepts not generally accepted by society, deviation from which makes you a moral outcast, its structure is religious by nature.
This is false; a common theme discussed by EAs is that none of us do the best thing—none of us always act rightly. Still, we should try to do what we can.
I’m not surprised. Struggle sessions are a fact of life in many cults.
I note Bentham does not address my point, only Zvi's. (Is anyone making sure Bentham isn’t wandering out into traffic? A caseworker or something? They seem somewhat distractible.) That is because Bentham has no refutation for it; it does not occur to them that it is important. In fact, the organizational structure of a social group or institution is the most important fact about it; it is the key to that group or institution's nature.
This also means that vegans are “religious by nature.”
I’m not the first to make that argument.
Veganism. If you are not vegan many EAs treat you as non-serious (or even evil).
If your social group has dietary rules not generally accepted by society, deviation from which puts you in a lower social or moral tier, you are a member of either a religion, or more rarely, a sex cult. I cannot think of any exceptions to this rule.
I think lots of EAs tend to treat you as doing something seriously morally wrong. But they don’t treat you as evil. I haven’t known this to happen. Most EAs aren’t vegan. By this standard, animal rights activists would also be a cult.
What is the difference between “seriously morally wrong” and “evil,” Bentham? Do you not connect the sin and the sinner? The fact that you do not understand the nature of the connection between act and actor, or even that there is a connection between a person's actions and their moral status, may be your moral blind spot of greatest concern to society.
(And yes, I would describe animal rights activists, by and large, as cult inductees.)
Totalization. Things outside the framework are considered to have no value.
Y’all are a cult.
WHAT?? Why in the world did Zvi write that? Nice weather is outside the framework, but it obviously has value. This claim is just ridiculous.
Things outside the moral framework, you dolt. And if you want to bother Zvi about his valuation of a sunny day, I suggest you squat down and squeeze out your nuggets of brilliance into his comments section. Stay out of mine; you lower the tone.
Anyway, EA isn’t a cult. And if it is, it’s certainly not for the reasons that Rollins says.
The Narcissist’s Prayer:
That didn’t happen.
And if it did, it wasn’t that bad.
And if it was, that’s not a big deal.
And if it is, that’s not my fault.
And if it was, I didn’t mean it.
And if I did, you deserved it.
That someone didn’t do or say, at any rate; Bentham refers to me as “Collins” several times, and the “R” key and the “C” key are nowhere near one another on the keyboard. If I’m being charitable, I think it’s not out of the question that Bentham has recently suffered a head injury, which would explain both the persistent misspelling of my last name and much of the article’s content.
The total inability of these EA people to process humor is the most telling indictment against them. They're more robotic than the imaginary general AIs their cult formed in the shadow of.