
EFFORTPOST Scott Alexander Doesn't Understand Cult Public Relations Part 2: NARCISSISM

https://old.reddit.com/r/slatestarcodex/comments/1867cm1/in_continued_defense_of_effective_altruism

It turns out that I was coming in late to the conversation when I posted my takedown last night.

Effective Altruism has been experiencing subjectification at the hands of Twitter, and Scott Alexander continues to miss the point in such spectacularly abundant fashion that I'm a little annoyed I dashed off what I said last night.

The problem with Effective Altruism isn't whether or not it is a part of American Civic Virtue. That isn't really the debate.

The problem with Effective Altruism is that as a movement it is a pure expression of intellectual narcissism.

Scott Alexander is a Narcissist

He thinks, perhaps, that he has escaped narcissism because he delegates the recipient of the adulation to an egregore. But he must defend the threatened object of his narcissism at length, and the nature of his defense reveals the thinness of the cult mentality: pure, dripping narcissism that other people find deeply repellent.

Most Movements have Exactly the Reputation they Deserve.

Let's dig in.

THE WORD OF THE LORD

Matthew 6

1 “Take heed that you do not do your charitable deeds before men, to be seen by them. Otherwise you have no reward from your Father in heaven. 2 Therefore, when you do a charitable deed, do not sound a trumpet before you as the hypocrites do in the synagogues and in the streets, that they may have glory from men. Assuredly, I say to you, they have their reward. 3 But when you do a charitable deed, do not let your left hand know what your right hand is doing, 4 that your charitable deed may be in secret; and your Father who sees in secret will Himself reward you openly.

Praise be to God.

Now this passage provokes an ETERNAL CONUNDRUM in that it makes communicating about good and evil difficult, OR...

This passage reveals the ETERNAL CONUNDRUM about communicating about good and evil.

These people think they're atheists.

People break the spirit and the letter of this religious command all the time, simply so that the Christian Church (if there even is such a single conglomeration) can communicate and organize its doings. Certainly I am self-consciously a cult leader in that I have self-consciously cult - ivated a following. But I am a student of the mystic zummi, and zummi gave up his mask once it became cumbersome: in this way he successfully demonstrated detachment from the spiritually impure form of his following once it had more or less served its purpose (it was an accident of our gathering and therein was its beauty).

The SFBA Rationalist Cult did not have such an intelligence at their disposal.

If you were to throw this verse in Scott Alexander's face after he cried in the public square about the good he and his flock were doing, he would quite rightly dissemble: how is it that we're supposed to talk about doing good?

But the problem with Effective Altruism is that it believes it is separate from the American Civic Religion, even though it is not. It is merely a replication of the Christian ethos in a supposedly secular philosophical framework, wrapped around a deliberately spiritual practice. These people are merely hopelessly confused Christians seeking to differentiate themselves from American Civic Virtue at large with a cult name.

In other words it's not enough that their cult gives, they also demand, in a narcissistic sense, to be recognized for their giving, and they refuse to take any sort of responsibility for the reputation they have incurred.

Scott Alexander doesn't understand: he is the SBF of writing about politics. Every single luminary from the SFBA Rationalists carries the imprint of narcissistic fraud upon them, and they cry angrily that no one will recognize them for "their" work.

With that said, let's look at Scott Alexander's impressive display of impressing you with his display of impressing money virtue money impressing.

Socialists think we're sociopathic Randroid money-obsessed Silicon Valley hypercapitalists.

Because the end result of your "AI Safety" team is capitalists exploiting AI.

But Silicon Valley thinks we're all overregulation-loving authoritarian communist bureaucrats.

Because they're winning in the conflict over their exploitation of AI. The biggest narcissism at work in their AI team is believing they, and they alone, were working on AI safety. Some of us just know how to keep quiet. But I foresaw that there would be idiots who would talk anyway. Who would do the work anyway. It's just funny that the AI Safety team of the SFBA Rationalists were the least effective AI Safety team that has ever existed. (SotS didn't form an AI Safety team: that's just how competent we are.)

So their AI Safety team is pure fraud. (It doesn't matter what they believe, only that they're incompetent.)

The right thinks we're all woke SJW extremists.

Unsurprising that the group that prioritizes order at the expense of outcasts is skeptical of a cult of midwit outcasts.

But the left thinks we're all fascist white supremacists.

No, at least, I only think you're incapable of noticing fascist white supremacists, which is worse. That's the EY = fraud AI researcher, SBF = fraud financier, SA = fraud political writer part.

The anti-AI people think we're the PR arm of AI companies, helping hype their products by saying they're superintelligent at this very moment.

Why is it their problem if your team has a PR problem?

But the pro-AI people think we want to ban all AI research forever and nationalize all tech companies.

Why is it their problem if your team has a PR problem?

The hippies think we're a totalizing ideology so hyper-obsessed with ethics that we never have fun or live normal human lives.

You're writing a hyper-obsessed article about ethics in which you are clearly intent on spreading a totalizing ideology. ("No I'm not saying everyone should be an effective altruist, just that effective altruism is a good movement!" And that's where the cult mechanic is operating unconsciously on these sleepwalking myconids.)

But the zealots think we're a grift who only pretend to care about charity, while we really spend all of our time feasting in castles.

I think the argument that you care more about being perceived as caring effectively about charity than about the charity itself has a lot of weight: narcissistic cult.

The bigshots think we're naive children who fall apart at our first contact with real-world politics.

In a world where the political parties are: religious fundamentalists behind an authoritarian strongman unto fascism, and the party that believes in climate change, your team got lost and thinks it's somewhere in the middle, even though you're all fricking shitlibs performing American Civic Virtue like good Christians but without any proper awareness of your spiritual origin. (If it's not just Christianity with a lot of complicated handwaving so you can pretend you're atheists, what is it?)

But the journ*lists think we're a sinister conspiracy that has “taken over Washington” and have the whole Democratic Party in our pocket.

This is where Scott Alexander especially is just an idiot. Many of the midwits under the SFBA Rationalist banner suffer from this problem of essentializing journ*lisms, and in normal circumstances I might be willing to give Scott the benefit of the doubt here for brevity, but he does this all the time. His hatred of journ*lists is deeply irrational and comes from his narcissism getting poked at by the NYT.

[An infographic in which Scott Alexander presents twitter criticism of EA]

Scott spends no time rebutting these arguments directly.

The only thing everyone agrees on is that the only two things EAs ever did were “endorse SBF” and “bungle the recent OpenAI corporate coup.”

This is the conversation for inside the PR committee, not for a public statement. Yes, those are the things which EA is known for being involved with. Yes, those things reflect badly on EA.

In other words, there's never been a better time to become an effective altruist! Get in now, while it's still unpopular! The times when everyone fawns over us are boring and undignified. It's only when you're fighting off the entire world that you feel truly alive.

In a post-ironic world, all expression is genuine. Scott Alexander doesn't believe he's a myconid. Scott Alexander does not believe he is recruiting for a cult. He carries out the motions of a ritual's function, and he receives the ritual's result.

All of this is precisely why EA deserves criticism.


Time for a quick break.

:#marseydrinkwater: :#marseycool:

We ready? Alright. Now we get to the counter-gish-gallop.


A Gish Gallop isn't inherently bad form. If the preface Scott Alexander wrote were coherent and shorter, this would be an interesting document. Unfortunately, the opening paragraph of the Gish Gallop is a deep failure.

And I do think the movement is worth fighting for. Here's a short, very incomplete list of things effective altruism has accomplished in its 10 years of existence. I'm counting it as an EA accomplishment if EA either provided the funding or did the work, further explanations in the footnotes. I'm also slightly conflating EA, rationalism, and AI doomerism rather than doing the hard work of teasing them apart:

COMPUTER

ENHANCE

I'm counting it as an EA accomplishment if EA either provided the funding

There it is. Done. QED. This is narcissism. It's a narcissistic cult in which they think they avoid the narcissistic attachment by delegating it to an outside non-human anchor: their Team, which is now getting criticism, and their response to criticism is to say

"LOOK AT HOW MANY POINTS WE COLLECTED!"

What do you want, Scott? Do you want an award? For all your good boy points, I am prepared to issue a motion in Impassionata Congress to give you an award: the "EFFECTIVE ALTRUISM IS A REAL CULT WITH REAL PHILANTHROPIC NARCISSISM" award is practically in the mail, I bet I can convince the other Impassionatas to vote 'yes.'

Providing funding isn't doing the work and you really shouldn't be proud of it

But if you are you shouldn't form a cult about it

and if you form a cult about it

Don't be surprised when that cult gets a reputation for being conceited. For being full of frauds who are more interested in getting credit than anything else.

You want credit? You'll get it, but here's what that looks like.

Saved about 200,000 lives total, mostly from malaria

FRAUD. The cult provided money for other people to do work. All of his citations link to non-profits; I'm not digging into every single one, but I'll do a few. This one is GiveWell, whose Wikipedia page states:

"In 2006, Holden Karnofsky and Elie Hassenfeld, who worked at a hedge fund in Connecticut, formed an informal group with colleagues to evaluate charities based on data and performance metrics similar to those they used at the fund, and were surprised to find the data often didn't exist.[2] The next year, Karnofsky and Hassenfeld formed GiveWell as a nonprofit to provide financial analyst services to donors."

"In 2013, GiveWell moved its offices to San Francisco where people in Silicon Valley had become strong supporters of the effective altruism philosophy."

In other words, existing Non Profits were already practicing "effective altruism" before the cult tried to take credit for it. What did EA add except for fraud, this taking credit that Scott Alexander so brazenly attempts?

Treated 25 million cases of chronic parasite infection.

FRAUD. The cult provided money for other people to do work.

Given 5 million people access to clean drinking water.

FRAUD. The cult provided money, and wants to take credit for the work.

Supported clinical trials for both the RTS.S malaria vaccine (currently approved!) and the R21/Matrix malaria vaccine (on track for approval)

FRAUD

Supported additional research into vaccines for syphilis, malaria, helminths, and hepatitis C and E.

FRAUD

Supported teams giving development economics advice in Ethiopia, India, Rwanda, and around the world.

FRAUD

Convinced farms to switch 400 million chickens from caged to cage-free

Scott Alexander's own footnote says:

Their numbers add up to 800 million but I am giving EA half-credit because not all organizations involved were EA-affiliated. I'm counting groups like Humane League, Compassion In World Farming, Mercy For Animals, etc as broadly EA-affiliated, and I think it's generally agreed they've been the leaders in these sorts of campaigns.

AN HONEST FRAUD IS STILL A FRAUD.

Freed 500,000 pigs from tiny crates where they weren't able to move around

This is another one I dug into, and: FRAUD. I'm not going to run through all of these because you get the idea.

Gotten 3,000 companies including Pepsi, Kelloggs, CVS, and Whole Foods to commit to selling low-cruelty meat.

This one doesn't even have a cite and sounds like they're taking credit for the bandwagons others made. FRAUD.


This is the part where things get wacky. See, Scott Alexander knows, on some level, that he's defending his cult, as he "slightly [conflates] EA, rationalism, and AI doomerism."

AI

Developed RLHF, a technique for controlling AI output widely considered the key breakthrough behind ChatGPT.

Is this really effective altruism or is it just a cult defending its contribution to AI Threat?

…and other major AI safety advances, including RLAIF and the foundations of AI interpretability.

The really unfortunate thing is that though EY was stupid enough to talk, he was also loud enough that researchers in his cult were part of generating the AI Threat.

Founded the field of AI safety, and incubated it from nothing up to the point where Geoffrey Hinton, Yoshua Bengio, Demis Hassabis, Sam Altman, Bill Gates, and hundreds of others have endorsed it and urged policymakers to take it seriously.

You didn't found the field of AI Safety, and anyone who thinks this just because the Bishop of SFBA Rationalism said it is an idiot. AI Safety has been a concern for decades. As an actual field, the better work was done by people with pedigrees, and yes, that is 100% an endorsement of formal education over FRAUDS TAKING CREDIT FOR OTHER PEOPLE'S WORK.

That you got a bunch of names to sign a letter doesn't mean it was a success.

Helped convince OpenAI to dedicate 20% of company resources to a team working on aligning future superintelligences.

lmao "Secured some lip service to AI safety after Capital had taken over AI development"

Gotten major AI companies including OpenAI to work with ARC Evals and evaluate their models for dangerous behavior before releasing them.

This is good work that is not altruism, not effective, and if the only thing connecting it to EA is that it's part of the SFBA Rationalist Cult Complex, you have a problem.

The more accurate story is that Eliezer Yudkowsky bricked a bunch of human brains with bad firmware, but some of them landed in a good enough spot to work on the problems that everyone was already working on: it's just that SFBA Rationalists have the brainworm of tithing the credit to the SFBA Rationalist cult.

Scott Alexander has to Stop Performing Cult Bishop in order to actually address the criticism of the SFBA Rationalist Cult, of which Effective Altruism is just one banner.

Got two seats on the board of OpenAI, held majority control of OpenAI for one wild weekend, and still apparently might have some seats on the board of OpenAI, somehow?

:marseylaugh: "we're naive children who fall apart at our first contact with real-world politics." :marseylaughpoundfist:

[Sam Altman manipulates EY directly in a tweet]

The SFBA Rationalists are so easily duped it's a bit sad.

Helped found, and continue to have majority control of, competing AI startup Anthropic, a $30 billion company widely considered the only group with technology comparable to OpenAI's

Is it any wonder there are people criticizing them both for AI Safety and for bringing AI to capitalism? God, these people are idiots.

[For some reason, Scott includes a tweet here...]

...where he misses the point about the "weird lesswrong ideas" in favor of a, uh, stupid? comparison to Medicare? Like these people are in a cult, so they can't understand that their cultishness is off-putting. It's sad and, frankly, funny.

Become so influential in AI-related legislation that Politico accuses effective altruists of having “[taken] over Washington” and “largely dominating the UK's efforts to regulate advanced AI”.

So your complaint is that people noticed that you exercised the power you always intended to exercise? God these frickers are insufferable. Politico is engaging in hyperbole and if you're going to be involved in politics, you just have to get used to this and stop flying off the handle every time a journ*lism says something you don't like.

Helped (probably, I have no secret knowledge) the Biden administration pass what they called "the strongest set of actions any government in the world has ever taken on AI safety, security, and trust.”

To the extent that this work was the SFBA Rationalist Cult, it wasn't present solely in the SFBA Rationalist Cult. To a certain extent I admire the fact that this has happened, but the credit does not go to the cult except in the cult's narcissistic brain bust, which is the problem everyone is trying to point you to when they talk about EA as a dipshit fraud factory.

Helped the British government create its Frontier AI Taskforce.

Same. If y'all weren't a dipshit fraud factory this might be a point in your favor, but it's still missing the point of the criticism.

Won the PR war: a recent poll shows that 70% of US voters believe that mitigating extinction risk from AI should be a “global priority”.

A single victory, poorly fought, by a writer who then got lost on twitter in the deluge of his own mania.

Helped organize the SecureDNA consortium, which helps DNA synthesis companies figure out what their customers are requesting and avoid accidentally selling bioweapons to terrorists

Fraud. Funding isn't doing the work.

Provided a significant fraction of all funding for DC groups trying to lower the risk of nuclear war.

Your cult is engaged in fraudulent credit for this work because you think money is contributing when it's merely helping.

Donated a few hundred kidneys.

Unironically, this one I'll give you, but you should talk about it less because the logical result is giving cult members enhanced status for organ donation and that's just the kind of incoherent spirituality that

These People Think They're Atheists

your types perform.

Sparked a renaissance in forecasting, including major roles in creating, funding, and/or staffing Metaculus, Manifold Markets, and the Forecasting Research Institute.

lmao THESE ARE WASTES OF TIME FOR IDIOTS YOU MADE THE WORLD A WORSE PLACE WITH THEM

(anything which can be predicted in a prediction market isn't worth predicting)

(unless you're an idiot who thinks winning points is inherently good)

(of course the midwit cult boasting about its points gathered for other people to work with boasts about its development of online point measuring)

(this is just too funny I can't believe this is real)

Donated tens of millions of dollars to pandemic preparedness causes years before COVID, and positively influenced some countries' COVID policies.

yeah you and Bill Fricking Gates

Played a big part in creating the YIMBY movement - I'm as surprised by this one as you are, but see footnote for evidence.

Open Philanthropy's Wikipedia page says it was “the first institutional funder for the YIMBY movement”.

This one infuriates me. People worked for years to popularize YIMBY, and you don't get to count funding an already-existing movement as creating it.

These people are unpopular because they're frauds

And they're frauds because their narcissism is in constantly taking credit

FOR OTHER PEOPLE'S WORK

I think other people are probably thinking of this as par for the course - all of these seem like the sort of thing a big movement should be able to do.

That's not at all what I'm thinking: what I'm thinking is that no other movement would struggle with its own self-definition in such a narcissistic and fraudulent way.

But I remember when EA was three philosophers and a few weird Bay Area nerds with a blog. It clawed its way up into the kind of movement that could do these sorts of things by having all the virtues it claims to have: dedication, rationality, and (I think) genuine desire to make the world a better place.

I don't actually think that anyone, anyone, is questioning whether or not the cultist desire to make the world a better place is genuine.

Just that: the weird bloggers? They should shut up and the movement of the SFBA Rationalists should disperse into the nonprofits it has helped develop.

To the extent that there's a community in the SFBA Rationalists, unironically they should convert to Christianity or at least just take this one little bit, Matthew 6, and tattoo it on their tramp stamp region so they don't get lost this way.


Scott Alexander's conclusion

Scott Alexander performs the ultimate idiocy of the point counters: he makes human lives fungible and cracks a joke about curing diseases and preventing terrorist attacks whose combined death toll happens to equal the number of lives they believe they've saved.

He handwaves around some numbers to make them holy, then praises EA in the public square.

Still not impressed? Recently, in the US alone, effective altruists have:

ended all gun violence, including mass shootings and police shootings

cured AIDS and melanoma

prevented a 9-11 scale terrorist attack

Okay. Fine. EA hasn't, technically, done any of these things.

But it has saved the same number of lives that doing all those things would have.

About 20,000 Americans die yearly of gun violence, 8,000 of melanoma, 13,000 from AIDS, and 3,000 people in 9/11. So doing all of these things would save 44,000 lives per year. That matches the 50,000 lives that effective altruist charities save yearly.
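A quick check of the quoted arithmetic, since the whole joke rests on the point-counting (all figures are Scott's own):

$$20{,}000 + 8{,}000 + 13{,}000 + 3{,}000 = 44{,}000 \approx 50{,}000$$

The sum does come to 44,000, so the comparison against the 50,000 figure is at least internally consistent.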

I don't hate this crude joke in isolation; it's just stupid after all of the FRAUD.

People aren't acting like EA has ended gun violence and cured AIDS and so on. Probably this is because those are exciting popular causes in the news, and saving people in developing countries isn't. Most people care so little about saving lives in developing countries that effective altruists can save 200,000 of them and people will just not notice. “Oh, all your movement ever does is cause corporate boardroom drama, and maybe other things I'm forgetting right now.”

Christ forgive me: but this is whiny narcissism. No Scott, it's not that people don't care about the lives that EA claims to have saved, it's that EA didn't save them, it just funded other people who deserve the credit more.

This is the cult's persecution complex, the performative whining of the final arc. "Any press is good press," Scott Alexander thinks as he prepares to plant another cult hook.

In a world where people thought saving 200,000 lives mattered as much as whether you caused boardroom drama, we wouldn't need effective altruism.

This is complete delusional narcissism at work. A stupid, and blatantly stupid, false statement justifying the existence of the cult.

These skewed priorities are the exact problem that effective altruism exists to solve - or the exact inefficiency that effective altruism exists to exploit, if you prefer that framing.

If this were all the SFBA Rationalist Cult was (this must be the shell game deBoer was pointing out, lmao I should have read his essay), this conversation wouldn't be happening, because it would be one more non-profit doing non-profit things, which isn't news, and... that's ok! It's ok for mainstream news not to celebrate every bit of work non-profits do!

Nobody cares about preventing pandemics, everyone cares about whether SBF was in a polycule or not.

This isn't remotely the problem with SBF, that's just the funny part of it.

Effective altruists will only intersect with the parts of the world that other people care about when we screw up; therefore, everyone will think of us as “those guys who are constantly screwing up, and maybe do other things I'm forgetting right now”.

When you stubbornly refuse to understand what your errors are, and perform a cultish "call for criticism" which unsurprisingly does not result in reforming away the SFBA Rationalist cultishness, you deserve the reputation that you have as, oh, what was it? "children who fall apart when exposed to real-world politics"?

And I think the screwups are comparatively minor. Allying with a crypto billionaire who turned out to be a scammer. Being part of a board who fired a CEO, then backpedaled after he threatened to destroy the company.

They might(?) be minor if the SFBA Rationalist Cult could actually reckon with them as mistakes on the scale of the massive continual streak of FRAUD which runs from the original high school dropout midwit, through SBF, to Scott Alexander.

These are bad, but I'm not sure they cancel out the effect of saving one life, let alone 200,000.

LMAO

LOL, EVEN

This is the way these Cartesian minds think: they think in offsets, they think spirituality is a matter of net good and net bad. It's all just points to them because they're hopelessly broken minds trying to arithmetic their way out of a social problem they refuse to believe they have.

(Somebody's going to accuse me of downplaying the FTX disaster here. I agree FTX was genuinely bad, and I feel awful for the people who lost money. But I think this proves my point: in a year of nonstop commentary about how effective altruism sucked and never accomplished anything and should be judged entirely on the FTX scandal, nobody ever accused those people of downplaying the 200,000 lives saved. The discourse sure does have its priorities.)

And whining about discourse is empty, you should shut up about things you can't change, Scott.

The discourse does prioritize mocking goofy and weird cults that

They Think They're Atheists

have nothing but fraud and a methodology indistinguishable from the American Civic Virtue already at work in the United States to distinguish them in discourse. Yes: discourse is functioning just fine. You just can't handle the heat of that which you have wrought.

Doing things is hard. The more things you do, the more chance that one of your agents goes rogue and you have a scandal. The Democratic Party, the Republican Party, every big company, all major religions, some would say even Sam Altman

lmao is Scott Alexander threatening Sam Altman?? This is the most interesting line.

…they all have past deeds they're not proud of, or plans that went belly-up. I think EA's track record of accomplishments vs. scandals is as good as any of them, maybe better.

But the Democratic Party is self-conscious of itself as a political party; it's not a cult with a denial complex. I joke about leading my own cult, for instance, but to the extent that it's true, the purpose of Impassionata Congress is to insulate me (and my 'followers') from cult mechanics by providing a structure which accommodates the relation of speaker-listener. Companies are bound by their corporate charters, and the cultish tendencies of CEOs are diminished through the configuration provided by the board of directors. Religions have to figure some shit out in order to function! (But they can't do that if they're in denial about what they are, and what SFBA Rationalism is, is a cult.) (They think they're atheists, and atheists can't be in a cult.)

It's just that in our case, the accomplishments are things nobody except us notices or cares about.

THE NON PROFITS EXISTED

BEFORE YOU FUNDED THEM

YOU ABSOLUTE

NARCISSISTIC

MIDWIT

Like saving 200,000 lives. Or ending the torture of hundreds of millions of animals. Or preventing future pandemics. Or preparing for superintelligent AI.

Your organization is less capable of handling these things than mainstream electoral politics very specifically because you believe you are in any way separate from mainstream American politics.

But if any of these things do matter to you, you can't help thinking that all those people on Twitter saying EA has never done anything except lurch from scandal to scandal are morally insane.

SFBA Rationalists are politically insane, and you can tell because everyone who is against them must just be 'morally insane.' You see the cult mechanic at work? Isolate the followers with rhetoric like this.

That's where I am right now. Effective altruism feels like a tiny precious cluster of people who actually care about whether anyone else lives or dies, in a way unmediated by which newspaper headlines go viral or not.

If you actually believe that the SFBA Rationalist Cult is above newspaper headlines, in an essay with newspaper headlines all over the place, you're an easy dupe and you belong in the SFBA Rationalist Cult.

My first, second, and so on to hundredth priorities are protecting this tiny cluster and helping it grow. After that I will grudgingly admit that it sometimes screws up - screws up in a way that is nowhere near as bad as it's good to end gun violence and cure AIDS and so on - and try to figure out ways to screw up less. But not if it has any risk of killing the goose that lays the golden eggs, or interferes with priorities 1 - 100.

"I am 100% aware that I am in a cult, I will grow that cult, I will protect that cult, and I don't care to learn from any of its mistakes thus far."


The final section.

:#marseyschizotwitch:

We're almost done.


Am I cheating by bringing up the 200,000 lives too many times?

No, just weakening your essay: your conclusion is 2x, maybe 3x, too long for the actual points you are making.

People like to say things like “effective altruism is just a bunch of speculative ideas about animal rights and the far future, the stuff about global health is just a distraction”.

That is one criticism of some parts of the SFBA Rationalist Cult.

If you really believe that, you should be doubly amazed! We managed to cure AIDS and prevent 9/11 and so on as a distraction, when it wasn't even the main thing we wanted to be doing! We said “on the way to doing the other things we really care about, let's stop for a second to cure AIDS and prevent 9/11, it won't take too much time or resources away from the important stuff”. Why haven't any of you distraction-free people managed that?

Uh, you didn't do any of that; you didn't even manage it. You just funded it. And when you got directly involved you embarrassed yourselves, and you're still in denial about that.

I don't think any of this is a distraction. The Democratic Party is anti-gun and pro-choice. The gun control isn't a ruse to trick pro-life people into joining the party, and the abortion rights aren't a smokescreen to deflect opposition from gun owners. Movements are coalitions. Sometimes this means different people with different priorities choose to work together. But other times it means the same people tend to share both priorities. People who are pro-choice are usually also anti-gun, and vice versa.

If your priority is taking credit, no one wants to build a coalition with you. It's not hard, man.

And I notice that the tiny handful of people capable of caring about 200,000 people dying of neglected tropical diseases are the same tiny handful of people capable of caring about the next pandemic, or superintelligence, or human extinction. I think the common skill is trying to analyze what causes are important logically. Surprisingly, this comes up with different results than the usual strategy of caring about whatever is trending on Twitter today!

YOUR MEDIA LITERACY IS BADLY BROKEN

Some of those results are broadly popular, like saving 200,000 lives. Other results are broadly unpopular, like worrying about AI risk. But both sets of results come from the same process, and the people who get them are natural allies in a world where most people can't bring themselves to care about anything that isn't current front-page news.

You're not the only people who decided to do anything about the world and you don't have a monopoly on altruism. This is where the SFBA Rationalist Cult mentality is most grandiose and totalizing: you can't conceive of non-cult members as being on the same 'side' as you because you get distracted by twitter insults.

[some chart which talks about bait and switch in an article which is full of bait and switches]

lmao. I haven't been including the images because they're not good. Like, they're actually well selected; it's just that they're presented, by Scott, as if they're a mystery, a sort of art piece in contrast to his calm and logical argument. Unintentionally hilarious. They're honest frauds.

I don't want the takeaway from this post to be “Sure, you may hate EA because it does a lot of work on AI - but come on, it also does a lot of work on global health and poverty!” I'm proud of all of it. I believe - just like Geoff Hinton, Bill Gates, Sam Altman, Daniel Dennett, etc - that there's a pretty serious risk of human extinction from AI over the next few decades. I can't imagine believing this and not thinking it's important to prepare for. EA has done a great job working on this (see list of accomplishments above), and I think the AI and x-risk people have just as much to be proud of as the global health and animal welfare people.

If you were simply attributing this to "altruism" you might be correct to write this essay, but it would also be absurdly incoherent. Do you see? You formed a movement around "Altruism" and the only people who took it seriously were midwits. But you don't own altruism and defending "altruism" is incoherent. The only reason you can even write this essay, Scott, is you are the Bishop of a cult.

So my recommendation is that you look carefully into the research on what causes can most benefit the world - near-term, long-term, whatever seems most important to you after some honest study - and try to come up with some way to help them. If you want, take the Giving What We Can pledge to donate 10% of your income, or look at 80,000 Hours to see how you can get an altruistic career.

And there's the cult hook.

And whatever you do, do it quick, before the metronome swings back and all of this becomes popular again.

And there's the knowing wink that he's preaching to the choir to shore up support internally.


These People Believe They Are Atheists

And Atheists Can't Be In A Cult

But they're self-conscious of the Effective Altruist movement as a spiritual mechanism for gathering and binding people; they just can't take the next step of perceiving the cult around them and attempting to mitigate the disastrous effects the cultishness has on their broader appeal.

The streak of fraud which runs through the SFBA Rationalist Cult and its various organizations isn't a set of incidental mistakes by a developing movement, but a reflection of a culture of narcissism which takes the persecution complex of the grandiose, runs it through a midwit consensus reality that could only have been achieved on the early Internet, and results in a reputational problem which is wholly deserved.

Here's the kicker :bluelight: :redlight:

If Effective Altruism Fixed their Culture

It would just become Altruism, i.e.

It would cease to exist

It would just become the American Civic Virtue at work within the normal American Narcissism which is better modulated. A series of non-profits sharing a cult lineage that doesn't actually matter because the people in it balanced their spiritual identities around a less narcissistic collective practice.

Thus my advice to the SFBA Rationalists is: disperse. Throw down your idols now that you know you have them. If they had any decency or integrity, they would log off and consider their mission accomplished.

That's what I plan to do once Impassionata Congress finishes this legislative session.

Thank you for reading.

:#marseyshrug:

Jump in the discussion.

No email address required.

In a rare turn of events I actually somewhat agree with Imp here, Effective Altruism is pure distilled r-sluration that only makes sense from an incredibly narrow neurodivergent worldview that assumes the data its proponents rely on is actually reliable and pertinent to their goals. At its core it's just badly done empiricism, with a healthy dose of wacko nerd autism thrown in.

Effective Altruism morons poured something like half a billion dollars into convincing the world that AI was going to cause the apocalypse. And the only thing we got out of it was a bunch of stupid sci-fi thought experiments on worthless sites like LessWrong. They've done fricking nothing, actually less than nothing because they're mostly clueless tech nerds like Bankman-Fried that gained way too much power from the money they made in the industry. It's just crazy how inept and useless they are. There's so many extremely pertinent issues that people face today yet these EA clowns with too much time and money on their hands want to focus on the supposed impending human mass extinction caused by vector math done on garbage data stolen from people's posts online. They're just spergs soyjaking at this tech being just like their sci-fi movies. And just like with Crypto, they've managed to create a giant bubble with AI that only benefits a small group of wealthy corps and individuals before it pops.


That's nice sweaty. Why don't you have a seat in the time out corner with Pizzashill until you calm down, then you can have your Capri Sun.

