
A new book about the origins of Effective Altruism

118 points, 220 comments

Mood: heated
Sentiment: mixed
Category: culture
Key topics: Effective Altruism, Philosophy, Philanthropy
Debate intensity: 85/100

A new book about the origins of Effective Altruism (EA) has sparked a heated discussion on the movement's principles and practices, with some defending its rational approach to charity and others criticizing its potential flaws and misapplications.

Snapshot generated from the HN discussion

Discussion Activity

Very active discussion

First comment: 32m after posting
Peak period: 147 comments (Day 1)
Avg / period: 80
Comment distribution: 160 data points (based on 160 loaded comments)

Key moments

  1. Story posted: 11/17/2025, 5:37:42 PM (2d ago)
  2. First comment: 11/17/2025, 6:09:15 PM (32m after posting)
  3. Peak activity: 147 comments in Day 1, the hottest window of the conversation
  4. Latest activity: 11/19/2025, 1:37:20 PM (5h ago)


Discussion (220 comments)
Showing 160 comments of 220
philipallstar
2d ago
3 replies
> In the past, there was nothing we could do about people in another country. Peter Singer says that’s just an evolutionary hangover, a moral error.

This is sadly still true, given the percentage of money that goes to getting someone some help vs the amount dedicated to actually helping.

weepinbell
2d ago
2 replies
Certainly charities exist that are ineffective, but there is very strong evidence that there exist charities that do enormous amounts of direct, targeted good.

givewell.org is probably the most prominent org recommended by many EAs that does and aggregates research on charitable interventions and shows with strong RCT evidence that a marginal charitable donation can save a life for between $3,000 and $5,500. This estimate has uncertainty, but there's extremely strong evidence that money to good charities like the ones GiveWell recommends massively improves people's lives.

GiveDirectly is another org that's much more straightforward - giving money directly to people in extreme poverty, with very low overheads. The evidence that that improves people's lives is very very strong (https://www.givedirectly.org/gdresearch/).

It absolutely makes sense to be concerned about "is my hypothetical charitable donation actually doing good", which is more or less a premise of the EA movement. But the answer seems to be "emphatically, yes, there are ways to donate money that do an enormous amount of good".

gopher_space
2d ago
2 replies
> giving money directly to people in extreme poverty, with very low overheads. The evidence that that improves people's lives is very very strong

When you see the return on money spent this way other forms of aid start looking like gatekeeping and rent-seeking.

weepinbell
1d ago
1 reply
GiveWell actually benchmarks their charity recommendations against direct cash transfers and will generally only recommend charities whose benefits are Nx cash for some N that I don't remember off the top of my head. I buy that lots of charities aren't effective, but some are!

That said I also think that longer term research and investment in things like infrastructure matters too and can't easily be measured as an RCT. GiveWell style giving is great and it's awesome that the evidence is so strong (and it's most of my charitable giving), but that doesn't mean other charities with less easily researched goals are bad necessarily.
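
As a rough illustration of the benchmarking idea described above, here is a minimal Python sketch; the threshold and the per-charity numbers are hypothetical, since the actual value of N isn't given in the comment.

```python
# Illustrative sketch of benchmarking charities against direct cash transfers.
# The threshold and the per-charity multiples below are hypothetical, not
# GiveWell's actual figures.

CASH_BENCHMARK = 1.0             # benefit of $1 given directly, normalized to 1x
RECOMMENDATION_THRESHOLD = 10.0  # hypothetical "Nx cash" bar a charity must clear

candidate_charities = {
    "charity_a": 12.5,  # estimated benefit per dollar, as a multiple of cash
    "charity_b": 4.0,
    "charity_c": 15.0,
}

recommended = {
    name: multiple
    for name, multiple in candidate_charities.items()
    if multiple >= RECOMMENDATION_THRESHOLD * CASH_BENCHMARK
}

print(recommended)  # {'charity_a': 12.5, 'charity_c': 15.0}
```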

zozbot234
1d ago
The Open Philanthropy Project is one major actor in EA that focuses mostly on "less easily researched goals" and riskier (but potentially higher-impact on average) giving than GiveWell.
rincebrain
1d ago
1 reply
Eventually, almost any organization distorts from its nominal goal to self-perpetuation.

As the numbers get larger, it becomes easier and easier to suggest that the organization's continued existence is still a net positive as you waste more and more on the organization bloating.

potato3732842
20h ago
> consider how the ACA required that 85% of premiums go to care, and how that meant that the incentives became for the prices to become enormous.

To be fair, that particular example was obvious from day 1.

philipallstar
1d ago
That's fantastic, and I think most charities start this way.
cm2012
2d ago
1 reply
You can pretty reliably save a life in a 3rd world country for about $5k each right now.
tavavex
2d ago
1 reply
How? I'm curious because the numbers are so specific ($5000 = 1 human life), unclouded by the usual variances of getting the money to people at a macro scale and having it go through many hands and across borders. Is it related to treating a specific illness that just objectively costs that much to treat?
cm2012
2d ago
1 reply
Here is a detailed methodology: https://www.givewell.org/impact-estimates. It convinced me that $5k is a reasonable estimate.
bombcar
1d ago
3 replies
A weird corollary to this is that if you work for one of these charities, you’re paid in human lives (say you make $50k, that’s ten people who could have been saved).
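
The arithmetic behind that framing is just a division by the cost-per-life estimate quoted upthread; a minimal sketch, assuming the ~$5,000 figure (which is itself uncertain):

```python
# Back-of-envelope arithmetic from the comment above: a salary expressed as the
# number of lives that sum could fund at the ~$5,000-per-life estimate quoted
# upthread. The estimate is uncertain; this is purely illustrative.

COST_PER_LIFE_USD = 5_000

def salary_in_lives(salary_usd: float) -> float:
    """Convert an annual salary into the number of lives that sum could fund."""
    return salary_usd / COST_PER_LIFE_USD

print(salary_in_lives(50_000))  # 10.0, matching the comment's example
```
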
lmm
1d ago
1 reply
That's an extremely weird way to think about it. The same logic applies to anyone doing any job - whatever money you spend on yourself could be spent saving lives instead, if you really want to think about it that way. There's no reason that people working for an effective charity should feel more guilty about their salaries than people working for any other job - if anything it's the opposite, since salaries usually do not reflect the full value of a person's work.
philipallstar
1d ago
1 reply
> That's an extremely weird way to think about it

Perhaps, but it's exactly the type of thinking the article is describing.

lmm
1d ago
No it isn't. EA folks do not think that people who work for charities specifically should be paid less or feel guiltier about their salaries (indeed witness the whole Scottish Castle drama, if anything it's the opposite).
tovej
1d ago
The reasonable way to think of it is that if you were not paid those 50k, the charity would be less able to deliver on its work. The salary would be amortized over the entire number of people being helped by the charity, eventually becoming a negligible overhead.
cm2012
23h ago
No more than anyone is paid in human lives.
jimbokun
2d ago
Peter Singer is the LAST person I would go to for advice on morality or ethics.
jmount
2d ago
1 reply
Effective Altruism and Utilitarianism are just a couple of the presentations authoritarians sometimes make for convenience. To me they decode simply as "if I had everything now, that would eventually be good for everybody."

The arguments always feel to me too similar to "it is good Carnegie called in the Pinkertons to suppress labor, as it allowed him to build libraries." Yes, what Carnegie did later was good, but it doesn't completely paper over what he did earlier.

lesuorac
2d ago
3 replies
> The arguments always feel to me too similar to "it is good Carnegie called in the Pinkertons to suppress labor

Is that an actual EA argument?

The value is all at the margins. Like Carnegie had legitimate functional businesses that would be profitable without Pinkerton's. So without Pinkerton's he'd still be able to afford probably every philanthropic thing he did so it doesn't justify it.

I don't really follow the EA space but the actual arguments I've heard are largely about working in FANG to make 3x the money outside of fang to allow them to donate 1x ~1.5x the money. Which to me is very justifiable.

But to stick with the article. I don't think taking in billions via fraud to donate some of it to charity is a net positive on society.

hobs
2d ago
1 reply
When you work for something that directly contradicts peaceful civil society, you are basically saying the mass murder of today is OK because it allows you to assuage your guilt by giving to your local charity - it's only effective if altruism is not your goal.
lesuorac
1d ago
It still depends on the marginal contribution.

A janitor at the CIA in the 1960s is certainly working at an organization that is disrupting the peaceful Iranian society and turning it into a "death to America" one. But I would not agree that they're doing a net-negative for society because the janitor's marginal contribution towards that objective is 0.

It might not be the best thing the janitor could do to society (as compared to running a soup kitchen).

8note
1d ago
1 reply
> I don't think taking in billions via fraud to donate some of it to charity is a net positive on society.

it could be though, if by first centralizing those billions, you could donate more effectively than the previous holders of that money could. the fraud victims may have never donated in the first place, or have donated to the wrong thing, or not enough to make the right difference.

JohnFen
1d ago
2 replies
"The ends justify the means" is a terrible, and terribly dangerous, argument.
jmount
1d ago
That is the point. Much clearer than I was. Thank you.
Sporktacular
1d ago
But if it's a net positive, the point is made.
Eisenstein
2d ago
> Is that an actual EA argument?

you missed this part: "The arguments always feel to me too similar"

> The value is all at the margins. Like Carnegie had legitimate functional businesses that would be profitable without Pinkerton's. So without Pinkerton's he'd still be able to afford probably every philanthropic thing he did so it doesn't justify it.

That isn't what OP was engaging with though, they aren't asking for you to answer the question 'what could Carnegie have done better' they are saying 'the philosophy seems to be arguing this particular thing'.

libraryofbabel
2d ago
1 reply
I expect the book itself (Death in a Shallow Pond: A Philosopher, a Drowning Child, and Strangers in Need, by David Edmonds) is good, as the author has written a lot of other solid books making philosophy accessible. The title of the article though, is rather clickbaity: it’s hardly “recovering” the origins of EA to say that it owes a huge debt to Peter Singer, who is only the most famous utilitarian philosopher of the late 20th century!

(Peter Singer’s books are also good: his Hegel: A Very Short Introduction made me feel kinda like I understood what Hegel was getting at. I probably don’t of course, but it was nice to feel that way!)

dang
2d ago
Ok, we've de-recovered the origins in the title above.
CactusBlue
2d ago
1 reply
> I think they’re recovering. They’ve learned a few lessons, including not to be too in hock to a few powerful and wealthy individuals.

I do not believe the EA movement to be recoverable; it is built on flawed foundations and its issues are inherent. The only way I see out of it is total dissolution; it cannot be reformed.

flexagoon
1d ago
Which of the foundations is flawed, the "we have the ability to help others and should use it" or the "some ways of helping others are more effective than others"?
hexator
2d ago
2 replies
I find it to be a dangerous ideology since it can effectively be used to justify anything. I joined an EA group online (from a popular YouTube channel) and the first conversation I saw was a thread by someone advocating for eugenics. And it only got worse from there.

> A paradox of effective altruism is that by seeking to overcome individual bias through rationalism, its solutions sometimes ignore the structural bias that shapes our world.

Yes, this just about sums it up. As a movement they seem to be attracting some listless contrarians that seem entirely too willing to dig up old demons of the past.

nullc
2d ago
2 replies
> through rationalism,

When they write "rationalism" you should read "rationalization".

XorNot
2d ago
1 reply
It's a variant of how you instantly know what a government will be like depending how much democracy they put in their name.
potato3732842
20h ago
"If a state calls itself a commonwealth you know it sucks for the common man there" or something like it is one I've heard before.

It's at least 50% right in my experience.

chrisweekly
2d ago
Yes! It's a crucial distinction. Rationalism is about being rational / logical -- moving closer to neutrality and "truth". Whereas to rationalize something is often about masking selfish motives, making excuses, or (self-)deception -- moving away from "truth".
mikkupikku
2d ago
Agreed. It's firmly an "ends justify the means" ideology, reliant on accurately predicting future outcomes to justify present actions. This sort of thing gives free license to any sociopath with enough creativity to spin some yarn with handwavy math about the bad outcome their malicious actions are meant to be preventing.
keiferski
2d ago
17 replies
The popularity of EA always seemed pretty obvious to me: here's a philosophy that says it doesn't matter what kind of person you are or how you make your fortune, as long as you put some amount of money toward problems. Exploiting people to make money is fine, as long as some portion of that money is going toward "a good cause." There is really no element of personal virtue in the way that virtue ethics has... it's just pure calculation.

It's the perfect philosophy for morally questionable people with a lot of money. Which is exactly who got involved.

That's not to say that all the work they're doing/have done is bad, but it's not really surprising why bad actors attached themselves to the movement.

nonethewiser
2d ago
8 replies
>The popularity of EA always seemed pretty obvious to me: here's a philosophy that says it doesn't matter what kind of person you are or how you make your fortune, as long as you put some amount of money toward problems. Exploiting people to make money is fine, as long as some portion of that money is going toward "a good cause."

I don't think this is a very accurate interpretation of the idea - even with how flawed the movement is. EA is about donating your money effectively, i.e. ensuring the donation gets used well. On its face, that's kind of obvious. But when you take it to an extreme, you blur the line between "donation" and something else. It has selected for very self-righteous people. But the idea itself is not really about excusing you being a bad person, and the donation target is definitely NOT unimportant.

some_guy_nobel
2d ago
6 replies
You claim OP's interpretation is inaccurate, while it tracks perfectly with many of EA's most notorious supporters.

Given that contrast, I'd ask what evidence do you have for why OP's interpretation is incorrect, and what evidence do you have that your interpretation is correct?

jandrese
2d ago
1 reply
It's like libertarianism. There is a massive gulf between the written goals and the actual actions of the proponents. It might be more accurately thought of as a vehicle for plausible deniability than an actual ethos.
glenstein
1d ago
2 replies
The problem is that creates a kind of epistemic closure around yourself where you can't encounter such a thing as a sincere expression of it. I actually think your charge against Libertarians is basically accurate. And I think it deserves a (limited) amount of time and attention directed at its core contentions for what they are worth. After all, Robert Nozick considered himself a libertarian and contributed some important thinking on things like justice and retribution and equality and any number of subjects, and the world wouldn't be bettered by dismissing him with twitter style ridicule.

I do agree that things like EA and Libertarianism have to answer for the in-the-wild proponents they tend to attract but not to the point of epistemic closure in response to its subject matter.

Eisenstein
1d ago
1 reply
When a term becomes loaded enough then people will stop using it when they don't want to be associated with the loaded aspects of the term. If they don't then they already know what the consequences are, because they will be dealing with them all the time. The first and most impactful consequence isn't 'people who are not X will think I am X' it is actually 'people who are X will think I am one of them'.
glenstein
1d ago
1 reply
I think social dynamics are real and must be answered for, but I don't think any self-correction or lack thereof has anything to do with the subject matter, which can be understood independently.

I will never take a proponent of The Bell Curve seriously who tries to say they're "just following the data", because I do hold them and the book responsible for their social and cultural entanglements and they would have to be blind to ignore it. But the book is wrong for reasons intrinsic to its analysis and it would be catastrophic to treat that point as moot.

Eisenstein
1d ago
3 replies
[delayed]
glenstein
1d ago
1 reply
You risk catastrophe if you let social dynamics stand in for truth.
Eisenstein
1d ago
[delayed]
mitthrowaway2
1d ago
3 replies
Some very bad people believe that the sky is blue. Does that incline you towards believing instead that it's green?
Eisenstein
1d ago
My claim is not that people abandon beliefs but that they abandon labels when the label takes on connotations they do not want to be associated with.
Eisenstein
1d ago
[delayed]
Eisenstein
1d ago
No one has ever asked me what color I believe the sky is, and I don't make it part of my identity.
int_19h
1d ago
If people really believe in something, it stands to reason that they aren't willing to just give up on the associated symbolism because someone basically hijacked it.

Coincidentally, libertarian socialism is also a thing.

nyeah
1d ago
[delayed]
RobinL
1d ago
2 replies
> many of EA's most notorious supporters.

The fact they're notorious makes them a biased sample.

My guess is for the majority of people interested in EA - the typical supporter who is not super wealthy or well known - the two central ideas are:

- For people living in wealthy countries, giving some % of your income makes little difference to your life, but can potentially make a big difference to someone else's

- We should carefully decide which charities to give to, because some are far more effective than others.

That's pretty much it - essentially the message in Peter Singer's book: https://www.thelifeyoucansave.org/.

I would describe myself as an EA, but all that means to me is really the two points above. It certainly isn't anything like an indulgence that morally offsets poor behaviour elsewhere.

Eddy_Viscosity2
1d ago
1 reply
I would say the problem with EA is the "E". Saying you're doing 'effective' altruism is another way of saying that everyone else's altruism is wasteful and ineffective. Which of course isn't the case. The "E" might as well stand for "Elitist", given the vibe it gives off. All truly altruistic acts would aim to be effective, otherwise it wouldn't be altruism - it would just be waste. Not to say there is no waste in some altruistic acts, but I'm not convinced it's actually any worse than EA. Given the fraud associated with some purported EA advocates, I'd say EA might even be worse. The EA movement reeks of the optimize-everything mindset of people convinced they are smarter than everyone else who just gives money to charity A, when they could have been 13% more effective if they sent the money directly to this particular school in country B with the condition they only spend it on X. The origins of EA may not be that, but that's what it has evolved into.
estearum
1d ago
2 replies
A lot of altruism is quite literally wasteful and ineffective, in which case it's pretty hard to call it altruism.

> they could have been 13% more effective

If you think the difference between ineffective and effective altruism is a 13% spread, I fear you have not looked deeply enough into either standard altruistic endeavors or EA to have an informed opinion.

The gaps are actually astonishingly large. Which is the impetus for the entire train of thought.

mcv
1d ago
1 reply
It's absolutely worth looking at how effective the charities you donate to really are. Some charities spend a lot of money on fundraising to raise more funds, and then reward their management for raising so much, with only a small amount being spent on actual help. Others are primarily known for their help.

Especially rich people's vanity foundations are mostly a channel for dodging taxes and channeling corruption.

I donate to a lot of different organisations, and I do check which do the most good. Red Cross and Doctors Without Borders are very effective and always worthy of your donation, for example. Others are more a matter of opinion. Greenpeace has long been the only NGO that can really take on giant corporations, but they've also made some missteps over the years. Some are focused on helping specific people, like specific orphans in poor countries. Does that address the general poverty and injustice in those countries? Maybe not, but it does make a real difference for somebody.

And if you only look at the numbers, it's easy to overlook the individuals. The homeless person on the street. Why are they homeless, when we are rich? What are we doing about that?

But ultimately, any charity that's actually done is going to be more effective than holding off because you're not sure how optimal it is. By all means optimise how you spend it, but don't let doubts hold you back from doing good.

HDThoreaun
21h ago
> Some charities spend a lot of money on fundraising to raise more funds

EA is about much more than this. Even among the same cause areas, different interventions can vary greatly in their efficacy. So say you're interested in helping the homeless in your area. You can think of a few different interventions for them, ranging from giving them free housing, to job training, to just handing them cash. The question EA asks is which of those options you should be spending your money on. It doesn't have to turn into "it's immoral to do anything other than give to the global poor" or "we need to consider the unborn masses" if you don't want to go there.

Eddy_Viscosity2
1d ago
1 reply
> the gaps are actually astonishingly large

For sure this is the case. But just knowing what you are donating to doesn't need some sort of special designation. Like, yes, A is in fact much better than B, so I'll donate to A instead of B - that's no different than any other decision where you'd weigh options. It's like inventing 'effective shopping'. How is it different from regular shopping? Well, with ES, you evaluate the value and quality of the thing you are buying against its price, and you might also read reviews or talk to people who have used the different products before. It's a new philosophy of shopping that no one has ever thought of before, and it's called 'effective shopping'. Only smart people are doing it.

estearum
1d ago
1 reply
The principal idea behind EA is that people often want their money to go as far as possible, but their intuitions for how to do that are way, way off.

Nobody said or suggested that only smart people can, should, or are "doing EA." What people observe are these knee-jerk reactions against what is, as you say, a fairly obvious idea once stated.

However it being an obvious idea once stated does not mean people intuitively enact that idea, especially prior to hearing it. Thus the need to label the approach

Eddy_Viscosity2
1d ago
1 reply
> However it being an obvious idea once stated does not mean people intuitively enact that idea, especially prior to hearing it. Thus the need to label the approach

This has some truth to it and if EA were primarily about reminding people that not all donations to charitable causes pack the same punch and that some might even be deleterious, then I wouldn't have any issues with it at all. But that's not what it is anymore, at least not the most notable version of it. My knee jerk reaction to it comes from this version. The one where narcissistic tech bros posture moral and intellectual superiority not only because they give, but because they give better than you.

Joeboy
1d ago
Out of interest, do you identify any of the comments in this discussion as that kind of posturing? The "pro-EA" comments I see here seem (to me) to be fairly defensive in character. Whereas comments attacking EA seem pretty strident. Are you perceiving something different?
mcv
1d ago
I agree. I think the criticism of EA's most notorious supporters is warranted, but it's criticism of those notorious supporters and the people around them, not the core concept of EA itself.

The core notions as you state them are entirely a good idea. But the good you do with part of your money does not absolve you for the bad things you do with the rest, or the bad things you did to get rich in the first place.

Mind you, that's how the rich have always used philanthropy; Andrew Carnegie is now known for his philanthropy, but in life he was a brutal industrialist responsible for oppressive working conditions, strike breaking, and deaths.

Is that really effective altruism? I don't think so. How you make your money matters too. Not just how you spend it.

cortesoft
1d ago
1 reply
Well, in order to be a notorious supporter of EA, you have to have enough money for your charity to be noticed, which means you are very rich. If you are very rich, it means you have to have made money from a capitalistic venture, and those are inherently exploitive.

So basically everyone who has a lot of money to donate has questionable morals already.

The question is, are the large donators to EA groups more or less 'morally suspect' than large donors to other charity types?

In other words, everyone with a lot of money is morally questionable, and EA donors are just a subset of that.

nl
1d ago
2 replies
> you have to have made money from a capitalistic venture, and those are inherently exploitive.

You say this like it's fact beyond dispute, but I for one strongly disagree.

Not a fan of EA at all though!

tovej
1d ago
2 replies
For very much money, as in, let's say, more than 1000x the median person in the wealth distribution, I'd say it's obviously true.

You cannot make 1000x the average person's wealth by acting morally. Except possibly by winning the lottery.

A person is not capable of creating that wealth. A group of people have created that wealth, and the 1000x individual has hoarded it to themselves instead of sharing it with the people who contributed.

If you are a billionaire, you own at least 5000x the median (which is about $200,000 in the US). If you're a big tech CEO, you own somewhere around 50-100,000x the median. These are the biggest proponents of EA.

The bottom 50% now only own about 2% of the wealth, the top 10% own two thirds, the top 1% owns a whole third, and it's only getting worse. Who is responsible for the wealth inequality? The people at the right edge of the Lorenz curve. They could fix it, but don't. That is why they are exploitative.

zozbot234
1d ago
1 reply
> You cannot make 1000x the average person's wealth by acting morally. Except possibly by winning the lottery.

The risk profile of early startup founders looks a lot like "winning the lottery", except that the initial investment (in terms of time, effort and lost opportunities elsewhere as well as pure monetary ones) is orders of magnitude higher than the cost of a lottery ticket. There's only a handful of successful unicorns vs. a whole lot of failed startups.

tovej
1d ago
1 reply
The risk profile being the same does not mean that the actions are the same. The unicorns that make it rich invariably have some way of screwing over someone else: either workers, users, or smaller competitors.

For Google and Facebook, users' data was sold to advertisers, and their behaviour is manipulated to benefit the company and its advertising clients. For Amazon, the workers are squeezed for all the contribution they can give and let go once they burn out, and they manipulate the marketplace that they govern to benefit them. If you make multiple hundreds of millions, you are either exploiting someone in the above way, or you are extracting rent from them.

Just looking at the wealth distribution is a good way to see how unicorns are immoral. If you suddenly shoot up into the billionaire class, you are making the wealth distribution worse, because your money is accruing from the less wealthy proportion of society.

That unicorns propagate this inequality is harmful in itself. The entire startup scene is also often a fishing pond for existing monopolies. The unicorns are sold to the big immoral actors, making them more powerful.

What is taken away when inequality becomes worse is political power and agency. Maybe other contributors close to the founders are better off, but society as a whole is worse off.

zozbot234
1d ago
1 reply
The problem with your argument is that most organizations by far that engage in these detrimental, anti-social behaviors are not unicorns at all! So what makes unicorns special and exceptional is the fact that they nonetheless manage to create outsized value, not just that they sometimes screw people over. Perhaps unicorns do technically raise inequality, but by and large, they do so while making people richer, not poorer.
tovej
1d ago
Could you please back that up with some evidence. Right now you're just claiming that there are a lot of anti-social businesses but that unicorns are separate from this.

That's quite a claim, as there's a higher probability of unicorns screwing people over. If a unicorn lives long enough it ends up at the top of the wealth pyramid. As far as I can tell, all of the _big_ anti-social actors were once unicorns.

That most organizations engaging in bad behavior aren't unicorns says nothing, because by definition most companies aren't unicorns. If unicorns are less than 0.1% of all companies, then for almost any kind of bad behavior, most of the companies engaging in it will be non-unicorns simply because of the base rate.
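
A minimal sketch of that base-rate point, with purely hypothetical rates: even if unicorns misbehaved at a far higher rate than other companies, most misbehaving companies would still be non-unicorns, because unicorns are such a tiny share of all companies.

```python
# Base-rate illustration with hypothetical numbers: unicorns are assumed to
# misbehave at a much higher *rate*, yet most misbehaving companies are still
# non-unicorns, simply because unicorns are a tiny fraction of all companies.

total_companies = 1_000_000
unicorn_share = 0.001        # 0.1% of companies are unicorns (assumption)
p_bad_given_unicorn = 0.30   # hypothetical misbehavior rate among unicorns
p_bad_given_other = 0.05     # hypothetical rate among everyone else

unicorns = total_companies * unicorn_share
others = total_companies - unicorns

bad_unicorns = unicorns * p_bad_given_unicorn  # 300
bad_others = others * p_bad_given_other        # 49,950

share_of_bad_that_are_unicorns = bad_unicorns / (bad_unicorns + bad_others)
print(f"{share_of_bad_that_are_unicorns:.1%}")  # ~0.6% of bad actors are unicorns
```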

nl
1d ago
1 reply
I think Yvon Chouinard has acted morally throughout his career. His net reported wealth was $3B before he gave his company to the trust he created.

He's far from the only example.

I understand the distribution of wealth. I agree that in the US in particular it is setup to exploit poor people.

I don't think being rich is immoral.

tovej
1d ago
You think the wealth inequality is set up to exploit poor people, but you don't think contributing to the wealth inequality is immoral.

That's an interesting position. I would guess that in order to square these two beliefs you either have to think exploiting the poor is moral (unlikely) or that individuals are not responsible for their personal contributions to the wealth inequality.

I'm interested to hear how you argue for this position. It's one I rarely see.

cortesoft
1d ago
1 reply
Fair to disagree on that point, but I think the people who would find the EA supporters “morally questionable” feel that way for reasons that would apply to all rich people. I would be curious to hear what attributes EA supporters have that other rich people don’t.
nl
1d ago
2 replies
I think the idea that future lives have value, and that the value of those lives can outweigh the value of actual living people today, is extremely immoral.

To quote[1]:

> In Astronomical Waste, Nick Bostrom makes a more extreme and more specific claim: that the number of human lives possible under space colonization is so great that the mere possibility of a hugely populated future, when considered in an “expected value” framework, dwarfs all other moral considerations.

[1] https://blog.givewell.org/2014/07/03/the-moral-value-of-the-...

cortesoft
1d ago
1 reply
> I think the idea the future lives have value, and the value of those lives can outweigh the value of actual living people today is extremely immoral.

This is an interesting take. So if we found out for certain that an action we are taking today is going to kill 100% of humans in 200 years, it would be immoral to consider that as a factor in making decisions? None of those people are living today, obviously, so that means we should not worry about their lives at all?

nl
16h ago
The extreme form of the argument ("don't worry about the future at all") isn't what I'm saying. It is also immoral to not consider the future.

But to put future lives on the same scale as current lives (as in, to allow for the possibility of measuring one against the other) is immoral.

Future lives are important, but balancing them against current lives is immoral.

addaon
22h ago
Isn't this just the Thanos argument, though? Given the huge number of possible future lives under space colonization, all of them ending inevitably in death and suffering, no amount of trying to improve those lives can ever have as much of a positive impact as just avoiding them by pushing for, say, nuclear self-annihilation now, because the somewhat larger suffering for a much, much smaller number of people is a higher "expected value"? I'm not really keen on moral arguments that end up arguing for nuclear war…
btilly
1d ago
The OP's interpretation is an inaccurate summary of the philosophy. But it is an excellent summary of the trap that people who try to follow EA can easily fall into. Any attempt to rationally evaluate charity work can instead wind up rationalizing what they wanted to do anyway, settling for the convenient and self-aggrandizing "analysis" rather than a rigorous one.

An even worse trap is to prioritize a future utopia. Utopian ideals are dangerous. They push people towards "the ends justify the means". If the ends are infinitely good, there is no bound on how bad the "justified means" can be.

But history shows that imagined utopias seldom materialize. By contrast the damage from the attempted means is all too real. That's why all of the worst tragedies of the 20th century started with someone who was trying to create a utopia.

EA circles have shown an alarming receptiveness to shysters who are trying to paint a picture of utopia. For example look at how influential someone like Samuel Bankman-Fried was able to be, before his fraud imploded.

stickfigure
1d ago
> tracks perfectly with many of EA's most notorious supporters

Just wait until you find out about vegetarianism's most notorious supporter.

socalgal2
1d ago
this feels like "the most notorious atheists/jews/blacks/whites/christians/muslims are bad, therefore all atheists/jews/blacks/whites/christians/muslims are bad"
klustregrif
2d ago
1 reply
> EA is about donating your money effectively

For most, it seems, EA is an argument that despite no charitable donations being made at all, and despite gaining wealth through questionable means, it's still all ethical, because it's theoretically "just more effective" if the person keeps claiming that they will, in the far future, put some money toward these hypothetical "very effective" charitable causes - which just never seem to materialize, and which of course shouldn't be pursued "until you've built your fortune".

Aunche
1d ago
1 reply
If you're going to assign a discount rate for cash, you also need to assign a similar "discount rate" for future lives saved. Just like investments compound, giving malaria medicine and vitamins to kids who need them should produce at least as much in positive compounding returns.
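
A minimal sketch of that symmetry, with a hypothetical discount rate and horizon: if a pledge of future giving is discounted the way future cash is, its present value in lives saved shrinks accordingly.

```python
# Present-value sketch with hypothetical numbers: discounting a future pledge
# of "lives saved" at the same kind of rate one would apply to future cash.

def present_value(amount: float, annual_rate: float, years: int) -> float:
    """Discount a future amount back to today at a constant annual rate."""
    return amount / (1 + annual_rate) ** years

lives_saved_by_giving_now = 10   # e.g. $50k today at ~$5k per life
lives_pledged_in_ten_years = 20  # a larger donation promised later
discount_rate = 0.07             # hypothetical annual rate

pv = present_value(lives_pledged_in_ten_years, discount_rate, years=10)
print(round(pv, 1))  # ~10.2, roughly on par with giving now
```
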
faidit
1d ago
That future promise doesn't do much good if the planet is dead by the time these guys get around to donating, thanks to the ecological catastrophe caused by their supposedly well-intentioned greed. Also, EA proponents tend to ignore society's opportunity cost here - that money could have been taxed and put to good uses by the public in the meantime. Whatever the inefficiencies of the public sector, at least we can do something to fix it now instead of trusting on the promises of rich sociopaths to do something decades from today.
ghurtado
2d ago
1 reply
I don't see anything in your comment that directly disagrees with the one that you've replied to.

Maybe you misinterpreted it? To me, it was simply saying that the flaw in the EA model is that a person can be 90% a dangerous sociopath and, as long as the 10% goes to charity (effectively), they are considered morally righteous.

It's the 21st century version of Papal indulgences.

HDThoreaun
1d ago
The thing is that dangerous sociopaths will be dangerous sociopaths either way. What’s the downside in convincing them to donate 10% of their income to effective causes?
pjc50
1d ago
3 replies
> EA is about

A friend of mine used to "gotcha" any use of the expression "X is about Y", which was annoying but trained a useful intellectual habit. That may have been what EA's original stated intent was, but then you have to look at what people actually say and do under the name of EA.

Joeboy
1d ago
2 replies
As per conversation elsewhere, I think you've fallen for some popular but untrue / unfair narratives about EA.

But I want to take another tack. I never see anybody make the following argument. Probably that's because other people wisely understand how repulsive people find it, but I want to try anyway, possibly because I have undiagnosed autism.

EA-style donations have saved hundreds of thousands of lives. I know there are people who will quibble about the numbers, but I don't think you can sensibly dispute that EA has saved a lot of lives. This never seems to appear in people's moral calculus, like at all. Most of those saved are people who are poor, distant and powerless, but nevertheless, do they not count for something?

I know I'm doing utilitarianism and people hate it, but I just don't get how these lives don't count for something. Can you sell me on the idea that we should let more poor people die of preventable diseases in exchange for a more spotless moral character?

nemomarx
1d ago
1 reply
I don't think the complaint is really the donations or the impact, rather it's that the community has issues?

Whether you agree that someone can put money into saving lives to make up for other moral faults or issues or so on is the core issue. And even from a utilitarian view we'd have to say that more of these donations happened than would have without the movement or with a different movement, which is difficult to measure. Consider the USAID thing - Elon Musk may have wiped out most of the EA community's gains by causing that defunding, and was probably supported by the community in some sense. How to weigh in all these factors?

Joeboy
1d ago
1 reply
> Whether you agree that someone can put money into saving lives to make up for other moral faults or issues or so on is the core issue

For me the core issue is why people are so happy to advocate for the deaths of the poor because of things like "the community has issues". Of course the withdrawal of EA donations is going to cause poor people to die. I mean yes, some funding will go elsewhere, but a lot of it's just going to go away. Sorry to vent but people are so endlessly disappointing.

> Elon Musk may have wiped out most of the EA community's gains by causing that defunding

For sure!

> and was probably supported by the community in some sense

You sound fairly under-confident about that, presumably because you're guessing. It's wildly untrue.

nemomarx
1d ago
1 reply
I can't imagine EA people supported the USAID decision specifically - but the Silicon Valley environment, the investing bubble, our entire tech culture is why Musk has the power he does, right?

And the rationalist community writ large is very much part of that. The whole idea that private individuals should get to decide whether or not to do charity, or can casually stop giving funds, or that so much money needs to be tied up in speculative investments and so on - I find that all pretty distasteful. Should life or death matters be up to whims like this?

Joeboy
1d ago
1 reply
> the silicon valley environment, the investing bubble, our entire tech culture is why Musk has the power he does, right?

For sure, not quibbling with any of that. The part I don't get is why it's EA's fault, at least more than it's many, many other people and organizations' fault. EA gets the flak because it wants to take money from rich people and use it to save poor people's lives. Not because it built the Silicon Valley environment / tech culture / investing bubble.

> I find that all pretty distasteful. Should life or death matters be up to whims like this?

Referring back to my earlier comment, can you sell me on the idea that they shouldn't? If you think aid should come from taxes, sell me on the idea that USAID is less subject to the whims of the powerful than individual donations. Also sell me on the idea that overseas aid will naturally increase if individual donations fall. Or, sell me on the idea that the lives of the poor don't matter.

nemomarx
1d ago
1 reply
For decades things like usaid were bipartisan and basically untouchable, so that and higher taxes would have been a fairly secure way to do things. The question is can that be accomplished again, or do we need a thorough overhaul of who's in power in various parts of society?

None of this will happen naturally though. We need to make it happen. So ultimately my position is that we need to aim efforts at making these changes, possibly at a higher priority than individual giving - if you can swing elections or change systems of government the potential impact is very high in terms of policy change and amount of total aid, and also in terms of how much money we allow the rich to play and gamble with. None of these are natural states of affairs.

Joeboy
1d ago
(Sincerely) good luck with that, but I don't see why it means we should be against saving the lives of poor people in the immediate term. At some point we might just have to put it down to irreconcilably different mental wiring.
nyeah
1d ago
[delayed]
HDThoreaun
1d ago
The vast majority of EA money goes to givewell and their mission to serve the global poor. Some people have obviously abused the earn to give idea but most effective altruists are just trying to think about ways to be more effective with their giving.
Recursing
1d ago
> you have to look at what people actually say and do under the name of EA.

They donate a significant percentage of their income to the global poor, and save thousands of lives every year (see e.g. https://www.astralcodexten.com/p/in-continued-defense-of-eff... )

cyanydeez
5h ago
It's an illogical theory, even if practiced in good faith.

Just because the market pays for one activity does not mean its externalities are equally solved by the market's valuation.

From basic physics, it's akin to saying you can drop a vase and return it to its pre-dropped state with equal effort.

Entropy alone prevents EA.

JohnMakin
1d ago
It is a little bit, though. Using these lines of thinking, it becomes extraordinarily easy to excuse, justify, or even paint as a good thing highly unethical or immoral actions.

For instance -

If I find some sort of fraud that will harm X number of users, but make me Y dollars - if Y > (harm caused), not doing (fraud making me Y dollars) could be interpreted as being "inefficient" with your resources or causing more harm. It's very easy to use the philosophy in this manner, and of course many see it as a huge perk. The types of people drawn to it are all much the same.

WhyOhWhyQ
1d ago
The op and your reply are basically guaranteed text on the page whenever EA comes up (not that your reply is unwarranted, or the op's message is either, but it is interesting that these are guaranteed comments).
glenstein
2d ago
I actually think I agree with this, but nevertheless people can refer to EA and mean by it the totality of sociological dynamics surrounding it, including its population of proponents and their histories.

I actually think EA is conceptually perfectly fine within its scope of analysis (once you start listing examples, e.g. mosquito nets to prevent malaria, I think they're hard to dispute), and the desire to throw out the conceptual baby with the bathwater of its adherents is an unfortunate demonstration of anti-intellectualism. I think it's like how some predatory pickup artists do the work of being proto-feminists (or perhaps more to the point, how actual feminists can nevertheless be people who engage in the very kinds of harms studied by the subject matter). I wouldn't want to make feminism answer for such creatures as definitionally built into the core concept.

nxor
2d ago
1 reply
SBF has entered the chat
AgentME
2d ago
1 reply
I'm tired of every other discussion about EA online assuming that SBF is representative of the average EA member, instead of being an infamous outlier.
nxor
1d ago
What reasons at all do you have?
phantasmish
2d ago
3 replies
I’m skeptical of any consequentialist approach that doesn’t just boil down to virtue ethics.

Aiming directly at consequentialist ways of operating always seems to either become impractical in a hurry, or get fucked up and kinda evil. Like, it's so consistent that anyone thinking they've figured it out needs to have a good hard think about it for several years before tentatively attempting action based on it, I'd say.

jrochkind1
2d ago
2 replies
What does "virtue ethics" mean?
keiferski
2d ago
1 reply
One of the three traditional European philosophy approaches to ethics:

https://en.wikipedia.org/wiki/Virtue_ethics

EA being a prime example of consequentialism.

phantasmish
2d ago
… and I tend to think of it as the safest route to doing OK at consequentialism, too, myself. The point is still basically good outcomes, but it short-circuits the problems that tend to come up when one starts trying to maximize utility/good, by saying “that shit’s too complicated, just be a good person” (to oversimplify and omit the “draw the rest of the fucking owl” parts)

Like you’re probably not going to start with any halfway-mainstream virtue ethics text and find yourself pondering how much you’d have to be paid to donate enough to make it net-good to be a low-level worker at an extermination camp. No dude, don’t work at extermination camps, who cares how many mosquito nets you buy? Don’t do that.

TimorousBestie
2d ago
1 reply
The best statement of virtue ethics is contained in Alasdair MacIntyre's _After Virtue_. It's a metaethical foundation that argues that both deontology and utilitarianism are incoherent and have failed to explain what some unitary "the good" is, and that ancient notions of "virtues" (some of which have filtered down to the present day) can capture facets of that good better.
glenstein
1d ago
1 reply
Probably a topic for a different day, but it's rare to get someone's nutshell version of ethics so concise and clear. For me, my concern would be letting the evolutionary tail wag the dog, so to speak. Utility has the advantage of sustaining moral care toward people far away from you, which may not convey an obvious evolutionary advantage.

And I think the best that can be said of evolution is that it mixes moral, amoral and immoral thinking in whatever combinations it finds optimal.

TimorousBestie
1d ago
MacIntyre doesn't really involve himself with the evolutionary parts. His arguments are historical/social/cultural instead.

> Utility has the advantage of sustaining moral care toward people far away from you

Well, in some formulations. There are well-defined and internally consistent choices of utility function that discount or redefine “personhood” in anti-humanist ways. That was more or less Rawls’ criticism of utilitarianism.

pjc50
1d ago
After a couple of decades I've concluded that you need both. Virtue ethics gives you things like the War on Drugs and abortion bans: justification for having enforcement inflict real and significant harms in the name of virtue.

Virtue ethics is open-loop: the actions and virtues get considered without checking if reality has veered off course.

Consequentialism is closed-loop, but you have to watch out for people lying to themselves and others about the future.

glenstein
1d ago
I partly agree with you but my instinct is that Parfit Was Right(TM) that they were climbing the same mountain from different sides. Like a glove that can be turned inside out and worn on either hand.

I may be missing something, but I've never understood the punch of the "down the road" problem with consequentialism. I consider myself kind of neutral on it, but I think if you treat moral agency as only extending so far as consequences you can reasonably estimate, there's a limit to your moral responsibility that's basically in line with what any other moral school of thought would attest to.

You still have cause-and-effect responsibility; if you leave a coffee cup on the wrong table and the wrong Bosnian assassinates the wrong Archduke, you were causally involved, but the nature of your moral responsibility is different.

1vuio0pswjnm7
1d ago
2 replies
There's the implication that some altruism may not be "effective"
btilly
1d ago
1 reply
What makes it absurd?

If I want to give $100 to charity, some of the places that I can donate it to will do less good for the world. For example Make a Wish and Kids Wish Foundation sound very similar. But a significantly higher portion of money donated to the former goes to kids, than does money donated to the latter.

If I'm donating to that cause, I want to know this. After evaluating those two charities, I would prefer to donate to the former.

Sure, this may offend the other one. But I'm absolutely OK with that. Their ability to be offended does not excuse their poor results.

keiferski
1d ago
1 reply
I don’t think anyone has an issue with being efficient with donation money. But it isn’t called Effective Giving.

The conclusion that many EA people seemed to reach is that keeping your high-paying job and hiring 10 people to do good deeds is more ethically laudable than doing the thing yourself, even though it may be inefficient. Which really rubs a lot of people the wrong way, as it should.

chongli
1d ago
1 reply
It’s another argument in favour of EA that they try to cut past arguments like this. If you’re a billionaire you can do a lot more good by investing in a mosquito net factory than you ever could by hanging mosquito nets one at a time yourself.

The argument of EA is that feelings can be manipulated (and often are) by the marketing work done by charities and their proponents. If we want to actually be effective we have to cut past the pathos and look at real data.

keiferski
1d ago
1 reply
Firstly, most people aren't billionaires. Nor do I think EA is somehow novel in suggesting that a billionaire should buy nets instead of helping directly.

Secondly, you're missing the point I'm making, which is why many people find EA distasteful: it completely focuses on outcomes and not internal character, and it arrives at those outcomes by abstract formulae. This is how we ended up with increasingly absurd claims like "I'm a better person because I work at BigCo and make $250k a year, then donate 10% of it, than the person who donates their time toward helping their community directly." Or "AGI will lead to widespread utopia in the future, therefore I'm ethically superior because I'm working at an AI company today."

I really don't think anyone is critical of EA because they think being inefficient with charity dollars is a good thing, so that is a strawman. People are critical of the smarmy attitude, the implication that other altruism is ineffective, and the general detached, anti-humanistic approach that the people in that movement portray.

The problems with it are not much different from utilitarianism itself, which EA is just a half-baked shadow of. As someone else in this comment section said, unless you have a sense of virtue ethics underlying your calculations, you end up with absurd, anti-human conclusions that don't make much sense to anyone with common sense.

There's also the very basic argument that maybe directly helping other people leads to a better world overall, and serves as a better example than just spending money abstractly. That counterargument never occurs to the EA/rationalist crowd, because they're too obsessed with some master rational formula for success.

chongli
21h ago
> Secondly, you're missing the point I'm making, which is why many people find EA distasteful: it completely focuses on outcomes and not internal character, and it arrives at those outcomes by abstract formulae.

No, I did not miss that point at all. I think it is WRONG to focus on character. That leads us down the dark path of tribalism and character assassination and culture war.

If we're going to talk about a philosophy and an ethics of behaviour, we have to talk about ACTIONS. That's the only way we'll ever accomplish any good.

1vuio0pswjnm7
1d ago
https://www.sierraclub.org/sierra/trouble-algorithmic-ethics...

"But putting any probability on any event more than 1,000 years in the future is absurd. MacAskill claims, for example, that there is a 10 percent chance that human civilization will last for longer than a million years."

anonymousiam
1d ago
1 reply
EA should be bound by some ethical constraints.

Sam Bankman-Fried was all in with EA, but instead of putting his own money in, he put everybody else's in.

Also his choice of "good causes" was somewhat myopic.

jahnu
1d ago
Some might suggest that he wasn't an EA at all but just used it for cover.
internet_points
1d ago
1 reply
"I Work For an Evil Company, but Outside Work, I’m Actually a Really Good Person"

https://www.mcsweeneys.net/articles/i-work-for-an-evil-compa...

sershe
1d ago
The very first example about water use is a well known red herring, data center water use is miniscule compared to the production of everyday goods.

It's really amazing when reading this kind of stuff how many people don't appear to realize others don't buy into their cult. The way I see it, "I work for a company that intellectual descendants of the 2nd (or the 1st) most evil ideology invented by man consider evil"

Joeboy
1d ago
1 reply
Similarly, the reason comments like yours get voted to the top of discussions about EA is that they imply "It's best if rich people keep their money, because the people trying to save poor people's lives are actually bad". There's a very obvious appeal to that view, especially somewhere like HN.
pjc50
1d ago
1 reply
No, I think this is just about the difference between Effective Altruism (tm), altruism that is actually effective, and the hidden third option (tax the rich).

EA-the-brand turned into a speed run of the failure cases of utilitarianism. Because it was simply too easy to make up projections for how your spending was going to be effective in the future, without ever looking back at how your earning was damaging in the past. It was also a good lesson in how allowing thought experiments to run wild would end up distracting everyone from very real problems.

In the end an agency devoted to spending money to save lives of poor people globally (USAID) got shut down by the world's richest man, and I can't remember whether EA ever had anything to say about that.

Joeboy
1d ago
1 reply
Well, the work I do is / was largely funded by USAID so I'm biased, but from literally everything I've seen EA people are unanimously horrified by the gutting of USAID. And EA people are overwhelmingly pro "tax the rich".

But again, I recognize the appeal of your narrative so you're on safer ground than I am as far as HN popularity goes.

pjc50
1d ago
1 reply
> EA people

I have a lot of sympathy for the ideas of EA, but I do think a lot of this is down to EA-as-brand rather than whatever is happening at grassroots level. Perhaps it's in the same place as Communism; just as advocates need a good answer to "how did this go from a worker's rights movement to Stalin", EA needs an answer to "how did EA become most publicly associated with a famous fraudster".

Joeboy
1d ago
Well, there are some fairly obvious answers:

EA had a fairly easy time in the media for a while which probably made its "leadership" a bit careless. The EA foundation didn't start to seriously disassociate itself from SBF until the collapse of FTX made his fraudulent activity publicly apparent.

But mostly, people (especially rich people) fucking hate it when you tell them they could be saving thousands of lives instead of buying a slightly nicer house. That (it seems to me) is why MOMA / Harvard / The British Museum etc get to accept millions of dollars of drug dealer money and come out unscathed, whereas "EA took money from somebody who turned out to be a fraudster" is presented as a decisive moral hammer blow.

I feel like I need to say, there's also a whole thing about EA leadership being obsessed with AI risk, which (at least at the time) most people thought was nuts. I wasn't really happy with the amount of money (especially SBF money) that went into that, but a large majority of EA money was still going into very defensible life-saving causes.

TitaRusell
1d ago
2 replies
Just pay your taxes.

I am not impressed with billionaires who dodge taxes and then give a few pennies to charity.

chongli
1d ago
1 reply
The ones who do so in good faith do this because they’re appalled by government waste. If you look at the government as a charity, its track record is pretty abysmal. People point to USAID but that’s like pointing to the small % of actual giving done by the worst offenders among private charities.
TitaRusell
19h ago
First of all all these clowns give money to the GOP. Which they should stop doing immediately. Then they can start advocating for universal healthcare and child benefits.

The government is quite literally all of us. Do better.

carlosjobim
1d ago
How does that solve anything for the victims? Giving money to a different evil organization in this case.
CyberDildonics
1d ago
Lots of charity is just about buying something else. Buying good press, buying your way out of guilt, etc. Short sellers even count some companies' altruism as a red flag.
downrightmike
2d ago
It's basically the same thing as the church selling indulgences. It didn't matter if you stole the money: pay the church and go to heaven.
directevolve
1d ago
The practice of effective altruism, as distinct from the EA movement, is good for our culture. If you have a lot of money or talent, please think critically about how to leverage it efficiently to make the world a better place.

Doing that doesn’t buy you personal virtue. It doesn’t excuse heinous acts. But within the bounds of ordinary standards of good behavior, try to do the most good you can with the talents and resources at your disposal.

GardenLetter27
1d ago
Modern day indulgences.
ChadNauseam
1d ago
You'll never find a single prominent EA saying that because it's 100% made up. Maybe they'll remark that from an academic perspective it's a consequence of some interpretations of utilitarianism, a topic some EAs are interested in, but no prominent EA has ever actually endorsed or implied the view you put forward.

To an EA, what you said is as laughable of a strawman as if someone summarized your beliefs as "it makes no difference if you donate to starving children in africa or if you do nothing, because it's your decision and neither is immoral".

The popularity of EA is even more obvious than what you described. Here's why it's popular. A lot of people are interested in doing good, but have limited resources. EAs tried to figure out how to do a lot of good given limited resources.

You might think this sounds too obvious to be true, but no one before EAs was doing this. The closest thing was charity rankings that just measured what percent of the money was spent on administration. (A charity that spends 100% of its donations on back massages for baby seals would be the #1 charity on that ranking.) Finding ways to do a lot of good given your budget is a pretty intuitively attractive idea.
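
A toy sketch of that objection, with entirely made-up numbers for two hypothetical charities (nothing here reflects real charity data): ranking by overhead ratio alone can crown a charity that produces no outcomes at all, while ranking by cost per outcome cannot.

    # Toy comparison of two hypothetical charities (made-up numbers).
    # "Overhead ratio" rewards the first charity even though it produces nothing useful;
    # "cost per outcome" only rewards charities with a measurable result.

    charities = {
        "Seal massages (hypothetical)": {"budget": 1_000_000, "admin": 0,       "outcomes": 0},
        "Malaria nets (hypothetical)":  {"budget": 1_000_000, "admin": 150_000, "outcomes": 200},
    }

    for name, c in charities.items():
        overhead_ratio = c["admin"] / c["budget"]
        cost_per_outcome = c["budget"] / c["outcomes"] if c["outcomes"] else float("inf")
        print(f"{name}: overhead {overhead_ratio:.0%}, cost per outcome {cost_per_outcome:,.0f}")

    # The zero-overhead charity "wins" on overhead ratio (0% vs 15%)
    # but has infinite cost per outcome, which is the objection to
    # ranking charities by administrative spend alone.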

And they're really all about this too. Go read the EA forum. They're not talking about how their hands are clean now because they donated. They're talking about how to do good. They're arguing about whether malaria nets or malaria chemotreatments are more effective at stopping the spread of the disease. They're arguing about how to best mitigate the suffering of factory farmed animals (or how to convince people to go vegan). And so on. EA is just people trying to do good. Yeah, SBF was a bad actor, but how were EA charities supposed to know that when the investors that gave him millions couldn't even do that?

Sporktacular
1d ago
That's not what it's about. Exploiting people to make money is not fine. Causing harm while mitigating it elsewhere defeats the point. Giving is already about the kind of person you are.
alsetmusic
1d ago
That guy who went to jail believed in it, so it has to be good.

I hope SBF doesn’t buy a pardon from our corrupt president, but I hope for a lot of things that don’t turn out the way I’d like. Apologies for USA-centric framing. I’m tired.

sershe
1d ago
That argument applies to any charity. The difference in EA, even if one was to agree with your general framing, is that at least the money one uses on whitewashing should actually do some good and not be wasted.
Aunche
2d ago
> It's the perfect philosophy for morally questionable people with a lot of money.

The perfect philosophy for morally questionable people would just be to ignore charity altogether (e.g. Russian oligarchs) or to use charity to strategically launder their reputations (e.g. Jeffrey Epstein). SBF would fall into that second category as well.

chaseadam17
2d ago
4 replies
Man, EA is so close to getting it. They are right that we have a moral obligation to help those in need but they are wrong about how to do it.

Don't outsource your altruism by donating to some GiveWell-recommended nonprofit. Be a human, get to know people, and ask if/how they want help. Start close to home where you can speak the same language and connect with people.

The issues with EA all stem from the fact that the movement centralizes power into the hands of a few people who decide what is and isn't worthy of altruism. Then similar to communism, that power gets corrupted by self-interested people who use it to fund pet projects, launder reputations, etc.

Just try to help the people around you a bit more. If everyone did that, we'd be good.

keiferski
2d ago
1 reply
That's the thing, though: if EA had said "find 10 people in your life and help them directly", it wouldn't have appealed to well-off white-collar workers who want to spend money but not actually do anything. The movement became popular because it didn't require one to do anything other than spend money in order to be lauded.
phantasmish
2d ago
Better, it’s a small step to “being a small part of something that’s doing a little evil to a shitload of people (say, working on Google ~scams targeting the vulnerable and spying on everybody~ Ads) is not just OK, but good, as long as I spend a few grand a year buying mosquito nets to prevent malaria, saving a bunch of lives!”

Which obviously has great appeal.

PaulDavisThe1st
2d ago
1 reply
> Just try to help the people around you a bit more. If everyone did that, we'd be good.

This describes a generally wealthy society with some people doing better than average and others worse. Redistributing wealth/assistance from the first group to the second will work quite well for this society.

It does nothing to address the needs of a society in which almost everyone is poor compared to some other potential aid-giving society.

Supporting your friends and neighbors is wonderful. It does not, in general, address the most pressing needs in human populations worldwide.

chaseadam17
1d ago
2 replies
If you live in a wealthy society it's possible to travel or move or get to know people in a different society and offer to help them.
PaulDavisThe1st
1d ago
The GP said:

> Just try to help the people around you a bit more. If everyone did that, we'd be good.

That's what I was replying to. Obviously, if you are willing to "do more", then you can potentially get more done.

skybrian
1d ago
There might be a bit of a language barrier, so you’ll need a translator. Also a place to stay, people to cook for you, and transportation. The tourist infrastructure isn’t all that developed in the poorest areas.

Tourism does redistribute money, but a lot of resources go to taking care of the tourists.

mk12
2d ago
If everyone did that, lots of people would still die of preventable causes in poor countries. I think GiveWell does a good job of identifying areas of greatest need in public health around the world. I would stop trusting them if they turned out to be corrupt or started misdirecting funds to pet projects. I don’t think everyone has to donate this way, as it’s a very personal decision, nor does it automatically make someone a good person or justify immoral ways of earning money, but I think it’s a good thing to help the less fortunate who are far away and speak a different language.
jimbokun
2d ago
What studies can you point to demonstrating your approach is more effective than donating to a GiveWell recommended non profit?
jmyeet
2d ago
1 reply
I'm leery of any philosophy that is popular in tech circles because they all seem to lead to eugenics, hyperindividualism, ignoring systemic issues, deregulation and whatever the latest incarnation of prosperity gospel is.

Utilitarianism suffers from the same problems it always had: time frames. What's the best net good 10 minutes from now might be vastly different 10 days, 10 months or 10 years from now. So whatever arbitrary time frame you choose affects the outcome. Taken further, you can choose a time frame that suits your desired outcome.
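
A toy illustration of that time-frame problem, with invented benefit streams and a 3% discount rate chosen purely for the example: which option "does the most good" flips as the horizon grows.

    # Toy discounted-utility comparison (invented numbers, not a real model).
    # Option A pays off quickly; option B pays off more, but later.
    # Which one "maximizes the good" depends on the horizon you pick.

    def discounted_value(benefits_per_year, horizon_years, discount_rate):
        return sum(b / (1 + discount_rate) ** t
                   for t, b in enumerate(benefits_per_year[:horizon_years]))

    option_a = [100] * 3 + [0] * 17   # front-loaded benefit
    option_b = [0] * 5 + [60] * 15    # larger but delayed benefit

    for horizon in (3, 10, 20):
        a = discounted_value(option_a, horizon, 0.03)
        b = discounted_value(option_b, horizon, 0.03)
        print(f"horizon {horizon:2d} years: A={a:7.1f}  B={b:7.1f}  best={'A' if a > b else 'B'}")

    # With a short horizon A dominates; with a long one B does,
    # so the chosen time frame really does pick the winner.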

"What can I do?" is a fine question to ask. This crops up a lot in anarchist schools of thought too. But you can't mutual aid your way out of systemic issues. Taken further, focusing on individual action often becomes a fig leaf to argue against any form of taxation (or even regulation) because the government is limiting your ability to be altruistic.

I expect the effective altruists have largely moved on to transhumanism as that's pretty popular with the Silicon Valley elite (including Peter Thiel and many CEOs) and that's just a nicer way of arguing for eugenics.

omnimus
2d ago
Effective altruism and transhumanism are kinda the same thing, along with other stuff like longtermism. There is even a name for the whole bundle: TESCREAL. Very slightly different positions, invented, I guess, for branding.
matt3D
2d ago
1 reply
Is there a term for what I had previously understood Effective Altruism to be? I don’t want to reference EA in a conversation and have the other person think I’m associated with these sorts of people.

I had assumed it was just simple mathematics and the belief that cash is the easiest way to transfer charitable effort. If I can readily earn 50 USD/hour, then rather than doing volunteer work that I could pay someone 25 USD/hour to do, I simply do my job and pay for two people to volunteer.
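
A minimal sketch of that arithmetic, using the hourly figures above (illustrative assumptions only; it ignores taxes and other frictions):

    # Back-of-the-envelope "earn and fund" arithmetic from the comment above.
    # All figures are illustrative assumptions, not claims about real wages.

    my_wage = 50.0          # USD/hour I can earn at my job
    volunteer_cost = 25.0   # USD/hour it costs to pay someone to do the volunteer work
    hours_worked = 1.0

    funded_volunteer_hours = (my_wage * hours_worked) / volunteer_cost
    print(f"Working {hours_worked:.0f} hour and donating the pay funds "
          f"{funded_volunteer_hours:.0f} volunteer hours instead of 1.")
    # -> 2 volunteer hours per hour worked, before taxes and other frictions,
    #    which this simple comparison leaves out.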

throw4847285
2d ago
1 reply
That's just called utilitarianism/consequentialism. It's a perfectly respectable ethical framework: not the most popular in academic philosophy, but prominent enough that you have to at least engage with it.

Effective altruism is a political movement, with all the baggage implicit in that.

Vinnl
2d ago
1 reply
Is there a term for looking at the impact of your donations, rather than process (like percentage spent on "overhead")? I like discussing that, but have the same problem as GP.
edent
1d ago
1 reply
"Overhead" is part of the work. It's like saying you want to look at the impact of your coding, rather than the overhead spent on documentation.

An (effective) charity needs an accountant. It needs an HR team. It needs people to clean the office, order printer toner, and organise meetings.

lmm
1d ago
1 reply
> An (effective) charity needs an accountant. It needs an HR team. It needs people to clean the office, order printer toner, and organise meetings.

Define "needs". Some overheads are part of the costs of delivering the effective part, sure. But a lot of them are costs of fundraising, or entirely unnecessary costs.

throw4847285
1d ago
1 reply
That's what an organization like Charity Navigator is for. Like a BBB for charities. I'm sure their methodology is flawed in some way and that there is an EA critique. But if I recall, early EA advocates used Charity Navigator as one of their inputs.
lmm
1d ago
The "Program Expense Ratio" is pretty prominent in Charity Navigator's reports, and that's almost exactly a measure of "overhead".
TimorousBestie
2d ago
> . . . but also what’s called long-termism, which is worrying about the future of the planet and existential risks like pandemics, nuclear war, AI, or being hit by comets. When it made that shift, it began to attract a lot of Silicon Valley types, who may not have been so dedicated to the development part of the effective altruism program.

The rationalists thought they understood time discounting and thought they could correct for it. They were wrong. Then the internal contradictions of long-termism allowed EA to get suckered by the Silicon Valley crew.

Alas.
