Discussion about this post

Doina

Wholeheartedly agree with you that "they’re wonderfully admirable principles, and I wish more people found them as inspiring as I do".

Kenny Easwaran

I wasn't aware of this Regan idea! I like it, and find it interestingly similar to Functional Decision Theory (https://arxiv.org/abs/1710.05060): what is rational to do is the thing such that, if your decision algorithm were to output that thing in all places where it's instantiated, the outcome would be best. Your algorithm should cooperate in Prisoner's Dilemmas against your twins and other time slices, and one-box in Newcomb problems where the predictor is using your algorithm to predict (but not in medical Newcomb problems, where your algorithm is downstream of the common cause correlating some behavior with cancer).
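The twin Prisoner's Dilemma version of this reasoning can be sketched in a few lines. This is a minimal illustration, not anything from the FDT paper itself: the payoff numbers are the standard illustrative ones, and the key assumption is that both players run the same algorithm, so whatever it outputs is played at both instantiations.

```python
# Twin Prisoner's Dilemma: both players run the SAME decision algorithm,
# so any output it produces is produced by both instances at once.
# Payoffs below are the usual illustrative numbers (row player's view).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def fdt_choice(moves=("C", "D")):
    """Pick the output that is best *given that every instantiation of
    the algorithm outputs it* -- here, both twins play the same move."""
    return max(moves, key=lambda m: PAYOFF[(m, m)])

print(fdt_choice())  # prints "C": mutual cooperation pays 3 vs. 1
```

A causal decision theorist, treating the twin's move as fixed, would compare `PAYOFF[("D", x)]` against `PAYOFF[("C", x)]` for each fixed `x` and defect; the FDT-style agent only ranks the outcomes its algorithm can actually bring about.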

The idea is definitely good enough as a rough characterization, but I think it runs into problems with the fact that there just isn't any precise set of cooperators (or of places where your decision algorithm is instantiated). Probably some people will cooperate with some plans and not others, and you should take into account how much good could come of each of those different amounts of cooperation.

Also, it's not always clear that marginal returns diminish. If you put a lot of resources into vaccination against polio, or measles, or smallpox, there's definitely some diminution of return for a while. But at a certain point, you can actually eliminate the relevant virus, and at that point there's a huge jump in returns! It's much better to fully eradicate one of these viruses than to put half the effort into eradicating one and half into eradicating the other.
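The eradication point can be made concrete with a toy payoff function: returns on effort diminish (square-root shaped here) until coverage is complete, at which point elimination yields a one-time bonus. The functional form and all the numbers are illustrative assumptions, not from any real epidemiological model.

```python
import math

ERADICATION_BONUS = 100.0  # assumed one-time payoff for full elimination

def value(effort):
    """Good done by spending `effort` in [0, 1] against one virus."""
    base = 10.0 * math.sqrt(effort)  # diminishing returns on the way up
    bonus = ERADICATION_BONUS if effort >= 1.0 else 0.0
    return base + bonus

# Half the work on each of two viruses vs. all of it on one:
split = value(0.5) + value(0.5)    # no eradication bonus anywhere
focused = value(1.0) + value(0.0)  # one virus fully eradicated

print(f"split effort:   {split:.2f}")    # ~14.14
print(f"focused effort: {focused:.2f}")  # 110.00
```

With a discontinuous jump like this, marginal reasoning near the half-way point badly understates the value of pushing one cause over the finish line.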

Interestingly, if you have a totally fungible supply of resources, and a bunch of orthogonal causes where returns always diminish such that the amount of resources it takes to do a given amount of good in each cause grows precisely quadratically, then the optimal distribution of resources among those causes *is* in fact to divide the resources in proportion to the marginal good done by applying resources at the current state. (But I assume that Max's case doesn't fit these stipulations.)
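Under one concrete reading of the quadratic stipulation — producing good g in cause i costs c_i·g² resources, so the good from r resources is sqrt(r/c_i) — the optimum can be computed from the Lagrange condition that marginal good per unit resource be equal across causes. The cost coefficients and budget below are made-up numbers for illustration.

```python
import math

COSTS = [1.0, 2.0, 4.0]  # assumed c_i for three orthogonal causes
BUDGET = 7.0

def total_good(alloc):
    """Total good under cost model r = c_i * g**2, i.e. g = sqrt(r/c_i)."""
    return sum(math.sqrt(r / c) for r, c in zip(alloc, COSTS))

# Lagrange condition: marginal good 1/(2*sqrt(c_i*r_i)) is equal across
# causes, so c_i * r_i is constant and r_i is proportional to 1/c_i.
weights = [1.0 / c for c in COSTS]
optimal = [BUDGET * w / sum(weights) for w in weights]

print([round(r, 3) for r in optimal])      # [4.0, 2.0, 1.0]
print(round(total_good(optimal), 3))       # 3.5
```

Note that the split is interior — every cause gets something — which is exactly the behavior that disappears once returns stop diminishing, as in the eradication case above.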

