Scope Insensitivity

to-process

Metadata

Page Notes

Highlights

  • Once upon a time, three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88.[1] This is scope insensitivity or scope neglect: the number of birds saved—the scope of the altruistic action—had little effect on willingness to pay.—Updated on 2024-02-19 15:43:03—Group:Public
  • The usual finding is that exponential increases in scope create linear increases in willingness-to-pay.—Updated on 2024-02-19 15:45:23—Group:Public
  • The moral: If you want to be an effective altruist, you have to think it through with the part of your brain that processes those unexciting inky zeroes on paper, not just the part that gets real worked up about that poor struggling oil-soaked bird.—Updated on 2024-02-19 15:51:02—Group:Public
    • Annotation: This looks like good evidence for delegating moral decisions to System 2. Scope insensitivity would distort my intuitions about what is right and wrong, whereas explicit systems (e.g. utilitarianism) rely on calculation. #to-process
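
The log-linear pattern described above ("exponential increases in scope create linear increases in willingness-to-pay") can be checked with a quick least-squares fit. This is a sketch, assuming the commonly cited figures of $80 / $78 / $88 from the study for the three group sizes:

```python
import math

# Scope of the altruistic action (birds saved) and the assumed
# willingness-to-pay figures ($80 / $78 / $88) from the cited study.
scopes = [2_000, 20_000, 200_000]
wtp = [80.0, 78.0, 88.0]

# "Linear in exponential scope" means WTP is roughly linear in
# log(scope): WTP ~ a + b * log10(scope). Fit a and b by ordinary
# least squares on the three data points.
xs = [math.log10(n) for n in scopes]
mean_x = sum(xs) / len(xs)
mean_y = sum(wtp) / len(wtp)
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, wtp)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

print(f"WTP ~ {a:.1f} + {b:.1f} * log10(birds saved)")
```

With these numbers the fitted slope is about $4 per tenfold increase in scope, so multiplying the number of birds by 100 adds only around $8 to the predicted payment, which is exactly the insensitivity the highlight describes.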
