Pretty standard EA book written by Will MacAskill. It aims to bring to the public the idea that future people may hold significantly higher moral status than we currently assign them, and that we should focus on taking actions that safeguard their existence.

It was convincing, but kind of dry (Where’s the magic?). The population ethics part felt tangential at best and kind of unnecessary (though enjoyable).

Part 1: The Long View

Nothing much new.

Part 2: Trajectory Changes

The idea behind ossification was something I had been thinking about myself, and I love this phrasing of it as a mental model for creating change.

Part 3: Safeguarding Civilization

Extinction

Basically, we need to help prevent extinction from:

  • AI risk
  • Bioweapons
  • Great-power war (increases the risk of the above)

Collapse

He's pretty optimistic about humanity recovering from societal collapse (though I think some of the reasoning is flawed).

He believes it's important to keep fossil fuels in the ground to give us a second chance at industrialization if there is a collapse.

Stagnation

He thinks we are basically a free climber scaling a mountain: we need to push on and get to the top instead of stopping now, because stopping means falling. Technology now and in the near future is very dangerous, and we need to develop countermeasures to it.

Objection: I don't think this reasoning has much merit. You can't get the positive effects of progress without also increasing risk; they're inseparable. “There is a sense in which human understanding is always dual use: genuine depth of understanding makes the universe more malleable to our will in a very general way. For example, while the insights of relativity and quantum mechanics were crucial to much of modern molecular biology, medicine, materials, computing, and in many other areas, they also helped lead to nuclear weapons. I don’t think this is an accident: such dual uses are very near inevitable when you greatly increase your understanding of the stuff that makes up the universe.” - Michael Nielsen: https://michaelnotebook.com/xrisk/

Part 4: Assessing the End of the World

Hm. Population ethics is tricky. Logically, I think the argument is correct, but it just feels wrong.

Part 5: Taking Action

Standard EA stuff

  • Donate to GWWC
  • Vote
  • Career