This is a massive pet peeve of mine. Some of the smartest people I know have this idea that once we “solve alignment”, we’ll proceed to live in a perfect utopia of human flourishing.

My biggest quibble with this is that advancements in knowledge are almost always dual use. I think it's highly unlikely that there's a summit to reach where everything will be perfect forever. As we gain more knowledge and understanding, we increase our general capability to influence the world, and I don't think you can reliably separate the positive from the negative. Yes, we will be more likely to find cures for terrible diseases, reverse aging, perhaps even increase global equality. But that comes part and parcel with increased warfare capabilities, bioweapon risks, and the like. The most you can do is try to suppress the negative and amplify the positive as much as possible.

I think this is actually what most updates me toward pessimism about the extremely long-term future of humanity. I see technological progress as continuing to build on top of a very weak foundation (tribalism, fear of the unknown/outsiders, quickness to violence) selected by evolution. As our ability to negatively affect the world grows over time, it seems almost inevitable that things eventually go badly. In fact, in a weird way, taking to the stars might be the best thing for the survival of humanity: the more civilizations we can create, and the greater the distance between them, the better the chances of survival seem to be. Even here, though, there's a high risk of squabbling over territory, fights over who gets to go to space first, and so on.