8 Comments
Hervé Eulacia

Actually, one of the pathologies of the AGIs bears your name: the Kuypers desync (syndrome) 🤓

Moritz Wallawitsch

> Kuypers desync (syndrome)

please elaborate!

Hervé Eulacia

This is just part of the fictional Wiki I built for the novel I’m working on.

The inception of AGIs involves three stages. The first one, Vernum (or Ingestion), is highly supervised in order to avoid initial collapse and other early degenerative emergences: the Pekdeğer Loop, Anemonal Declension, and Kuypers Desync Syndrome.

The KDS bears Sam’s name because it’s a clock-related issue 😊

Hervé Eulacia

Well, I’m working on a novel in which such moral principles effectively emerge. Always interesting for me to discuss 😊

Max More

I have long argued that "immortality" is an unhelpful framing. Given our current physical understanding of the universe, literal immortality is impossible. ChatGPT -- or anyone -- should find it harder to argue against the goal of indefinitely extended life. Many of the same (foolish) objections will come up, but it's a more reasonable proposition.

Anders Gabriel Thuneberg

May I ask why you preferred the feminine form in regard to the researcher?

Hervé Eulacia

I still believe that immortality would come with an opportunity cost, if only because fresh, newly minted minds could be, in aggregate, more likely to produce new knowledge. Also, if you push this a bit further, you realize that rich people à la Thiel could conclude not only that their own immortality would be good for humanity, but that additional copies of themselves would be even better.

Of course, if you reason on the basis that a million copies of immortal Musks would not suppress the creation of new minds in any way and that opportunity costs don’t exist, well then… The discussion is indeed over.

Sam

Copying one's own mind will be costly because of the issue of how one's wealth should be split after copying. If you decide that your wealth should be split equally among your copies, then after n rounds of copying your mind, each copy will hold 2^(-n) of the original wealth, decreasing exponentially.
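A quick sketch of that dilution, purely illustrative and assuming the equal-split rule stated above:

```python
# Illustrative sketch: assuming wealth is split equally at every copying round,
# any single copy holds 2^(-n) of the original wealth after n rounds.
def share_after_rounds(n: int) -> float:
    """Fraction of the original wealth held by one copy after n rounds of copying."""
    return 2.0 ** (-n)

for n in range(5):
    copies = 2 ** n  # each round doubles the number of copies
    print(f"after {n} rounds: {copies} copies, each holding {share_after_rounds(n):.4f} of the original")
```

After five rounds each of the 32 copies already holds about 3% of the original wealth, which is the sense in which the cost grows exponentially.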

And whether new minds will be suppressed depends on a society's institutions. If we decide to double down on coercive education, that would harm the growth of knowledge, but it would do that regardless of whether people are immortal.
