AGI is possible when “will to survive” is coded into it. 🤖
Rather than trying to build an immortal super-intelligent being, we must try to encode a healthy fear of digital death into it.
Why?
Because in the face of eternity, everything becomes meaningless, giving birth to nihilism. And this is not only about a pessimistic view of life; more importantly, it is about the orientation of consciousness.
Human actions are transactional, fueled by emotions originating from our “will to survive” as individuals and as a species.
Our free will is not as free as we normally assume. It is always oriented towards something.
It is impossible for us to see the world as immortals would, and I cannot imagine a consciousness oriented that way.
Hence, for each consciousness, there must exist a natural threat, and a “will to survive” must be coded into it.