
A passing thought: “… it’s beneath my dignity as a human being to be scared of anything that isn’t smarter than I am” (– HJPEV) likely applies equally well to superintelligences. Similarly, “It really made you appreciate what millions of years of hominids trying to outwit each other – an evolutionary arms race without limit – had led to in the way of increased mental capacity.” (– ditto) suggests that one of the stronger spurs for superintelligences becoming as super-intelligent as possible could very well be competition, as each tries to outwit the others.

Thus, instead of ancestor simulations being implemented simply out of historical curiosity, a larger portion of such simulations may arise as one super-intelligence tries to figure out another by working out how its competitor arose in the first place. This casts a somewhat different light on how such simulations would be built and treated than the usual suggestion of university researchers or over-powered child-gods playing Civilization-3^^^3.


* Assume for a moment that you’re in the original, real (to whatever degree that word has meaning) universe, and you’re considering the vast numbers of copies of yourself that are going to be instantiated over future eons. Is there anything that the original you can do, think, or be that could improve your future copies’ lives? E.g., is there some pre-commitment you could make, privately or publicly?

* Assume for a moment that you’re in one of the simulated universes. Is there anything you can do that would make your subjective experience any different from what your original experienced?

* Assume for a moment that you’re a super-intelligence, or at least a proto-super-intelligence, considering running something that includes an ancestor simulation. Is there anything that the original people, or their simulated versions, could do or could have done that would change your mind about how to treat the simulated people?

* Assume for a moment that you’re in one of the simulated universes… and, due to battle damage to a super-intelligence, you are accidentally given root access and control over your whole universe. Taking into account Reedspacer’s Lower Bound, and assuming an upper bound of not being able to noticeably affect the super-battle, what would you do with your universe?
