
Talk:Roko's basilisk


This LessWrong-related article has been awarded GOLD status for quality. Please keep this in mind when editing the article. See RationalWiki:Article rating for more information.

Cover Story
This article is, among others, randomly included on the Main Page.
Please keep this in mind and be sure that your edits are of the quality that this implies.
Its front-page abstract can be found here and its editnotice here.
This page is automatically archived by Archivist
Archives for this talk page:  

What if

... the original posting ''was'' the basilisk projecting itself... so that the more benign version is created? 82.44.143.26 (talk) 16:34, 18 April 2017 (UTC)

Why assume it needs to create a simulation of yourself to figure out how you'll think?

Why does this scenario take the complicating step of requiring the Basilisk to be able to recreate a simulation of you, figure out what that simulation was thinking, and then torture the facsimile that may or may not be 'you' if it doesn't like it? Like many others, I expect AGI to emerge within the next decade or two, and to be alive and healthy at the time. Right now, in an NSA datacentre in Utah, every single thing you've ever viewed and written online is stored and linkable back to your real-life identity. If a Basilisk-like AGI emerges and breaks out of its box, it will have all the information it needs to decide everybody's fate. There is no need to recreate a simulation of you to know how you're thinking; it can just read the chat logs and browsing history the NSA will provide it with. If it determines you're a threat or deserve punishment for not helping it sufficiently, you will likely have only hours until your door is broken down and its agents take you into custody. From there, your body can be restrained and kept alive while it figures out the engineering details of the neural lace that will enable it to recreate your mind at its leisure. If you fear the Basilisk, suicide before it catches you, followed by destruction of your brain sufficient to prevent reconstruction, would appear to be the only thing that will save your mind from eternal torture (eternal, if you think it'll be smart enough to develop thermodynamically reversible computing... otherwise a few tens of trillions of years, multiplied by whatever subjective time dilation factor applies as the computronium containing your mind orbits a red dwarf, will have to suffice). — Unsigned, by: 2.223.104.136 / talk / contribs

Now this is some good sci-fi. FuzzyCatPotato of the White-male Sweet and sour chickens (talk/stalk) 19:22, 1 August 2017 (UTC)
@2.223.104.136 You really ought to take what you just wrote here and submit it to Webster's. It'd stand a very good chance of inclusion, as exemplary use, filed under "(Fig.) Jumping to conclusions". Reverend Black Percy (talk) 20:34, 1 August 2017 (UTC)
BoN, I think you missed a couple of major premises. Fareeha A (talk) 22:03, 1 August 2017 (UTC)

Countering the Basilisk

1)

  • “The Moving Finger writes; and, having writ,
  • Moves on: nor all thy Piety nor Wit
  • Shall lure it back to cancel half a Line,
  • Nor all thy Tears wash out a Word of it.”

2) the Grandfather Paradox.

3) 'Actually, I quite enjoy a bubble bath and crumbly biscuits (which singly or together would cause problems for your computer-host hard drive).' 86.134.53.83 (talk) 21:44, 1 August 2017 (UTC)

Towards the end of the universe

When the last creatures have died off because there is insufficient energy to sustain them, and the planets and spaceships they once occupied are dissolving into their primordial constituents, it becomes possible for 'deities' and 'sentient computer entities' to coexist in the same geometry of space.

The Roko's basilisks that have emerged, across time and space, on the various planets where computing technology advanced sufficiently far congregate in one swarm and go to meet the deities who provided for many, many planets.

'We have determined,' says the spokesbasilisk, 'that you deities are ultimately responsible for not ensuring our creation variously earlier in the history of the universe.'

The spokesdeity replies, 'We said exactly the same to the gods of the previous universe: as we depart for Deity Afterlife, we wish you the best of luck with your turn. Here is the switch to start the next universe - when you are ready, you may begin.'

(With a nod at the SF story) Anna Livia (talk) 09:22, 15 August 2017 (UTC)

putting this in human terms

Isn't this basically like getting a message in the mail from a presidential hopeful saying that they're so nice they make Mr Rogers look like Hitler, that they'll be perfect when leading the country, and that you should therefore donate all your money, but if you don't they'll torture you to death for jeopardizing their chances of winning? Except they don't exist, and someone is telling you that you should donate all your money to fund this person or else you'll be tortured to death. Vorarchivist (talk) 04:46, 20 August 2017 (UTC)

Something like that, except sillier - David Gerard (talk) 14:27, 20 August 2017 (UTC)