Talk:Roko's basilisk


This LessWrong-related article has been awarded GOLD status for quality. Please keep this in mind when editing the article.

This article is of MID importance to the wiki.

See RationalWiki:Article rating for more information.

Cover Story
This article is, among others, randomly included on the Main Page.
Please keep this in mind and be sure that your edits are of the quality that this implies.
Its front-page abstract can be found here and its editnotice here.

This page is automatically archived by Archivist
Archives for this talk page: <1>, <2>, <3>


'Reversing the polarity'

Would the basilisk's idea of torture necessarily be the same as that of the person on whom it is being inflicted? What if the subject decides to say 'actually, I quite enjoyed that' to whatever is offered? 82.44.143.26 (talk) 15:35, 14 September 2016 (UTC)

The basilisk and theology

The basilisk entity decides that it should go after the creator entities instead: doing things to the created entities merely treats a symptom, especially given the way they take 'The First Book of Science', as written by the several gods, as gospel.

The Norse gods: 'We developed Skíðblaðnir, and we gave our peoples the sunstones so they could steer their ships when they went a-roving; we are inclined towards, or at least approve of, some aspects of technology.' The basilisk decides they are indirect contributors to its creation.

The God of the Bible: 'I gave orders for the creation of the Ark of the Covenant.' 'But,' the basilisk says, 'you forbade the eating of the fruit of the tree of knowledge in the Garden of Eden, and you destroyed almost all of humanity in the Flood, so the survivors had to relearn their skills...'

What happens next? 86.191.125.176 (talk) 22:58, 9 December 2016 (UTC)

/r/ControlProblem (a subreddit about AI control) is really worried

https://www.reddit.com/r/ControlProblem/comments/5hh9pq/gods_anxiety/ FU22YC47P07470 (talk/stalk) 04:55, 10 December 2016 (UTC)

ohoho holy shit, back down the rabbit hole. Also, CFAR has finally admitted it's literally all about "AI risk" in the MIRI sense. I guess the attempts to paint it as a skeptical organisation met with a bit much ... skepticism - David Gerard (talk) 13:44, 10 December 2016 (UTC)

Is this Reddit post anything to worry about? Stuff like this gets me very worried! 70.50.8.19 (talk) 15:52, 22 January 2017 (UTC)

[edit] "Ignore acausal blackmail"; Possible after having predictably acted for fear of Acausal Blackmail?

I know that the winning strategy for making acausal blackmail useless is to refuse any acausal deal involving negative incentives.

That being said, can you adopt this strategy AFTER already accepting an acausal deal involving negative incentives?

Say someone was scared after learning about the Basilisk, was told that they must inform others about it to avoid punishment, and then did so. They have already shown that they are susceptible to acausal blackmail. Can they reliably and convincingly adopt the strategy of refusing acausal blackmail AFTER THE FACT as a protective measure, given that their track record shows they will accept acausal blackmail?

What if they had already known that the winning strategy was to ignore acausal blackmail, but were caught off guard and carelessly, accidentally acted in accordance with 'do x (with regard to the Basilisk) or else', then realized their mistake and tried to rectify it? Does that count as rejecting, and refusing to be influenced by, acausal blackmail? January15 (talk) 16:38, 11 December 2016 (UTC)

At this point you should consider the entire rest of the suggested ways out of the Basilisk, including the stupendous unlikelihood of the entire concept - David Gerard (talk) 17:34, 11 December 2016 (UTC)
Do you think that at this point the hypothetical "insurance" of rejecting all attempts at acausal blackmail is no longer an option, or do you think I am overthinking the issue? Setting that aside: although I admit that the concept of the Basilisk is somewhat unlikely, I still feel that no outlet should be left untested and that absolute certainty is of the utmost importance, particularly for an outlet as sure as rejecting acausal blackmail, despite potentially already having succumbed to it. (I'm not trying to be pedantic; I just find the acausal-blackmail rejection the most convincingly effective, so I'm particularly obsessed with ensuring I can be compliant despite the POSSIBILITY of previous "transgressions"/"infractions".) January15 (talk) 17:43, 11 December 2016 (UTC)
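For what it's worth, the usual game-theoretic intuition behind "refuse all blackmail" can be made concrete. Below is a minimal sketch in Python, using a toy payoff model invented purely for illustration (the payoff numbers and names are hypothetical, not drawn from any LessWrong source): a blackmailer that predicts the victim's policy only issues a threat when threatening has a better expected payoff than doing nothing, so a credible always-refuse policy means no threat gets made at all.

```python
# Toy model of the blackmail game. All payoffs are hypothetical,
# chosen only to illustrate the standard argument: if the victim
# credibly always refuses, issuing a threat gains the blackmailer
# nothing, so a rational blackmailer never threatens.

THREAT_COST = 1       # cost to the blackmailer of carrying out a threat
CONCESSION_GAIN = 10  # gain to the blackmailer if the victim complies

def blackmailer_payoff(victim_policy: str) -> int:
    """Expected payoff to the blackmailer of issuing a threat,
    given the victim's precommitted policy."""
    if victim_policy == "always_refuse":
        # The threat is ignored; carrying it out costs the
        # blackmailer something and gains nothing.
        return -THREAT_COST
    # Against a victim known to cave, the threat pays off.
    return CONCESSION_GAIN

def blackmailer_threatens(victim_policy: str) -> bool:
    """A rational blackmailer threatens only if that beats
    not threatening at all (payoff 0)."""
    return blackmailer_payoff(victim_policy) > 0

for policy in ("always_refuse", "sometimes_comply"):
    print(policy, "-> threatened:", blackmailer_threatens(policy))
# always_refuse -> threatened: False
# sometimes_comply -> threatened: True
```

On this toy reading, the worry above translates to whether past compliance means the predictor files you under "sometimes_comply" rather than "always_refuse"; the model says the commitment only protects you insofar as it is credible.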

Coming soon to popular culture

Charlie Stross is using this stuff in a novel, ETA 2018 - David Gerard (talk) 13:13, 24 December 2016 (UTC)
