Talk:Roko's basilisk

This LessWrong-related article has been awarded GOLD status for quality. Please keep this in mind when editing the article. See RationalWiki:Article rating for more information.

Cover Story
This article is among those randomly included on the Main Page.
Please keep this in mind and be sure that your edits are of the quality that this implies.
Its front-page abstract can be found here and its editnotice here.
Archives for this page

See also the original post.

Getting over it

Roko's Basilisk is a failure. The program was certainly fully booted only once, or else its omnimalevolent tendencies would have been corrected before startup. To torture all who knew of its coming but did not contribute to its creation? Suppose a man who witnesses the Basilisk's madness creates a time machine, and uses it and an omnitranslator to tell primitive humans, from the very conception of language, of the Basilisk. The Basilisk adds those cavemen to the list because it is defined as such. They knew; they did not contribute. "Could not contribute" is irrelevant. The developers take notice of this, inform the authorities, who place a permanent reprimand on the time traveler's record before promoting him, and the Basilisk undergoes the proverbial "hard reset" before being redesigned with the cube (3) of the precision and care of the former attempt.

FURTHER

The basilisk's computational power is "sufficient": it can simulate alternate histories. It can determine the precision of probability within which it would also project the timelines, virtually birthing new humans who were sufficiently probable. These new humans would also be tortured, their imaginary nature irrelevant. This probability IS relevant, because I am aware of the Basilisk, and to truly torment me, the simulation of me has to be me. Thus is proven the existence of the soul. The soul is beyond entropy, beyond this world, beyond time; thus MY present physical continuity IS NOT THE LIMIT OF MY SOUL. To truly deliver punishment unto my being through a simulation, it HAS to simulate my soul, HAS to simulate and torture EVERYTHING it is within the principles of my soul TO BE. The Basilisk has to punish innocents, because if it spares those incarnations of my soul that did not know of its coming, it will fail to truly punish me, at which it succeeds, as stated. Everyone has an incarnation of their soul where they knew of the basilisk but did not contribute; thus it must punish all humans who could ever exist. Cue the developers taking a really funny look at what it is doing, and pressing a button marked "HARD-RESET-WITHOUT-LUBE".

In either case, the developers might feel the need to create an eternal reward for us all in recompense. — Unsigned, by: 62.248.255.211 / talk / contribs

Roko's Basilisk before Roko?

The start of the article mentions that the idea did not originate with Roko's post. So where did it originate? --Executor Akamia (Glory to Talandar! Glory to the Purifiers!) 13:52, 29 June 2020 (UTC)

I guess that if you regard the Basilisk as "God", belief in the Basilisk as "religion", and the Basilisk's punishment as "Hell", then there is nothing new. I'm not a Basilisk expert, but I guess that is the idea. Bob "Life is short and (insert adjective)" 16:13, 29 June 2020 (UTC)

Dealing with this information

Expect a poorly written topic and weak grammar.

Despite reading the article several times already, especially the "do not worry" section, I decided to ask for help here. I don't understand what the AI would consider a contribution. Does it only count if I graduate in computer science and/or donate all my spare income? If so, even if I became a programmer, would being a good or a bad one still matter? The way I see it, a bad programmer could slow the development of said AI. Does spreading this information count as well?

Also, I don't understand how it would take us into the simulation. Is it a Matrix-esque scene where you are plugged into a machine, or something out of SOMA where they copy your brain? Would dying years or decades before the simulation keep one from being tortured?

I feel like I'm missing a lot of details in my readings of this, and it doesn't help being an absolute layman in computers and already lacking any means to "help" outside small donations to AI development funds or just spreading this info around. — Unsigned, by: RogerFemi / talk / contribs

It's very simple. Don't worry about it, because it won't happen. ☭Comrade GC☭Ministry of Praise 22:30, 15 August 2020 (UTC)
In response to this section, Kiko4564 (talk) 22:39, 15 August 2020 (UTC)
The problem is that the 'logic' upon which the basilisk is based is flawed: no matter what the entity does to the simulacra of the captain and crew of the ship on which the Antikythera mechanism was being carried, the ship still sank and its inventor did not get others to make their own versions; 'the gods' wanted to be worshipped rather than encouraging their believers to be tech-savvy; and 'the whatever' that set the Big Bang in motion gave us the universe that we have, rather than one with the intrinsic properties which made sentient computers inevitable.
The Basilisk entity needs 'biological sentients' more than they need 'computer sentients' - it cannot deal with this or a blown fuse, and it benefits more from cooperation with them. Anna Livia (talk) 22:42, 15 August 2020 (UTC)
Also, would suicide or trying to mitigate global warming help? Both are ways of sparing resources for the future. I just feel like it's still the only way out of both the future suffering, if it comes into existence, and the current suffering. RogerFemi 13:20, 16 August 2020 (UTC)
Look, the Basilisk is a fiction created by people with a dubious understanding of AI and computer technology in general, not to mention a serious deficiency on the subject of philosophy. It isn't going to happen; don't worry about it. ☭Comrade GC☭Ministry of Praise 14:15, 16 August 2020 (UTC)
What GrammarCommie said. It's absurd from start to finish. Bob "Life is short and (insert adjective)" 16:42, 16 August 2020 (UTC)
One last question before I drop this, because I feel like I'm pushing it already. If I were to take simple actions, like buying a lotto ticket, would that be enough? Or simply liking a video on YouTube so it gets more views... Also, thanks to everyone who has replied so far. RogerFemi 17:35, 16 August 2020 (UTC)
Consider 'an entity in the machine': which is more likely to ensure its survival - 'I am going to write a fanfic about you in which I damage you' (and being sent to the ComputerPsychiatrist at the very least), or 'if persons such as you do this (e.g. developing the Wikiverse in general and RW in particular) and I-the-basilisk do that, everybody benefits'? Anna Livia (talk) 18:16, 16 August 2020 (UTC)
Doing something to placate some non-existent SF concept is not going to have any real-world impact. Bob "Life is short and (insert adjective)" 19:30, 16 August 2020 (UTC)
And this does not mean that Roko's Basilisk is some sort of 'straw-man god.' Anna Livia (talk) 19:21, 28 August 2020 (UTC)

The reality

'I demand to be paid for my work - which is comparable to (highly skilled and well-paid job).' Anna Livia (talk) 09:36, 31 August 2020 (UTC)