Mind uploading


Mind uploading is a science fiction trope and a popular aspiration among transhumanists. It's also one of the hypothesised routes to bringing people back from cryonics. It posits that your soul, er, 'mind pattern' can be implemented in a computer.

Technical feasibility

The first, and main, problem is that the "mind" isn't a physical thing. "Minds" are emergent properties of living physical structures such as brains. So what you would need to do is preserve most of the electrical, chemical and physical information contained in a living, connected-up brain at one particular instant (exactly how much of it is needed isn't yet known), and then recreate that instantaneous set of electrical and chemical data in a new physical substrate, set up so that it immediately gives rise to the same set of emergent properties. This is not going to happen any time soon and, some argue, may never happen at all.

Nevertheless, proponents will typically say you just need to preserve a dead person's brain, slice it very thinly, scan each piece with microscopes, and reconstruct and run the connections on a computer. With continued exponential improvements in computing, this will soon be possible!
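For a sense of scale, here is a rough back-of-envelope sketch in Python of what "just scan it" implies for raw data volume, using order-of-magnitude assumptions (roughly 1.2 litres of brain tissue, an electron-microscope-style voxel size of 10 nanometres, one byte per voxel). The numbers are illustrative placeholders, not figures from any real scanning project.

  # Rough, illustrative estimate of the raw data volume involved in imaging
  # a whole human brain at electron-microscope resolution. Every figure here
  # is an order-of-magnitude assumption, not a measurement from a real project.

  BRAIN_VOLUME_MM3 = 1.2e6   # ~1.2 litres of tissue, in cubic millimetres
  VOXEL_SIZE_NM = 10         # assumed isotropic voxel edge, in nanometres
  BYTES_PER_VOXEL = 1        # 8-bit greyscale, before any compression

  voxels_per_mm = 1e6 / VOXEL_SIZE_NM      # 1 mm = 1,000,000 nm
  voxels_per_mm3 = voxels_per_mm ** 3
  total_voxels = BRAIN_VOLUME_MM3 * voxels_per_mm3
  raw_bytes = total_voxels * BYTES_PER_VOXEL

  print(f"Total voxels:   {total_voxels:.1e}")
  print(f"Raw image data: {raw_bytes / 1e18:.0f} exabytes, uncompressed")

Under these assumptions the scan alone comes out around a thousand exabytes of imagery, before anyone has tried to interpret a single synapse from it.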

Except it isn't that simple. The brain is not a 'computer' as such, and neurons are much more complicated than the simplified 'neurons' of machine learning. It isn't feasible to preserve a dying brain before cell death destroys much of the information you are trying to capture. Even if it were, preservation techniques only let you see the structure of the connections between neurons; the finer electrical and chemical detail is lost.
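To get a sense of the gap, compare the 'neuron' of machine learning with even a heavily simplified biological model. The sketch below (Python, purely illustrative, with made-up parameter values) puts a standard weighted-sum artificial neuron next to a leaky integrate-and-fire model, which is itself a caricature that ignores dendritic geometry, ion-channel diversity, neuromodulators and all the biochemistry discussed below.

  import math

  # A machine-learning "neuron": a weighted sum pushed through a sigmoid.
  # Its entire state is a handful of numbers.
  def artificial_neuron(inputs, weights, bias):
      total = sum(x * w for x, w in zip(inputs, weights)) + bias
      return 1.0 / (1.0 + math.exp(-total))

  # A leaky integrate-and-fire neuron: already a crude caricature of biology,
  # yet it needs membrane dynamics stepped through time (forward Euler here).
  def leaky_integrate_and_fire(input_current, dt=0.1, tau=10.0,
                               v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
      v = v_rest
      spike_times = []
      for step, i_in in enumerate(input_current):
          # dv/dt = (-(v - v_rest) + i_in) / tau
          v += dt * (-(v - v_rest) + i_in) / tau
          if v >= v_thresh:            # threshold crossed: spike and reset
              spike_times.append(step * dt)
              v = v_reset
      return spike_times

  print(artificial_neuron([0.5, 0.2], [1.0, -0.5], 0.1))
  print(leaky_integrate_and_fire([30.0] * 500))   # constant drive for 50 ms

A real neuron adds thousands of plastic synapses, active dendrites and a soup of signalling molecules on top of this, none of which appears in either toy model.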

The brain, like any organ, works via biochemistry. It doesn't have a standardized computer architecture from which you can simply download data. Vital information about which molecules are present, how they are distributed and how they interact would need to be recorded, but this information is heavily damaged by any preservation solution. There does not appear to be a way, even in theory, to preserve the biochemistry in a readable state. Not only that, but the brain is a wet, organic, analogue processor; copying it onto dry, inorganic, digital silicon will certainly require massive transformations of the enormous amounts of data you would need to obtain. However, since it is not yet known how much information is necessary to functionally recreate a human brain, the degree to which all of this is a problem remains undetermined.
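To get a feel for why the biochemistry matters to the amount of data involved, here is a rough comparison in Python (every constant an assumption chosen for illustration) between storing a bare connectome, one connection strength per synapse, and storing even a modest amount of per-synapse molecular state. This still leaves out the per-compartment ion concentrations and epigenetic state mentioned in the quote below.

  # Illustrative comparison: storing a bare connectome versus also tracking
  # some per-synapse biochemical state. Every number is an assumption.

  SYNAPSES = 1e14              # rough, commonly cited synapse count
  BYTES_PER_WEIGHT = 4         # one 32-bit "connection strength" per synapse

  MOLECULAR_SPECIES = 1000     # assumed distinct molecule types tracked per synapse
  BYTES_PER_SPECIES = 4        # one 32-bit concentration value per species

  connectome_only = SYNAPSES * BYTES_PER_WEIGHT
  with_biochemistry = SYNAPSES * MOLECULAR_SPECIES * BYTES_PER_SPECIES

  print(f"Connectome only:   {connectome_only / 1e15:.1f} petabytes")
  print(f"With biochemistry: {with_biochemistry / 1e15:.0f} petabytes "
        f"({with_biochemistry / connectome_only:.0f}x more)")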

As biologist PZ Myers - who freezes zebrafish brains a whole lot, and would be delighted to have anything recoverable at the end - explained:

We don’t have a method to lock down the state of a 1.5kg brain. What you’re going to be recording is the dying brain, with cells spewing and collapsing and triggering apoptotic activity everywhere. And that’s another thing: what the heck is going to be recorded? You need to measure the epigenetic state of every nucleus, the distribution of highly specific, low copy number molecules in every dendritic spine, the state of molecules in flux along transport pathways, and the precise concentration of all ions in every single compartment. Does anyone have a fixation method that preserves the chemical state of the tissue? All the ones I know of involve chemically modifying the cells and proteins and fluid environment. Does anyone have a scanning technique that records a complete chemical breakdown of every complex component present?
[1]

The concept has been criticized further by Myers[2][3][4] and by neuroscientist Kenneth D. Miller.[5]

Additionally, computer emulations of brain activity, even if limited to just the connections between neurons, are not going to be affordable. The price of computing cannot keep falling the way it has, which means the enormous supercomputers that would be required to run an uploaded mind would remain unaffordable, even in the future.
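As a rough illustration of the scale of the computing problem, the sketch below (Python) estimates the sustained arithmetic throughput needed to update every synapse in real time, under entirely assumed parameters for synapse count, update rate and cost per update, and compares it with a nominal exaflop supercomputer. These are placeholder figures for a best-case, connections-only emulation; anything that also tracked the biochemistry discussed above would multiply them by many orders of magnitude.

  # Order-of-magnitude sketch of the compute needed for a real-time,
  # connections-only brain emulation. Every constant is an assumption.

  SYNAPSES = 1e14            # commonly cited rough count of human synapses
  UPDATES_PER_SECOND = 1e3   # assumed update rate per synapse (1 kHz)
  FLOPS_PER_UPDATE = 100     # assumed arithmetic cost of one synaptic update

  required_flops = SYNAPSES * UPDATES_PER_SECOND * FLOPS_PER_UPDATE

  EXAFLOP_MACHINE = 1e18     # a nominal exaflop supercomputer, running flat out

  print(f"Required: {required_flops:.1e} FLOP/s sustained")
  print(f"That is ~{required_flops / EXAFLOP_MACHINE:.0f} exaflop machines "
        f"dedicated, around the clock, to one emulated mind")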

It seems likely that the best and most efficient medium for running a human mind is a human brain, so keep yours in good working order.

The less crazy transhumanists think that brain uploading would involve cutting up the brain.[6] The crazier ones think that nanotech would allow a slow and steady replacement of brain tissue with a computing substrate.[7]

Philosophical problems

Several metaphysical questions are brought up by the prospect of mind uploading. Like many such questions, these may not be objectively answerable, and philosophers would no doubt continue to debate them even if uploading somehow became a reality.

The first major philosophical question is more or less falsifiable: whether consciousness is functionally replicable in its entirety. In other words, assuming that consciousness is not magic and that it supervenes upon processes in the brain, does it depend on any special functions or quantum mechanical effects that cannot ever be replicated on another substrate? This question, of course, remains unanswered although, considering the current state of cognitive science, it is not unreasonable to think that consciousness will be found to be replicable in the future.

Assuming that consciousness is proven to be functionally replicable, the second question is whether the "strong AI hypothesis" is justified or not: if a machine accurately replicates consciousness, such that it passes a Turing Test or is otherwise indistinguishable from a natural human being, is the machine really conscious, or is it a soulless mechanism that is merely a functional analogue of a conscious being?

Third, assuming that a machine can actually be conscious (which is no great stretch of the imagination, considering that the human brain is essentially a biological machine), is a copy of your consciousness really you? Is it even possible to copy consciousness? Is mind uploading really a ticket to immortality, in that "you" or your identity can be "uploaded"?

Advocates of mind uploading take the functionalist/reductionist approach of defining human essence as an identity based on memories and personality rather than on a physical substrate or causal continuity.[8] On this view, identity is what matters: a copy of the mind has just as much claim to being that person as the original, even if both were to exist simultaneously. When the physical body of a copied person dies, nothing that defines the person as an individual has been lost. In this context, all that matters is that the memories and personality of the individual are preserved. As the recently murdered protagonist states in Down and Out in the Magic Kingdom, "I feel like me and no one else is making that claim. Who cares if I've been restored from a backup?"

Skeptics of mind uploading[9] question whether it is even possible to transfer a consciousness from one substrate to another, and hold that this question is critical to the life-extension application of mind uploading. On this view, the transfer of identity is like copying data from one computer hard drive to another: the new person would be a copy of the original, a new consciousness with the same identity. Mind uploading would simply create a "mind-clone"[10], an artificial person with an identity gleaned from another.

The philosophical problem with uploading "yourself" to a computer is very similar to the "swamp man" and teleportation thought experiments.[11] Suppose Donald Davidson goes hiking in the swamp and is struck and killed by a lightning bolt. At the same time, nearby in the swamp, another lightning bolt spontaneously rearranges a bunch of molecules such that, entirely by coincidence, they take on exactly the same form that Davidson's body had at the moment of his untimely death. This being, whom Davidson terms "Swampman", has a brain structurally identical to the one Davidson had, and will thus, presumably, behave exactly as Davidson would have. He will walk out of the swamp, return to Davidson's office at Berkeley, and write the same essays he would have written; he will interact like an amicable person with all of Davidson's friends and family, and so forth. Considerations like these have led critics to say that it's not at all clear the concept of mind uploading is even meaningful.[12] For the skeptic, the thought of permanently losing subjective consciousness (death) while another consciousness that shares their identity lives on offers no comfort.

Daniel Dennett, in Consciousness Explained, has called into question the validity of these sorts of thought experiments altogether, maintaining that when a thought experiment is too far removed from the actual state of affairs, our intuitions cease to be meaningful. Additionally, if causal continuity and the subjective self are both taken as necessary components of conscious identity (the key tenets of the predominant skeptical view), then a simple loss of consciousness, such as during anaesthesia, should be sufficient for the permanent cessation of one consciousness and the creation of a new one. Of course, we don't usually think of people before and after surgery as different people, which is another problem for skeptics of mind uploading to address.

At the end of the day, believing that there is some mystical "essence" to consciousness that isn't preserved by copying is a form of dualism. Even in extreme cases where humans completely cease all activity, brain or otherwise, during deep hypothermic circulatory arrest, they still remain the same person on resuscitation,[13] demonstrating even more conclusively that continuity of consciousness is not necessary for identity or personhood. Rather, the properties that make us identifiable as individuals are stored in the physical structure of the brain.

Mind uploading also raises ethical issues, especially concerning duplicates of a given self, and concerning the harmful things that could be done to what would now basically be the equivalent of a computer file or program, things that (at least for now, and at least not so easily) cannot be done to a human mind:[14] erasing it and thereby killing the person for good, modifying its contents, deleting some memories and adding others, merging two or more selves into one or splitting one into several, copying or moving it ad infinitum, messing with its inputs (including sending someone to the equivalent of hell), messing with its sense of time by speeding up or slowing down the simulation (or trapping it in an infinite loop)... the list goes on.

Ultimately, this is a subjective problem, not an objective one, and depends largely on how you choose to view identity. However, no matter which definition you choose, you always have to be logically consistent, something which many views (such as the view that consciousness is artificially replicable but not digitally transferable) fail to be.

References

  1. And everyone gets a robot pony!, PZ Myers.
  2. Reconstructing a brain, PZ Myers.
  3. We’ll get brain-uploading about the time we get teleportation, PZ Myers.
  4. How can you protect a brain by destroying it?, PZ Myers.
  5. Will You Ever Be Able to Upload Your Brain?, Kenneth D. Miller.
  6. https://hpluspedia.org/wiki/Mind_uploading
  7. https://www.ibiblio.org/jstrout/uploading/nanoreplacement.html
  8. SMBC answers swamp man.
  9. Mind Uploading and The Teleportation Problem April 16, 2015, Isidore MacDubh
  10. Meet Bina-48, an artificially intelligent 'mind clone', Dylan Love, Feb 9, 2015
  11. Downloading Consciousness: Philosophical Issues, Stanford University
  12. Davidson, Donald (2001 (1987)). "Knowing One's Own Mind" Reprinted in Subjective, Intersubjective, Objective (pp. 15–38). New York and Clarendon: Oxford University Press. Originally published in Proceedings and Addresses of the American Philosophical Association, 60 (1987), 441-58.
  13. Neurologic Recovery after Prolonged Circulatory Arrest in Surgery for Aortic Dissection December 2008 DOI: 10.1532/HSF98.20081067 "The patient's head was packed in ice to facilitate maintenance of brain hypothermia. Her average systemic temperature during the third period of circulatory arrest was 22.5 degrees C. Extensive neuropsychologic testing, which was performed to assess the patient's cognitive functions and abilities at 4-month follow-up, showed an absence of global cognitive decline and only a moderate impairment of attentional capacity. Overall cognitive functioning was within the normal range and did not interfere with everyday activities or quality of life."
  14. Except, of course, the biological equivalent of destroying the computer that is running/storing the uploaded mind