lost, not forgotten
written by Alex on Nov 05, 2016 03:41
I don't mean the self object in programming languages, nope. I mean what we consider our very own self - what we are while we think and feel. I've tried to define the "self" in what I'd consider the most credible way, and here are my thoughts about it. It's a pretty scary topic, although I've gradually come to accept its consequences. It's hard, as it goes against lifelong assumptions and self-preservation instincts, but it can be done.
 
I'm thinking all we are can be condensed into one word: information. I am the sum of all information encompassed by my organic nervous system. As long as that information is being processed according to the laws of physics governing a living organism, i.e. as long as my nervous system is processing said information, and evolving accordingly, I'm alive and functioning.
 
But the implications of this are extremely hard to accept, especially because the above definition assumes our feelings of identity and continuity are merely epiphenomena of our mind; in simpler terms, both are just illusions. And that's the fascinating point, the scariest one, far beyond proving that the above definition is true, beyond giving up the idea of an immortal soul, belief in reincarnation, or any other form of afterlife. I've sometimes proposed this concept to others, and found the same extreme resistance: a stubborn aversion to the hypothesis that personal identity is not a unique, inherent property of our self.

If we are only information, then we can be reproduced by means other than our organic nervous system alone. We could be copied into a fictional, but plausible, whole-brain emulator (WBE). We could be treated merely as "files", copied and moved from one storage medium to another. As long as the "file" isn't being processed in a WBE, it can't be considered alive; as soon as the WBE begins simulating the due physics, however, the "file" comes to life and is basically a person. The most fascinating idea arises when you consider that an identical copy of yourself, at that point, would BE yourself as much as you are.

Sure, if you leave both the original and the duplicate running at the same time, then the duplicate will diverge almost instantly; it would no longer be the same information as contained in the original. But what if the original is dead, i.e. no longer functioning? Imagine your brain (and whatever else it takes to constitute yourself) was vitrified, then scanned by advanced technology in a remote future, and encoded into a WBE: well, what's really hard to accept is that that copy of yourself, so distant in time, would have to be considered yourself, your own continuation. From your point of view, according to your memories and feelings, nothing would have happened in between, while you were pretty much dead: you'd remember losing consciousness, probably, but then it wouldn't be much different from falling asleep for a split second, to then wake up again. Centuries may have passed, but from your point of view, they'd feel like the blink of an eye.
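The copy-then-divergence point maps neatly onto ordinary data copying. Here's a minimal Python sketch of the analogy - the dictionary "mind state" is of course a toy stand-in for illustration, not a claim about how a WBE would actually encode anything:

```python
from copy import deepcopy

# A toy "mind state": just structured information, which is all
# the argument says a self ultimately is.
original = {
    "memories": ["first day of school", "losing consciousness"],
    "mood": "calm",
}

# The "duplicate" is an independent, bit-for-bit equivalent copy.
duplicate = deepcopy(original)

# Immediately after copying, the two states are indistinguishable.
assert duplicate == original

# As soon as both keep "running" and accumulate different experiences,
# they diverge - they are no longer the same information.
duplicate["memories"].append("waking up inside the WBE")
assert duplicate != original
assert "waking up inside the WBE" not in original["memories"]
```

If the original stops running before the copy starts, no divergence ever happens: there is only ever one live instance of that information, which is the intuition behind calling the copy a continuation.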

And would that still be you? That's the really interesting point. Everyone I've discussed the idea with tended to somehow deny it would be the same person who died a long time before the "duplicate" was brought to life in a WBE. If you're concerned about the rest of the body, that's not the point: it could be cloned, then the WBE could be interfaced to it in some way. It's a thought experiment, so we don't have to limit the possibilities, as long as they're plausible.

My conclusion was that, as weird and scary as it feels, yes, the duplicate would be me: a hundred percent myself, no more different from me than my tomorrow's self is as I wake up from sleeping. The original substrate would be gone and forgotten long before, but the information would survive, and I would have to admit that would be... me.
written by Logicalerror on Nov 07, 2016 19:52
I'm not a neurological expert, but I'm fairly certain the physical state of our brain has as much to do with our "self" as the information it contains. Even so, assuming the WBE accounts for that, I believe the copy is not your "self". It is outwardly functionally identical, but you would either know that it is not yourself, or simply not be aware of it at all, because you're dead.

A more interesting thought experiment, in my opinion, is that of gradual replacement of the brain. Imagine that a hypothetical machine replacement for the brain existed. If you slowly replaced your brain part by part, training the replacements with the rest of the original brain's capabilities, at the point that the original brain is gone would you still maintain your "self" (minus whatever was lost in the very first replacement)?

I think that you would be yourself, because it is a continuation of you.

I define my "self" not as my mind's state, but the continuity of that state.
lost, not forgotten
written by Alex on Nov 08, 2016 03:36
Logicalerror said:
I'm not a neurological expert,
Me neither...

Logicalerror said:
but I'm fairly certain the state of our brain has an amount to do with our "self" as does the information it contains. Even so, assuming that the WBE accounts for that,
It does. The state is part of the information. It's a thought experiment; there are no limits to what the magic WBE can do. We only need to determine a plausible way it could do that. In this case, how could the state be reproduced? I guess we can rule out the state being constituted by action potentials (those pulsed electric signals traveling along axons), because if that were the case, then, for example, entering a coma would change the person undergoing that experience. It might be safe to assume that your mind, to keep being yourself, doesn't depend on action potentials. It could be a mixture of physical structure and substances stored in synapses: it's probably something tangible.

Logicalerror said:
I believe the copy is not your "self". It is functionally identical outwardly, but you would know that it is not yourself or simply not be aware of it at all if you're dead.
Heh, that's the hard part, see? The subjective point of view doesn't exist anymore. The point is: the "copy" doesn't need the "original" to be alive in order to function like "me", and if "I" - as the original - no longer exist or function, then I'm simply out of the equation; it's not as if I could prove I'm somehow more important or more self-aware than my own copy. In short: I don't need my copy, and my copy doesn't need me. If my copy doesn't exist, then I'm the self-aware self; but if my copy exists and I'm no more, then my copy becomes, a hundred percent, the self-aware self.

Logicalerror said:
A more interesting thought experiment, in my opinion, is that of gradual replacement of the brain. Imagine that a hypothetical machine replacement for the brain existed. If you slowly replaced your brain part by part, training the replacements with the rest of the original brain's capabilities, at the point that the original brain is gone would you still maintain your "self" (minus whatever was lost in the very first replacement)?
To some extent, this kind of replacement already occurs throughout life. I don't remember the details, but new neurons are created to help "expand" the mind's capabilities in the constant attempt to cope with life's challenges, especially as other neurons die. But anyway...

Logicalerror said:
I think that you would be yourself because it it is a continuation of you.

I define my "self" not as my mind's state, but the continuity of that state.
And why wouldn't my far-future copy be a continuation of me (in the state I was in just before death)?