In my setting, a lot of moral dilemmas arise from this exact question.
I'll describe one of the situations, and then talk about my views:
"The squad encounters 4 beings. The first seems like a normal human. The second looks like a machine with a large head. The third looks like a cyborg, and the fourth looks just like the first guy.
All of them seem "asleep" and don't wake up, even as the squad gets closer and fires warning shots.
The squad is tasked with dispatching non-conscious beings and taking conscious ones captive, to be integrated into society.
Problem?
They can only take 3 "beings". And after scanning all of their brains and circuitry, they make a shocking discovery.
They're all basically the same person, with exactly the same number of neurons (the robot runs code that functions as neurons) and the same connections.
They can only bring back 3 "people" and must dispatch the rest."
What should they do?
My views on the brain:
I view the brain as a sort of "art piece", or a Ship of Theseus.
Let's say you have the original Mona Lisa. If you scan every atom and replicate it in digital space or on another canvas, that'll just be a copy.
Here's the thing: if you do the same for the brain, I believe you're creating another "consciousness". It'll basically be you, but:
We still have the original Mona Lisa.
Let's say the painting degrades over time, so we replace its parts one by one.
How much of the original Mona Lisa do we need to have, to say that the thing we're looking at is the "original"?
Take the prefrontal cortex: "I think, therefore I am." This is the part of the brain that does most of the thinking and data processing; it's what basically makes us self-aware and conscious.
You'd think, "Aha, just replace every neuron of the prefrontal cortex over time to allow continuation of consciousness!" Well, maybe you're right. The problem? Neurons don't replicate (or do so only very slowly), so we can't be sure for now.
"A person" needs memories and emotions, but a "consciousness" needs the part of the brain that makes it self-aware. If you copy that part, you're basically creating another "person", or a being that should be given rights lol.
I'm exploring this exact topic in my sci-fi novel. There are 3 factions:
1) They believe that if continuation of consciousness is preserved during the procedure of replacing each neuron (i.e. brain activity never ceases, so there's no death), then the subjective experience doesn't change.
2) They don't care. Objectively, a consciousness is just a collection of data. Why should they care if it's infinitely copied? Who cares about the original Mona Lisa when everyone in the world can use the digital copy?
3) They think the prefrontal cortex is "the original Mona Lisa" and the other parts are just "additions". They try to preserve the prefrontal cortex while repairing the others.
My personal view?
If you replace every one of my neurons with objects (nanomachines or neurons) that perform exactly the same functions as the ones they're replacing, while keeping the continuation of consciousness, I'm fine; I think I'm still the same person. But if you copy my stuff, that's just a good way to keep a clone of me around after my death.
What do you think?
I thought long and hard about it.
Then I realized that most discourse on it is just self-serving monologue, so I made the soul a measurable but irreplaceable quality. Mind you, it's still not considered polite to extinguish awareness. But "wHat iS A mAn" has been done to death, and while I'm pretty insane, I don't think I'm cultured or unique enough to add anything genuinely thought-provoking to a 6,000-year-old discourse.
I want to talk with the reader. Not at them.