Hello there, I didn't know whether to post this here or in the NT forum, but this question has been bugging me. One of the craziest and most audacious scientific ideas I've found floating around the Internet over the years is a scheme to completely reverse engineer the human brain onto a computer, possibly even leading the way to the birth of AI, for good or ill. At the time (2006) they speculated that computer processing power could match that of the human brain by 2020 and that they would be able to essentially emulate a complete human mind on a computer, like a ghost in the machine. But futurism is often like a person who can never meet homework deadlines: disappointment sets in, and there's less belief in success by the time the second deadline rolls around. Now they think the date will be 2030 at the earliest; I'll leave predictions about whether they'll actually manage it by then to the Ni users.
That, however, is mere backstory. My question is more philosophical than technological. If they were to completely replicate every neuron and synapse, then perhaps they could emulate "the soul" and every aspect of what made the person, memories included. If that were achieved, whose brain would they ultimately emulate? The MBTI is clear that there are numerous potential set-ups, each type with its own strengths and weaknesses, and I wonder what MBTI type they would make such a mind. I think they should choose Einstein. Don't they have his brain, in some shape or form, preserved in places all over the world? Not only that, he was quite the scientific juggernaut, with a strong moral and even religious conscience. But who would you choose?
- Would you choose a scientist, who could lead the way as a software mastermind, but at the possible risk of the "cyber-mind" turning its back on humanity and human values?
- A laid-back philosopher or religious thinker, who would probably end up on the wrong side of the scientists/engineers/contractors who funded and worked on the project?
- Would you choose yourself, a close friend, or a family member, but at the potential cost of a loss of individuality on your part or theirs? (Heck, maybe this is an implication for all the points.)
- Would you just choose a random person (or make up a random brain-chemistry set-up from scratch), but at the potential expense of creating a really annoying asshole, if not a totally evil douche?
- Would you choose a "super-brain" EISNTFJP type which will represent all of humanity but with the potential risk of thinking it is fundamentally better or superior than its more limited human counterparts?
- Or something else?
What do you guys think? This has been bothering and confusing me for a while now, and I doubt my mind would be the one selected in such a scenario. :happy: