Back in college (I know, yawn) I took a class about the broader moral and ethical implications of my chosen field (Computer Science, shockingly). A large section of it was about the future of computer science and what ethical issues we would face if someday we were able to create human (or superhuman) artificial intelligence, for example. During this part of the class, we read several essays written by Ray Kurzweil. The musically-inclined among you may recognize that as a brand name for synthesizers. That's the same guy.
When not inventing synthesizers, Dr. Kurzweil writes about what the future may look like if we stay on our present track of explosive technological growth (not in a pessimistic way, but the things he predicts are pretty wild, so you could interpret some of it in a dystopian fashion if you were so inclined).
One of the most interesting essays we read was about the nature of the human mind. The idea was that your mind is the software that makes you you, and this "You" program runs on your brain (like Mozilla Firefox runs on your computer). Starting from that premise, he says that it's not that crazy to think we could build a little computer chip that mimicked the behavior of a single neuron. They are, after all, relatively simple in and of themselves. You could then take one of these synthetic neuro-chip doodads and have a brain surgeon replace one of the neurons in your brain with it. Assuming it did its job, you probably wouldn't notice anything weird. You'd go right on thinking and feeling like you were you.
But then suppose the brain surgeon didn't stop there and just kept right on replacing your neurons with synthetic ones. Would you notice a change at any point? In theory, replacing them one at a time, you could even stay conscious throughout the procedure. Eventually, your whole brain would be made up of synthetic neurons. You'd have a computer brain. You could then upgrade that brain with more (or faster) synthetic neurons, or copy the "software" (you) onto a different computer (possibly a faster, more powerful one).
So then you have some interesting questions to ask at that point. What makes you, you? If you were able to augment your brain to make it faster, able to store more memories, or more accurate memories, would that mean you were no longer human at some level? What about an AI that is as smart as (or smarter than) a human, which we created ourselves? Should it be afforded the same rights that a human would be? After all, if you were to put your mind onto a computer, you'd probably still want basic human rights, like it being illegal to pull the plug on you or install Windows on you. ;)
Anyway, my friend Emily and I were discussing these things while I was staying with her in Chicago. I think it kinda freaked her out. Sorry, Emily! :) I found this site, which has all kinds of fun things to make you afraid of what the world might be like by the middle of this century. Just kidding, sorta.
And no, this didn't come up because I'm in Boulder right now. The contact high isn't that good.
1 comment:
Obviously the only way to know the answers to these questions is to try it out. I personally know some people we can knock over the head with a tire iron and replace all their neurons. What harm ever came from acting on something before fully exploring the consequences? None, that's what.