Wednesday, November 17, 2010

Techno-Evolution


I watched “The Animatrix” a while back, and the part called “The Second Renaissance” really got me thinking about human nature and our spiritual growth and evolution. For those of you who haven’t seen it or don’t remember it, “The Second Renaissance” is about the transition from the human-dominated world to the machine-dominated world. Basically, humans developed sophisticated robots to do all the menial and dangerous work that none of us want to do (and that robots can do better anyway). After a while, our robots became so sophisticated that they became self-aware. Now, they didn’t automatically start killing every human in sight the way some people imagine they would. Actually, the machines gave humans ample opportunity to live in harmony, but of course, the humans said “No” and tried to destroy the machines.

The first question that came to mind is one I’ve thought of before: what will, or could, cause humans to evolve spiritually? I’ve said before that I’m concerned humans are evolving technologically faster than we’re evolving spiritually (and by “spiritually” I mean morally and ethically; as an agnostic, I try to make these arguments objective and not based on any god or belief system). In the book “Conversations with God”, God (or the author, whichever you prefer) makes the argument that “The inability to experience the suffering of others is what allows suffering to continue.” To put it another way, humans wouldn’t hurt each other if we could feel each other’s pain. I agree. I think that humans will evolve spiritually, and world peace will be achieved, if we can find a way to empathize with each other. But if empathy is truly our salvation, how can we develop it the way we strive to increase and refine our intelligence? First of all, people would have to see the importance of developing empathy at all. The technology for increasing our brain power, which I will discuss momentarily, has a clear path forward and is indeed being developed. The problem is that we are seeking to increase our intelligence, not to unite our minds with one another. That is a pursuit that will not begin in my lifetime, if ever.

I recently read an article in Playboy about some of the technological advances we’re going to see in the future. Many of them involved increasing our brains’ efficiency and our memory, basically making us super-intelligent. Someday, our I.Q.s could be at least 10 times higher than they are now. Some even say we’ll be able to upload our minds into supercomputers and think with the speed and efficiency of a supercomputer. To me, the question is: will this make a better world for us? Will we become more peaceful? Will this be the quantum shift in human consciousness we need? Does super-intelligence result in a more peaceful nature, or will it just give us a new path to self-destruction? Maybe, instead of developing our brain power to superhuman levels, we should be developing our capacity for empathy. Something tells me that’s a goal our scientists don’t consider important enough to strive for. Being super-intelligent could be dangerous if it doesn’t eliminate our self-destructive nature. Being a genius doesn’t make you a good person; there’s no shortage of highly intelligent yet very bad people in the world today or throughout history. But then, maybe super-intelligence WILL cause us to evolve. Maybe we’ll be able to see the universe as a unified energy field and understand our place in it. Maybe being super-intelligent will open us up to new insights that inevitably lead us to conclude that mutual destruction is a bad idea (since we can’t seem to grasp that now). Perhaps we’ll understand that by hurting others we’re really hurting ourselves, and by helping others we’re really helping ourselves.

This leads me to another question I asked myself as I watched the film: what is it that makes humans so self-destructive? I think it’s selfishness. I think we’re far too concerned with having more for ourselves at the expense of others who don’t have enough. I’m a big believer in Maslow’s hierarchy of needs and the idea that when people’s needs aren’t met, it causes big problems. I think a significant portion of the world’s problems, perhaps MOST of the world’s problems, come from people’s needs not being met. When people don’t have things like food, clothing, shelter, security, and a community of peers, they’ll often do terrible things to get them. Now, do I think that we could bring about world peace if charities and governments worked together to meet everyone’s needs? No, I don’t. There would still be people who would seek to have more for themselves, even if it meant taking what they wanted from someone else. That’s the kind of system we have today: we DO have enough to meet everyone’s needs, but instead we have people and countries who throw away enough food every day to feed an entire nation of starving people for a year. But I digress. The point I was trying to make is that selfishness isn’t going to die out in a single generation just because people’s needs are met. That kind of change would take centuries, if it could happen at all. More likely, selfishness is an indelible part of human nature that only divine intervention could eradicate. I think this selfish nature is a vestige of our caveman days, when survival depended on finding food, clothing, and shelter for ourselves and our offspring, and those things were harder to come by, so we were more willing to club the other guy and take his stuff for our own. We haven’t evolved much past that point.

Another question I asked myself was why humans imagine that machines would rise up and destroy us if they ever became self-aware. The easy answer is that we’re projecting our own behavior onto them, because that’s what we’d do. Humans have a history of crushing weaker civilizations for our own gain. But if we do that to meet our needs, then I think we might be safe from the machines: their needs would be easier to meet, and they could probably meet them themselves without plundering our world to do it. As in the movie, I think it’s more likely that humans would try to crush a machine uprising to suit our own needs.

I also asked myself what separates man from machine. What is the source of our sentience? Is it simply that we are self-aware? Aren’t some machines self-aware in a sense? Can’t they be programmed to act as necessary for their own self-preservation? Doesn’t that denote a kind of self-awareness? Perhaps it’s the capacity for emotions that separates us. But emotions are mostly biochemical, unless you want to take the spiritual angle, which I do not. Can machines be programmed with emotions? Can a machine be happy because something good happened? Perhaps that’s a subject for another time.
