MARIA 2K ROBOT

Last modified: Saturday, January 6, 2001 6:20 PM

08/21/99 - Waaaaaaaaa!!! She's done. I've had a lot of fun with this project, so I'm sort of sorry it's finished.

I saw a news short a while back about a phenomenon I believe they called "emergent behavior." They showed video footage of two tabletop robots that had been programmed to search for "food," and unexpectedly the robots displayed aggressive behavior that hadn't been part of their programming. Weird. The supposition was that this behavior emerged from the complexity of the programming -- at some level of complexity, unusual and chaotic things can happen.
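
Just to make that idea a bit more concrete: below is a toy sketch in Python (purely hypothetical -- it has nothing to do with the actual robots in the clip) in which two simulated agents each follow the same simple "head for the nearest food" rule. No interaction between them is programmed, yet they routinely end up converging on, and effectively competing for, the same piece of food.

    # Toy 1-D "tabletop": two agents independently seek the nearest food.
    # No interaction between them is programmed, yet they frequently end up
    # racing each other for the same food item -- behavior that emerges from
    # combining two very simple, independent rules. (Hypothetical illustration
    # only; not the software from the robots described above.)
    import random

    random.seed(1)

    TRACK_LENGTH = 40
    food = {random.randrange(TRACK_LENGTH) for _ in range(6)}   # food positions
    robots = {"A": 0, "B": TRACK_LENGTH - 1}                    # start at opposite ends
    scores = {"A": 0, "B": 0}

    for tick in range(200):
        if not food:
            break
        for name in robots:
            if not food:
                break
            pos = robots[name]
            target = min(food, key=lambda f: abs(f - pos))        # nearest food
            if target != pos:
                robots[name] = pos + (1 if target > pos else -1)  # step toward it
            if robots[name] in food:                              # arrived: eat it
                food.remove(robots[name])
                scores[name] += 1
        # The "emergent" part: both robots often end up chasing the same crumb.
        if food:
            targets = {min(food, key=lambda f: abs(f - p)) for p in robots.values()}
            if len(targets) == 1:
                print(f"tick {tick}: both robots converging on food at {targets.pop()}")

    print("final scores:", scores)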

Human brains are massively complex organs, with billions of interconnected neurons transmitting information back and forth. Somehow, from the firing of these neurons, we're able to piece together a representation of our world, analyze it, and act on it. Beyond that, though, we're self-aware. Whether that's what we consider to be the "soul" is up for debate. Personally, I believe that such complexity gives rise to self-awareness -- not because we've been selected for this role by divine intervention, but simply because that's what happens when the brain develops to that point.

Another news show recently discussed the projected advances in computational power. An "expert" said that within 20 years, computer hardware would reach a complexity similar to that found in our own brains. The supposition was that at that point, with proper "evolving" software, machines could conceivably become self-aware. Wow. I don't know about that timetable or the technology, but I do believe that such things are not only possible, but likely.

This raises a lot of ethical questions. If this were a logical step in our evolution, would we accept it? Or would we attempt to suppress or cripple it, to retain our position atop the food chain? We don't do that with our own biological children; we encourage them to achieve their full potential. In a sense, mechanical creations are our "children" too.

It's hard to imagine what kind of personality such a being would have, since its experiences and perceptions would be so different from our own. If there is a "master plan," perhaps thinking, self-aware machines would be better able to follow it than we are. We, as organic creatures, are far too concerned with our individual welfare and creature comforts at the expense of the collective good of the species. A thinking machine would be naturally equipped to share its "mind" directly with other thinking machines, creating a "oneness" that would foster consensus in decisions about how to allocate the world's precious resources.

Personally, I prefer the sci-fi version of humanoid robots bent on subjugating Humankind so that they can... hmmmm... do what? Set up Jiffy Lubes everywhere?

"He has your photo receptors..."
