http://youtube.com/watch?v=l-jptjnFVYk This is a real hot-button topic that a few engineers and I were hypothesizing about at an engineering brain dump recently. Hoping to get some of the hardcore geeks to chime in.
Yeah man. I've thought this way for a while, but I just get labeled crazy or told that I'm anti-something-or-other. We can wake more people up and have far greater control over our futures as Americans if we help Ron Paul win in '08. Put down your Xbox and read.
I've actually been interested in the concept of a singularity for quite a while (picked up my Master's in CS specializing in AI at GaTech a couple years back). I don't know if you caught it, but on this week's Terminator, they sort of misused the definition of a singularity, which is somewhat vague itself. I don't remember the exact quote, but they basically said that a singularity is when robots become aware and start killing people. I suppose that could be an example of a singularity, or really just the beginning of one. To me, the important part about a singularity is the point at which it becomes nearly impossible to predict what comes next due to the rapid pace of technological change. I think it is possible, if humans don't manage to snuff themselves out before technology reaches that point. A big name and a big proponent in the singularity world is Ray Kurzweil. http://kurzweilAI.net