The Singularity is a favorite topic among geeks… and I find it unsettling.
The Singularity in the film’s title refers to a point in time when the planet’s non-biological intelligence will be one billion times more powerful than the sum of all human intelligence existing today. At that point, the non-biological intelligence will have begun to analyze and improve itself in increasingly rapid redesign cycles. Technical progress will be so fast that un-enhanced humans will be unable to follow it. Kurzweil posits that this will occur in 2045 and concedes the extreme difficulty of making predictions past that point.
Kurzweil makes some genuinely compelling predictions in a thesis broadly called the Singularity. At its core is the suggestion that machines will be developed with a capacity for creativity and invention that mimics that of humans, and that as they take on this new dimension they will achieve something dramatic: the ability to self-iterate and evolve at a rate far greater than if that development remained solely in human hands.
I have to admit that I find myself thinking, “who am I to question RAY KURZWEIL? The guy is a legitimate legend.” But it’s not the predictions he makes that bother me; it’s the glee that his cult-like followers bubble with. I get that the realization of the Singularity could bring some wonderful advances that diminish the frailties of being human, and I really would look forward to a world without serious physical disability and powered entirely by clean renewable energy. If we could accomplish just those two things, we would have done something that pays dividends for generations to come.
However, there is something rather creepy about the Singularity movement that goes beyond curing diseases and creating energy: it often comes across as a way to better organize society and substitute for real person-to-person contact (think Second Life at a whole new level). I find this troubling, because the thing that makes us human is our imperfection, both in our form and in our organization, and the emotional stimulus that comes from connecting and interacting with our fellow humans cannot be replaced or replicated by machines.
There are also legitimate concerns about machines becoming autonomous, all Terminator jokes aside. The proliferation of nanotechnology could also have serious and irreversible consequences, in effect becoming an infection that mimics the behavior of a virus, mutating to avoid detection and eradication.
Kurzweil, for all his genius, could also simply be wrong. While the march of technology continues unabated, it may not be accelerating after all but slowing down. Just as the early phase of the industrial revolution saw explosive innovation, it later became held down by inertia that dramatically slowed innovation and produced counterproductive behaviors that still plague us (such as labor unions that are inflexible to technological change).
Complexity theory rears its ugly head here as well: technological innovation yields self-limiting returns as a result of the complexity it imposes on the system as a whole. We see this manifest today in something as trivial as a mistyped entry in a routing table bringing down an entire network, or an errant line of code spawning a billion processes that crash a complex server environment. In other words, merely having more technology does not mean we are closer to Kurzweil’s vision; it just means we have more technology with more points of failure.
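The “more points of failure” intuition can be made concrete with a toy reliability calculation (my own back-of-the-envelope sketch, not anything from Kurzweil): if a system depends on n independent components and each one works with probability p, the whole thing works with probability p^n, which collapses surprisingly fast as n grows.

```python
# Toy model: a system of n independent components, each with reliability p.
# The system works only if every component works, so P(system) = p ** n.
def system_reliability(p: float, n: int) -> float:
    return p ** n

# Even extremely reliable parts (99.9% each) compound badly at scale:
for n in (10, 100, 1000):
    print(n, round(system_reliability(0.999, n), 3))
# 10 components stay near 0.99, but 1000 components drop to roughly 0.37.
```

The point of the sketch is just that piling on components without simplifying the whole makes failure the expected case, not the exception.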
I can’t help but think of the cryonics proponents from the 1980s… you remember them: they wanted to cut off your head when you die and freeze it, so that when science achieves the ability to create life (already has, btw) they can put your head on a new body and you’re good to go. After I got over the irrepressible giggling at the notion of freezing my head for future generations, I wondered why I would want to put my head on a new body in pursuit of immortality when all the people I care about and the life I know are rooted in the here and now. The Singularity is the same way for me: I’m not sure I want a world where technology is autonomous and I’m just another cog in a machine, literally.