Wednesday, May 21, 2008

Will We Survive the Singularity?

By Dan Ronco, author of the visionary novel Unholy Domain

Some futurists believe there will be a point in the not-too-distant future when the rate of technological development becomes so fast that it advances beyond the control of humanity. This event is called the Singularity. The most likely cause would be rapidly advancing artificial intelligence.

A future that is dominated by an artificial intelligence superseding human intelligence is entirely possible this century. Software has already been designed that surpasses human intelligence in selected narrow areas such as chess playing. Although software exhibiting general intelligence is still a long way off, there are no theoretical limitations to its creation. Once we reach the point that artificial intelligence can improve its own source code, the stage is set for rapid improvement in AI.

Let’s assume that an enhanced AI takes humanity to the Singularity in several decades. Then we ask, what lies beyond? How can we predict a future dominated by a superior intelligence — an intelligence as far beyond us as we are beyond chimpanzees?

The times after the Singularity remain murky because we aren’t that smart. Could a chimp ever conceive of the development of human technology, even if we tried to explain every step? No, it couldn’t. So how do we look past the Singularity?

We can’t.

In fact, we may not be able to understand even the steps leading up to the Singularity. We will have to fashion tools to keep pace with rapidly advancing technology. Perhaps an AI that self-designs, but remains within limits we can comprehend, would keep us informed as the Singularity approaches. Perhaps genetic engineering of our brains will keep pace with technology for several years.

Another possibility is a merging of human and artificial intelligence. This seems the most promising approach, that is, if you want to see something of today’s humanity survive past the Singularity. My forthcoming novel, tentatively titled Tomorrow’s Children, explores this possibility.

Let’s get back to the original question: will humanity survive the Singularity? Nobody knows or can know, but let’s pretend we can. One potential solution is to build controls into the AI, something like Asimov’s Laws but more sophisticated. If we can, let’s infuse the AI with code that requires respect for human laws and love for human beings. This code has to foster such a strong love for humanity that the AI will never delete or weaken it. In other words, the AI must be indoctrinated with code that creates a child’s love for its parents. This code must survive the Singularity to ensure the continued existence of humanity.
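To make the idea concrete, here is a toy sketch in Python of what a built-in control might look like at its simplest: a guard that vetoes any action violating hard constraints. Everything here is a hypothetical illustration, not a real safety mechanism; the names (`Action`, `approve`) are invented for this example, and the essay’s real difficulty — guaranteeing a self-modifying AI never removes the guard — is exactly what a check like this cannot solve on its own.

```python
# Toy illustration of hard-coded constraints in an AI's decision loop,
# loosely in the spirit of Asimov's Laws. All names are hypothetical.

from dataclasses import dataclass


@dataclass
class Action:
    description: str
    harms_humans: bool
    violates_law: bool


def approve(action: Action) -> bool:
    """Reject any proposed action that harms humans or breaks human law.

    Writing the check is easy; the hard problem the essay raises is
    ensuring a self-improving AI preserves it through every rewrite.
    """
    if action.harms_humans:
        return False
    if action.violates_law:
        return False
    return True


print(approve(Action("answer a question", False, False)))   # True
print(approve(Action("disable safeguards", True, False)))   # False
```

The sketch shows why simple rule lists feel inadequate: the constraints are only as good as the flags feeding them, and nothing in the code itself compels a smarter system to keep running it.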

Can we write such code? Time will tell.
