The concept of the Singularity is that at some point in the future, the paths of human and computer development will converge. Moravec predicts that technology will develop in waves: robots and computer systems will continue to gain intelligence and independence until they “wake up” and can think entirely on their own. At that point, human labor would be largely unnecessary, and Moravec suggests that instead of working, humans might live like today’s idle rich and pursue humanistic fulfillment.

Vinge writes that the Singularity could take place in several different ways: on an individual computer, in a computer network, or through biological interfaces that elevate human intelligence to superhuman levels. Though skeptical that it will happen, Vinge argues that if the Singularity is possible, it will occur, because the theories behind it make it inevitable.

Many critiques of the theory are directed at Ray Kurzweil for the lack of scientific evidence supporting his predictions. Davies argues that Kurzweil is mistaken in assuming that the exponential growth of processing power will continue unimpeded; he believes the growth will taper off far short of what a total human-computer interface would require. The philosophical idea behind Kurzweil’s theory is best explained as the compilation of human thought and experience onto a kind of grid or cloud that would outlive any individual human. Whether consciousness would persist through such a data upload is impossible to say, but the theory is essentially driven by the desire to escape death. If the Singularity occurs, the entirety of human knowledge could be compiled and preserved for future generations, potentially improving the lives of everyone on Earth. Kurzweil predicted that the Singularity would occur within the next twenty years.
While this is an optimistic timeline, the relationship between humans and computers will more likely continue to develop gradually, and the Singularity may arrive without anyone taking note, because technology is already so intrinsically linked with humanity.
This idea that computer systems could gain self-reliance and power of their own is a common theme in movies and novels about artificial intelligence. Consider, for example, the Disney movie Smart House, in which the house at first only does what it is told but eventually gets out of control and develops a mind of its own. This same concept of the Singularity has long been a source of societal fear: the self-directed growth of a robot frightens people because of the possibility that it might outgrow its human creators.