The technological singularity is a hypothetical event in which artificial general intelligence (embodied, for example, in intelligent computers, computer networks, or robots) would become capable of recursive self-improvement (progressively redesigning itself), or of autonomously building machines ever smarter and more powerful than itself, up to the point of a runaway effect, an intelligence explosion, that yields an intelligence surpassing all current human control or understanding. Because the capabilities of such a superintelligence may be impossible for humans to comprehend, the technological singularity is the point beyond which events may become unpredictable or even unfathomable to human intelligence.

The first use of the term "singularity" in this context was by Stanislaw Ulam in his 1958 obituary for John von Neumann, in which he recounted a conversation with von Neumann about the "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". The term was popularized by the mathematician, computer scientist, and science fiction author Vernor Vinge, who argues that artificial intelligence, human biological enhancement, or brain–computer interfaces could be possible causes of the singularity.

Successive iterations of recursive self-improvement could accelerate, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in. Many of the most recognized writers on the singularity, such as Vernor Vinge and Ray Kurzweil, define the concept in terms of the technological creation of superintelligence. Many writers also tie the singularity to observations of exponential growth in various technologies (with Moore's Law being the most prominent example), using such observations as a basis for predicting that the singularity is likely to happen sometime within the 21st century.
They argue that it is difficult or impossible for present-day humans to predict what human beings' lives will be like in a post-singularity world.
The term "technological singularity" was originally coined by Vinge, who made an analogy between the breakdown in our ability to predict what would happen after the development of superintelligence and the breakdown of the predictive ability of modern physics at the space-time singularity beyond the event horizon of a black hole.
In 2012, Stuart Armstrong and Kaj Sotala published a study of artificial general intelligence (AGI) predictions by both experts and non-experts and found a wide range of predicted dates, with a median value of 2040.
Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which, according to Paul R. Ehrlich, has not changed significantly for millennia. If a superhuman intelligence were to be invented, either through the amplification of human intelligence or through artificial intelligence, it might be able to bring to bear greater problem-solving and inventive skills than current humans are capable of.
It might then design an even more capable machine, or re-write its own software to become even more intelligent.
This more capable machine could then go on to design a machine of yet greater capability.
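The feedback loop described above can be sketched as a toy numerical model. This is purely illustrative, not a claim from the literature: it assumes "capability" is a single number, and the parameters (initial level, improvement factor, acceleration) are arbitrary placeholders.

```python
def intelligence_explosion(initial=1.0, gain=1.1, acceleration=1.01, generations=10):
    """Toy model of recursive self-improvement: each machine designs a
    successor, and the improvement factor itself grows as designers become
    more capable (all numbers are illustrative, not empirical)."""
    level, history = initial, [initial]
    for _ in range(generations):
        level *= gain          # the current machine builds a more capable one
        gain *= acceleration   # a smarter designer improves faster
        history.append(level)
    return history
```

With `acceleration` above 1 the growth is super-exponential, a "runaway" in the sense used here, while setting it to exactly 1 reduces the model to plain exponential growth; any eventual limits from physics or theoretical computation are deliberately left out of the sketch.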