Sunday, July 7, 2024

I. J. Good speculated in 1965 that superhuman intelligence might bring about an intelligence explosion.

I would say that we are presently experiencing the very beginnings of the "intelligence explosion" that will more and more reverberate around the world and become something none of us can fully predict even now.

What follows is a partial quote from: https://en.wikipedia.org/wiki/Technological_singularity#Intelligence_explosion

Intelligence explosion

Although technological progress has been accelerating in most areas, it has been limited by the basic intelligence of the human brain, which has not, according to Paul R. Ehrlich, changed significantly for millennia.[14] However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is significantly more intelligent than humans.[15]

If a superhuman intelligence were to be invented—either through the amplification of human intelligence or through artificial intelligence—it would, in theory, vastly improve over human problem-solving and inventive skills. Such an AI is referred to as Seed AI[16][17] because if an AI were created with engineering capabilities that matched or surpassed those of its human creators, it would have the potential to autonomously improve its own software and hardware to design an even more capable machine, which could repeat the process in turn. This recursive self-improvement could accelerate, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in. It is speculated that over many iterations, such an AI would far surpass human cognitive abilities.
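To make the compounding concrete, here is a toy Python sketch (my own illustration, not part of the quoted article). It assumes, purely for demonstration, that each generation's ability to improve its successor scales with its current capability; under that assumption, growth outpaces any fixed exponential:

```python
# Toy model of recursive self-improvement (illustrative assumption, not
# from the article): a designer's improvement step scales with its own
# capability, so gains compound faster than a fixed exponential.

capability = 1.0   # baseline, in arbitrary "human-level" units
rate = 0.1         # assumed improvement per unit of capability

for generation in range(1, 11):
    # A more capable designer makes a proportionally larger improvement.
    capability *= 1 + rate * capability
    print(f"generation {generation}: capability {capability:.2f}")

# Compare with a fixed exponential at the same starting rate:
# 1.1**10 is about 2.59, while this self-improving loop reaches
# about 6.1 by generation 10.
```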

I. J. Good speculated in 1965 that superhuman intelligence might bring about an intelligence explosion:[18][19]

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion', and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.

One version of intelligence explosion is where computing power approaches infinity in a finite amount of time. In this version, once AIs are performing the research to improve themselves, speed doubles, e.g., after 2 years, then 1 year, then 6 months, then 3 months, then 1.5 months, etc., so that the infinite sum of the doubling periods is 4 years. Unless prevented by physical limits of computation and time quantization, this process would literally achieve infinite computing power in 4 years, properly earning the name "singularity" for the final state. This form of intelligence explosion is described in Yudkowsky (1996).[20]
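The arithmetic behind the four-year figure is a geometric series: doubling periods of 2 + 1 + 1/2 + 1/4 + ... years sum to 4. A minimal Python sketch (my own, taking the 2-year starting period from the paragraph above) shows the elapsed time converging to 4 years while the speed diverges:

```python
# Finite-time singularity sketch: each doubling of research speed takes
# half as long as the previous one, starting from a 2-year period.

period = 2.0    # years until the first doubling
elapsed = 0.0   # total years elapsed
speed = 1.0     # relative research speed

for step in range(1, 31):
    elapsed += period
    speed *= 2
    period /= 2  # the next doubling arrives twice as fast

print(f"after {step} doublings: {elapsed:.9f} years, speed x{speed:.3g}")
# elapsed approaches 4 years (the geometric series 2 + 1 + 0.5 + ... = 4),
# while speed grows without bound: the "singularity" at t = 4.
```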

