People often think "Terminator," but that is only one potential outcome.
Here are some others:
1. Nanobots slipped into the food supply by a political faction, altering people's behavior in specific ways that are partly controllable via iPhone, computer, or Wi-Fi technology. People would no longer have full control of their faculties once nanobots were added to something like milk products, or flour products such as bread or pizza. Logically, though, the carrier would likely be something that isn't cooked, or that doesn't need to be recooked before eating.
2. "Terminator" the movie.
3. Alliances between some of the richest 10% on Earth and robotics companies, winding up a lot like the scenario described below.

Begin partial quote from:
https://en.wikipedia.org/wiki/Technological_singularity
The technological singularity (also, simply, the singularity)[1] is the hypothesis that the invention of artificial superintelligence (ASI) will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.[2] According to this hypothesis, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a "runaway reaction" of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence. Stanislaw Ulam reports a discussion with John von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue".[3] Subsequent authors have echoed this viewpoint.[2][4] I. J. Good's "intelligence explosion" model predicts that a future superintelligence will trigger a singularity.[5] Emeritus professor of computer science at San Diego State University and science fiction author Vernor Vinge said in his 1993 essay The Coming Technological Singularity that this would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.[5]
Four polls, conducted in 2012 and 2013, suggested that the median estimate was a 50% chance that artificial general intelligence (AGI) would be developed by 2040–2050.[6][7]
In the 2010s, public figures such as Stephen Hawking and Elon Musk expressed concern that full artificial intelligence could result in human extinction.[8][9] The consequences of the singularity and its potential benefit or harm to the human race have been hotly debated.
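End of partial quote.

The "runaway reaction" of self-improvement cycles described in the quote can be illustrated with a toy model. This is purely a sketch under made-up assumptions, not anything from the quoted article: capability and improvement rate start at arbitrary values, and the only idea it demonstrates is that when the improvement rate itself grows with each generation, growth becomes faster than exponential.

```python
# Toy illustration of the "intelligence explosion" idea: each generation
# designs its successor, and the per-generation improvement factor itself
# grows over time. All numbers here are hypothetical, chosen for illustration.

def generations(capability=1.0, improvement=0.1, steps=10):
    """Return the capability level after each self-improvement cycle."""
    history = [capability]
    for _ in range(steps):
        # assume the improvement rate compounds each cycle (hypothetical)
        improvement *= 1.5
        capability *= (1 + improvement)
        history.append(capability)
    return history

hist = generations()
# The ratio between successive generations keeps increasing, i.e. the
# growth is super-exponential rather than merely exponential.
ratios = [b / a for a, b in zip(hist, hist[1:])]
print(ratios)
```

Under these (invented) parameters the growth ratio rises every cycle, which is the qualitative shape of the scenario the quoted hypothesis warns about; whether real systems would behave this way is exactly what is debated.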