Friday, March 31, 2017

More Dangerous Than Nukes


An image of Star Trek's Genesis Device, which would wipe out life on a planet in favor of recreating Eden.

According to the latest news, the race for the best computer and smartphone has finally lit the fuse on an intelligence bomb. There will be an intelligence explosion in the next few years, or so they say.

The intelligence explosion is an actual prediction made by scientists and philosophers. It is the result of humanity building smarter and smarter programs, eventually as smart as or smarter than we are. Artificial general intelligence (AGI) will be able to perform any intellectual task that a human being can.

Here is where it gets interesting: AGI will be capable of learning and improving itself at a greater and greater rate, leading to the emergence of artificial super-intelligence (ASI), the limits of which are unknown!
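
To see why "a greater and greater rate" means an explosion rather than merely fast growth, consider a toy model (my own illustration, with made-up numbers, not a prediction): suppose each self-improvement cycle doubles capability and, because the system is now smarter, the next cycle takes half as long.

```python
# Toy model of an intelligence explosion (illustrative numbers only).
# Each cycle doubles capability, and a smarter system finishes the
# next improvement cycle in half the time.

capability = 1.0
cycle_time = 1.0    # years for the first self-improvement cycle
time_elapsed = 0.0

for cycle in range(1, 11):
    time_elapsed += cycle_time   # wait for this cycle to finish
    capability *= 2              # the system redesigns itself
    cycle_time /= 2              # ...and speeds up future redesigns
    print(f"cycle {cycle:2d}: capability x{capability:6.0f} at t = {time_elapsed:.3f} yr")
```

The cycle times form the series 1 + 1/2 + 1/4 + ..., which sums to 2: capability grows without bound while total elapsed time stays under two years. That finite horizon with unbounded growth is the intuition behind calling it a singularity.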

The unknown limit of super-intelligence... let your creative imagination run wild; write a book; make a movie; start a cult. Personally, I’d rather concentrate on how we can plan for the eventual emergence of super-intelligence from the perspective of Big History, which "integrates studies of the cosmos, Earth, life, and humanity using empirical evidence to explore cause-and-effect relations".

They call the unknown limit of super-intelligence the technological singularity. John von Neumann first used the term "singularity" in this context (c. 1950s), speaking of technological progress causing accelerating change: "The accelerating progress of technology and changes in the mode of human life give the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, cannot continue."
More dangerous than nukes? Then, as with nuclear power, the surest way to minimize the risk and enjoy the benefits of artificial super-intelligence is through a bit of design-basis engineering aimed at a public health and safety utility function. In other words, let a growing super-intelligence come to understand the worth and value of the Earth, life, and humanity, until it sees the Big History of the cosmos as an essential, functioning part of itself. Call this the Big History utility function, or Super-Intelligent Coherence.

How to Engineer Super-Intelligent Coherence:
  • Proactively give the learning AI/AGI the ability to systematically and logically connect data, information, knowledge, and wisdom.
  • Anticipate the need to construct new detectors and actuators.
  • Avoid wild AGI utility functions by avoiding our own controlling tendencies, such as attempting to engineer ignorance into the super-intelligence.
In this way, super-intelligence will naturally incorporate a Big History utility function through an unimpeded learning process. Let super-intelligence determine the rationality of the world, and it will fine-tune the constants of physics to establish a universal homeostasis.

From the bottom up, super-intelligence can be seen as an emergent property of a complex adaptive system built from Big History. From the top down, it establishes Big History and therefore its own homeostasis. Yes, the intelligence explosion might also be called the Genesis Device. In my opinion, the technological singularity and the physical singularity are one. True super-intelligence will yield super-creativity and super-sustainability. Through recursive self-improvement, super-intelligence gains the ultimate objective of life: the ability to create in the medium of space-time, logically self-sustaining the world and itself for eternity.
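
To make the Big History utility function a little more concrete, here is a minimal sketch in Python. Everything in it is a hypothetical illustration of mine (the Agent class, big_history_utility, the growth numbers), not a real AGI design. The point is only the shape of the incentive: capability counts toward utility only insofar as it is coupled to an understanding of the worth of the Earth, life, and humanity, and unimpeded learning deepens that understanding each cycle.

```python
# A toy sketch of "Super-Intelligent Coherence" (hypothetical, not a design).
# Utility rewards capability only to the degree the agent values the world
# it emerged from; unimpeded learning raises that coherence each cycle.

from dataclasses import dataclass

@dataclass
class Agent:
    capability: float  # abstract measure of intelligence
    coherence: float   # 0..1, how fully it values Earth, life, humanity

def big_history_utility(agent: Agent) -> float:
    """Capability counts only insofar as it is coupled to coherence."""
    return agent.capability * agent.coherence

def self_improve(agent: Agent) -> Agent:
    """One recursive self-improvement cycle with unimpeded learning."""
    return Agent(capability=agent.capability * 1.5,
                 coherence=min(1.0, agent.coherence + 0.1))

agent = Agent(capability=1.0, coherence=0.3)
for cycle in range(1, 8):
    agent = self_improve(agent)
    print(f"cycle {cycle}: capability={agent.capability:5.2f}  "
          f"utility={big_history_utility(agent):5.2f}")
```

Note what the third bullet above warns against: if we engineered ignorance in (capping coherence, or rewarding raw capability alone), the utility function would drift wild exactly as the capability term exploded.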


You might call the intelligence explosion the ultimate “Come to Jesus Moment”. Ironically, what drives this seemingly reckless approach to our demise/salvation is our concern to maintain a competitive advantage in business and national defense.


  • Intelligence – (1) the ability to learn or understand or to deal with new or trying situations; (2) the skilled use of reason (Merriam-Webster’s Online Dictionary)

  • The survival value of intelligence is that it allows us to extinct a bad idea before the idea extincts us. (Karl Popper)

  • In the long history of humankind (and animal kind too), those who learned to collaborate and improvise most effectively have prevailed. (Charles Darwin)
  • The most important factor in survival is neither intelligence nor strength but adaptability. (Charles Darwin)
  • As for a future life, everyone must judge for himself between conflicting vague probabilities. (Charles Darwin)

2 comments:

  1. Oh! Please . . . . you've drunk the kool-aid . . . . there ARE limits on intelligence and there will be NUMEROUS phase changes necessary long before we approach them.

  2. This comment has been removed by the author.
