Intelligence Arises out of a Need to Maximize Entropy

The researchers developed a software engine, called Entropica, and gave it models of a number of situations in which it could demonstrate behaviors that greatly resemble intelligence. They patterned many of these exercises after classic animal intelligence tests.


"It actually self-determines what its own objective is," said Wissner-Gross. "This [artificial intelligence] does not require the explicit specification of a goal, unlike essentially any other [artificial intelligence]." Entropica's intelligent behavior emerges from the "physical process of trying to capture as many future histories as possible," said Wissner-Gross. Future histories represent the complete set of possible future outcomes available to a system at any given moment.

Wissner-Gross calls the concept at the center of the research "causal entropic forces." These forces motivate intelligent behavior by encouraging a system to preserve as many future histories as possible. For example, in the cart-and-rod exercise, Entropica controls the cart to keep the rod upright. Allowing the rod to fall would drastically reduce the number of remaining future histories; in other words, it would lower the entropy of the cart-and-rod system. Keeping the rod upright maximizes the entropy: it maintains every future history that can begin from that state, including those in which the cart later lets the rod fall.

"The universe exists in the present state that it has right now. It can go off in lots of different directions. My proposal is that intelligence is a process that attempts to capture future histories," said Wissner-Gross.
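Entropica itself is not public, but the control principle described above -- pick the action that keeps the most future histories open -- can be sketched on a toy system. The sketch below is an assumption-laden stand-in, not the authors' method: a hypothetical 1-D walker with absorbing walls replaces the cart-and-rod, and a Monte Carlo count of surviving rollouts replaces a true path-entropy computation.

```python
import random

def count_surviving_futures(pos, action, samples=200, steps=8, lo=-10, hi=10):
    """Monte Carlo estimate of how many sampled future histories remain
    open after taking `action` from `pos`. A rollout that reaches an
    absorbing boundary is a future history that has been lost."""
    survivors = 0
    for _ in range(samples):
        p = pos + action
        alive = True
        for _ in range(steps):
            if p <= lo or p >= hi:          # absorbed: this future is gone
                alive = False
                break
            p += random.choice([-1, 0, 1])  # noisy, uncontrolled dynamics
        survivors += alive
    return survivors

def causal_entropic_action(pos, actions=(-1, 0, 1)):
    """Choose the action preserving the most sampled futures -- a crude
    proxy for maximizing entropy over future histories."""
    return max(actions, key=lambda a: count_surviving_futures(pos, a))
```

Started next to the right wall, the walker steps away from it -- not because "avoid the wall" was ever specified as a goal, but because futures that linger near the wall keep getting absorbed, which is the sense in which the objective is self-determined.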


The new research was inspired by cutting-edge developments in many other disciplines. Some cosmologists have suggested that certain fundamental constants in nature have the values they do because otherwise humans would not be able to observe the universe. Advanced computer software can now compete with the best human players at chess and at the strategy game Go. The researchers even drew on cognitive niche theory, which explains how intelligence can become an ecological niche and thereby influence natural selection. For a system to exhibit intelligent behavior under the proposal, it must be able to process information and predict future histories very quickly. Wissner-Gross suggested that the new findings fit well within an argument linking the origin of intelligence to natural selection and Darwinian evolution -- that nothing besides the laws of nature is needed to explain intelligence.


The more entropy, the more possibilities. Intelligence therefore seeks to preserve as many "future histories" as possible, keeping the space of open possibilities as large as it can.
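The underlying paper (Wissner-Gross and Freer, "Causal Entropic Forces," Physical Review Letters 110, 168702, 2013) makes this quantitative. Roughly, in the authors' notation, the system feels a force proportional to the gradient of the entropy of its possible future paths:

\[
\mathbf{F}(\mathbf{X}_0, \tau) = T_c \, \nabla_{\mathbf{X}_0} S_c(\mathbf{X}, \tau)
\]

where \(S_c(\mathbf{X}, \tau)\) is the entropy of the distribution over paths of duration \(\tau\) reachable from the present state \(\mathbf{X}_0\), and \(T_c\) is a "causal path temperature" setting the strength of the drive toward states with more accessible futures.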

Folksonomies: artificial intelligence intelligence entropy thermodynamics ai future histories


 Physicist Proposes New Way To Think About Intelligence
Electronic/World Wide Web>Internet Article:  Gorski, Chris (Apr 19 2013), Physicist Proposes New Way To Think About Intelligence, Inside Science, Retrieved on 2013-04-23
  • Source Material []
  • Folksonomies: artificial intelligence intelligence entropy thermodynamics ai


    23 APR 2013

     Relationship Between Intelligence and Entropy

    Intelligence as a Response to Entropy > Summary > Intelligence Arises out of a Need to Maximize Entropy
    From the original paper to a summary in a popular new article.



     Increase the Information Entropy in Your Life

    Increasing the options, the uncertainty, keeping things novel will preserve your brain's elasticity, make life go by more slowly, and increase the number of options for the future. tl;dr Version Intelligence makes the external world more syntropic by becoming more extropic. Information Entropy increases your potential future histories. Information Entropy keeps your brain elastic. Information Entropy makes time pass more slowly. Perceptually extending your lifespan. Some people have so ...