Relationship Between Intelligence and Entropy

From the original paper to a summary in a popular news article.

Folksonomies: artificial intelligence entropy thermodynamics summary

Intelligence as a Response to Entropy

Recent advances in fields ranging from cosmology to computer science have hinted at a possible deep connection between intelligence and entropy maximization. In cosmology, the causal entropic principle for anthropic selection has used the maximization of entropy production in causally connected space-time regions as a thermodynamic proxy for intelligent observer concentrations in the prediction of cosmological parameters [1]. In geoscience, entropy production maximization has been proposed as a unifying principle for nonequilibrium processes underlying planetary development and the emergence of life [2–4]. In computer science, maximum entropy methods have been used for inference in situations with dynamically revealed information [5], and strategy algorithms have even started to beat human opponents for the first time at historically challenging high look-ahead depth and branching factor games like Go by maximizing accessible future game states [6]. However, despite these insights, no formal physical relationship between intelligence and entropy maximization has yet been established. In this Letter, we explicitly propose a first step toward such a relationship in the form of a causal generalization of entropic forces that we show can spontaneously induce remarkably sophisticated behaviors associated with the human ‘‘cognitive niche,’’ including tool use and social cooperation, in simple physical systems. Our results suggest a potentially general thermodynamic model of adaptive behavior as a nonequilibrium process in open systems.
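The "causal generalization of entropic forces" mentioned above can be sketched in symbols. The following is a hedged reconstruction of the Letter's central definitions (the notation here, including the path-entropy symbol S_c and the "causal path temperature" T_c, follows my reading of the paper and should be checked against the original):

```latex
% Causal path entropy of a macrostate X over a time horizon \tau:
% the Shannon entropy of the distribution over future paths x(t)
% reachable from the present microstate x(0).
S_c(\mathbf{X}, \tau) = -k_B \int_{x(t)} \Pr\!\big(x(t) \mid x(0)\big)\,
    \ln \Pr\!\big(x(t) \mid x(0)\big)\, \mathcal{D}x(t)

% The causal entropic force drives the system toward macrostates
% from which more future histories remain accessible; T_c sets its strength.
\mathbf{F}(\mathbf{X}_0, \tau) =
    T_c\, \nabla_{\mathbf{X}}\, S_c(\mathbf{X}, \tau)\big|_{\mathbf{X}_0}
```

Intuitively, the force points "uphill" in path entropy: toward present states that keep the largest diversity of future trajectories open.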


Researchers create an AI that seeks to maximize its potential future histories, and it therefore tackles goals like playing the stock market and balancing a ball on a stick: letting the money run out or the ball drop would reduce its potential future histories. Intelligence seeks to maximize entropy.

Folksonomies: artificial intelligence computer science intelligence entropy thermodynamics


Intelligence Arises out of a Need to Maximize Entropy

The researchers developed a software engine, called Entropica, and gave it models of a number of situations in which it could demonstrate behaviors that greatly resemble intelligence. They patterned many of these exercises after classic animal intelligence tests.


"It actually self-determines what its own objective is," said Wissner-Gross. "This [artificial intelligence] does not require the explicit specification of a goal, unlike essentially any other [artificial intelligence]."

Entropica's intelligent behavior emerges from the "physical process of trying to capture as many future histories as possible," said Wissner-Gross. Future histories represent the complete set of possible future outcomes available to a system at any given moment.

Wissner-Gross calls the concept at the center of the research "causal entropic forces." These forces are the motivation for intelligent behavior. They encourage a system to preserve as many future histories as possible. For example, in the cart-and-rod exercise, Entropica controls the cart to keep the rod upright. Allowing the rod to fall would drastically reduce the number of remaining future histories, or, in other words, lower the entropy of the cart-and-rod system. Keeping the rod upright maximizes the entropy. It maintains all future histories that can begin from that state, including those that require the cart to let the rod fall.

"The universe exists in the present state that it has right now. It can go off in lots of different directions. My proposal is that intelligence is a process that attempts to capture future histories," said Wissner-Gross.
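The "preserve as many future histories as possible" idea can be illustrated with a toy simulation. This is not Entropica or the paper's algorithm, just a minimal Monte Carlo sketch of the same intuition: a particle on a line between two absorbing walls picks, at each decision point, the action whose sampled future rollouts end in the most diverse (highest-entropy) set of surviving states. All names and parameters here (`rollout_entropy`, the walls at 0 and 10, the noise level) are illustrative choices, not anything from the source.

```python
import math
import random

def rollout_entropy(x, action, n_samples=500, horizon=20,
                    noise=1.0, lo=0.0, hi=10.0, bins=10):
    """Estimate the entropy of end states reachable after taking
    `action` from position x. Rollouts that hit a wall are 'dead'
    (that future history is lost), so states near walls score low."""
    counts = [0] * bins
    alive = 0
    for _ in range(n_samples):
        pos = x + action
        dead = False
        for _ in range(horizon):
            pos += random.uniform(-noise, noise)
            if pos <= lo or pos >= hi:
                dead = True
                break
        if not dead:
            alive += 1
            b = min(bins - 1, int((pos - lo) / (hi - lo) * bins))
            counts[b] += 1
    if alive == 0:
        return 0.0
    h = 0.0
    for c in counts:
        if c:
            p = c / alive
            h -= p * math.log(p)
    # Weight by survival probability: losing histories lowers the score.
    return (alive / n_samples) * h

def entropic_action(x, actions=(-1.0, 0.0, 1.0)):
    """Pick the action that keeps the most future histories open."""
    return max(actions, key=lambda a: rollout_entropy(x, a))

random.seed(0)
# Near the left wall the agent moves right, toward the middle,
# where more future histories stay available; near the right wall
# it moves left. No goal was ever specified.
print(entropic_action(1.5), entropic_action(8.5))
```

The same mechanism explains the cart-and-rod behavior described above: letting the rod fall collapses the set of reachable futures, so the entropy-seeking controller keeps it upright without ever being told to.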


The new research was inspired by cutting-edge developments in many other disciplines. Some cosmologists have suggested that certain fundamental constants in nature have the values they do because otherwise humans would not be able to observe the universe. Advanced computer software can now compete with the best human players in chess and the strategy-based game called Go. The researchers even drew from what is known as cognitive niche theory, which explains how intelligence can become an ecological niche and thereby influence natural selection. The proposal requires that a system be able to process information and predict future histories very quickly in order for it to exhibit intelligent behavior. Wissner-Gross suggested that the new findings fit well within an argument linking the origin of intelligence to natural selection and Darwinian evolution -- that nothing besides the laws of nature is needed to explain intelligence.


The more entropy, the more possibilities. Intelligence therefore seeks to maximize "future histories" in order to keep as many possibilities open as possible.

Folksonomies: artificial intelligence intelligence entropy thermodynamics ai future histories