Quanta Magazine recently ran an article called Why Everything in the Universe Turns More Complex, discussing a 2023 paper, On the Roles of Function and Selection in Evolving Systems, which explores a hypothesis addressing a core tension in physics: how do complex systems arise under the dominion of the second law of thermodynamics? Essentially, the hypothesis states that everything in the universe evolves, a kind of theory of evolution akin to the biological one but applicable to information and, by extension, to the building blocks of everything. It proposes a kind of functional selection based on information density that would lead to more ordered states without violating the principle of increasing entropy.
Life is said to comply with the second law of thermodynamics because it is energy-inefficient: entropy increases overall even though the entropy of life's own components is locally lower. In fact, it is sometimes even said that life evolved as a kind of natural drive to create entropy engines that accelerate the overall increase in entropy. More commonly, the local low-entropy state that is life is considered the result of randomness in an infinite, quantum universe: entropy increases overall, but randomness can produce localized decreases in entropy, including the development of complex life. This is generally considered a rather unsatisfying explanation, though. Few scientists, especially in the physics community, like to invoke statistical randomness, rather than a deterministic mechanism, to explain something as central to our conception of the universe as the very existence of life.
Functional information is a term referring to a kind of nonbiological evolutionary process or, perhaps more helpfully, to a nonbiological version of “survival of the fittest.” Consider that numerous possible minerals can be formed from a given combination of elements in the Earth’s crust, but one is more easily created because of the ambient conditions, even though it is not the lowest-energy solution. The resulting dominant mineral will therefore be at a higher level of complexity, holding a higher level of functional information. The paper Open-Ended versus Bounded Evolution: Mineral Evolution as a Case Study claims to show that minerals on Earth have “evolved” toward a state of higher complexity and specialization in accordance with this notion, and the extrapolation is that the idea is applicable to the differentiation of matter from the primordial soup, the collapse of gas and dust into stars, and the creation of heavy elements through stellar fusion and supernovae.
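For concreteness, functional information does have a quantitative definition, originally due to Hazen and colleagues and carried over into the 2023 paper. The sketch below states it as I understand their notation, where E_x is a chosen degree of function and F(E_x) is the fraction of all possible configurations of the system that achieve at least that degree of function:

\[ I(E_x) = -\log_2 \big[ F(E_x) \big] \]

The rarer the configurations that perform the function, the smaller F(E_x) and the higher the functional information; in the mineral example, a highly specialized mineral corresponds to a small fraction of the possible arrangements of those elements and therefore to a high I(E_x).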
It’s an interesting contextualization and way of thinking about the development of the material universe, but it suffers from a lack of provability – functional information is entirely contextual – and, perhaps more significantly, the theory is not mechanistic. In other words, it does not explain the reason for its own existence, but leaves its core claim axiomatic. This rather undermines its potential utility as a physical theory, at least as long as it remains disconnected from the broader field of information theory. While the Quanta article does a fine job explaining the hypothesis, the concept of functional information, and its utility for thinking about the increasing complexity of systems, it spends its time discussing the relationship with entropy rather than exploring the connection to information theory, which seems like a necessary dimension of integration.
Most thought-provoking to me is the relationship between complexity and entropy that the hypothesis illuminates. Traditionally, complexity is depicted as the inverse of entropy. If entropy is a measure of how differentiated a system is, with high entropy representing a high degree of uniformity, or lack of differentiation, then high complexity would correspond to low entropy, because high complexity corresponds to highly differentiated states. It is this way of thinking about entropy that leads to the seeming contradiction with the evolution of more complex forms and to all the caveats about increasing overall entropy while decreasing local entropy. In the functional information hypothesis, though, high complexity is not the opposite of high entropy but rather a different way of looking at the probability or phase space of a system to describe its entropic states. Because functional information must increase just as entropy must increase, the apparent complexity of the system does not reflect its entropic complexity. A highly complex system might, in fact, be the entropic base state of the system because it is the inevitable outcome for that set of conditions.
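One way to see why the two need not be opposites is to put the state-counting expressions side by side. This is only a sketch, assuming Boltzmann’s form of entropy and the Hazen-style definition of functional information given above, with \Omega the number of accessible microstates and \Omega_f(E_x) the number of those that achieve function at level E_x or better:

\[ S = k_B \ln \Omega, \qquad I(E_x) = -\log_2 \frac{\Omega_f(E_x)}{\Omega} \]

Both quantities are logarithmic counts over the same configuration space: entropy counts every accessible microstate, while functional information counts the fraction that does something selected for. On this reading, a system can occupy a high-entropy macrostate while the configurations being selected among still carry high functional information, which is the sense in which a highly complex arrangement can also be the entropic base state for a given set of conditions.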
Ultimately, this is an exercise in exploring different ways of thinking about notions of information, complexity, and entropy. Personally, I doubt that this hypothesis will be found to have a grounding in reality; at the very least, it seems unlikely to have a mechanistic, physical underpinning in the way that genetic changes form a physical underpinning for the qualitative theory of biological evolution. Its value instead lies in prompting us to question some of our assumptions about how we think about supposedly familiar laws of the universe.
