Since Claude Shannon gave birth to Information Theory, we have seen how a number of subjects have been influenced by his ideas. Information Theory is fundamentally based on the idea of entropy, a very well known, but not always well understood, quantity in Thermodynamics.
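As a quick illustration (a minimal sketch of my own, not taken from either article), Shannon's entropy measures the average information per symbol of a distribution, and is easy to compute for a short message:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of a message's character distribution."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A uniform distribution over 4 symbols has the maximum entropy: 2 bits.
print(shannon_entropy("abcd"))  # 2.0
# A single repeated symbol carries no information: 0 bits.
print(shannon_entropy("aaaa"))  # 0.0
```

The formal parallel with thermodynamic entropy (Boltzmann's and Gibbs's formulas) is exactly what lets these ideas cross between physics and information.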

I recently came across two interesting articles where Information Theory is applied: an experimental realization of Maxwell's Demon, and the relation between information and living systems.

In the first one, researchers have confirmed that information comes with a cost, and that the Second Law of Thermodynamics cannot be cheated.

As interesting as it is to confirm that information has an energetic cost, I find the second article more exciting. In it, Christoph Adami at Michigan State University proposes the idea that life is basically a matter of information. Adami's idea is that a living system is one that is not in thermodynamic equilibrium, but maintains itself in a state that differs from maximum entropy. This difference is the information that the living system contains, and somehow the living system preserves it while it is alive.
In his mathematical model, he found that in systems close to equilibrium there can be molecules that replicate themselves (living systems). Whenever the difference between the system and the environment is large, the probability of finding a living system is very small.

This is not the first attempt to understand biological problems mathematically. It reminded me of the application of Information Theory to Ecology by Margalef, or the attempts to understand systems far from equilibrium by Prigogine, for example.

Can we define life in such an abstract way? Is it so simple?

Photo from Unicam.
