1. Biology, Physics, and All That
In 1944 Erwin Schrödinger published an enlightening monograph titled “What is life?” where he wrote [1]: The large and important and very much discussed question is: How can the events in space and time which take place within the spatial boundary of a living organism be accounted for by physics and chemistry? We accept that living beings do not violate the fundamental laws of physics. However, living beings are strange. As put by the chemist Addy Pross [2]: living beings seem to circumvent or mock such laws as we understand them. A stone falls when dropped, pulled by gravity; birds, however, fly whenever they want to. Biologists have invented a term for this (teleonomy) to express the fundamental fact that living beings have their own agenda. By this they mean that living beings move, jump, play, eat, reproduce, plan, do business, do research, and so on. Biology spans all possible length and time scales, from molecules to cells, tissues, organs, communities, and ecosystems, and from the most ancient times (as attested by the fossil record) to the present. In this respect biology is not so different from physics, whose laws also pervade all length and time scales. Universality, the search for a few basic principles unifying the most diverse phenomena, has been the main tenet of the experimental sciences that have made human progress possible. The most diverse physico-chemical phenomena are often explained by invoking a few universal physical laws (e.g., conservation laws, the second law of thermodynamics, etc.). In a similar way, the most diverse biological processes are recurrently explained by similar, if not fundamentally identical, mechanisms. There is one difference though. The marvelous complexity of living matter is far from being just a simple or casual interplay of the laws of physics and chemistry alone. Evolution is the word that must always be called upon to justify the striking complexity of living beings. The chicken-and-egg dilemma plagues all niches of biology as much as teleonomy does. In contrast, neither chicken-and-egg dilemmas nor teleonomy are found in physics, where observations can be explained and predicted from a few fundamental laws.
Biology is recognized as the natural science devoted to the study of all aspects of living organisms. It is no exaggeration to say that, by tradition, physics has eluded the study of living matter. Except for its apparent complexity, living matter has been deemed no different from ordinary matter. The appreciation of the importance of understanding living matter is often attributed to the chemists of the 19th century, biochemistry being the newborn discipline of that time. However, concepts such as space, time, force, and energy are not only fundamental quantities in physics; they are also central to biology. Biophysics has emerged as the discipline that applies concepts and techniques from physics to the study of living beings. In this way, physical techniques such as X-ray diffraction, nuclear magnetic resonance, and electron microscopy, among others, have contributed greatly to revolutionizing biology. In recent decades we have been witnessing the opposite trend: physicists are now using biological systems as inspiring physical models to test and scrutinize new physical theories [3]. An example of this new trend is single-molecule experiments [4], where individual biological molecules are manipulated one at a time while the exerted force is measured (Box 1).
Box 1. Single-molecule experiments.
“Take a single DNA molecule and pull on its extremities while recording the force-extension curve until it gets fully straightened.” This thought experiment, which was just a dream a few decades ago, has now become standard in many research institutes worldwide. By labeling the ends of a DNA molecule with specific chemical groups (biotin, avidin, digoxigenin), it is possible to tether a single DNA between two surfaces. By moving one surface relative to the other and using one of them as a force sensor, it is possible to measure the force-extension curve of single biopolymers, from DNA to RNA and proteins. Optical tweezers are based on the principle of conservation of linear momentum, by which a microscopic, nearly transparent object (e.g., a polystyrene or silica bead) with an index of refraction higher than that of its surrounding medium deflects an incoming light ray, the light exerting a net force on the object. Invented by Arthur Ashkin at Bell Labs in 1970, optical tweezers have revolutionized research in physics, chemistry, and biology. Ashkin’s discovery was awarded the 2018 Nobel Prize in Physics. An optical trap for manipulating single molecules is produced by focusing an infrared beam inside a fluidics chamber, optically trapping a micrometer-sized bead, and measuring either the deflected light with position-sensitive detectors, the bead position with a CCD camera, or via back-focal-plane interferometry. Pulling experiments use dumbbells made of a molecule tethered between two beads (Figure 1, left). In single-trap setups one bead is immobilized on a pipette by air suction while the other is captured in an optical trap, which measures the force exerted on the molecule. By moving the optical trap relative to the pipette we can record the so-called force-distance curve. Figure 1 (right) shows the force-extension curve obtained by stretching a 24 kb DNA molecule. These experiments are used not only to measure the elastic properties of individual biopolymers (e.g., the persistence length or bending stiffness) but also to unravel the most complex molecular reactions, from protein folding to enzymatic reactions. They are also used to study the viscoelastic properties of single cells and even to challenge fundamental physical theories related to energy and information.
Figure 1.
(Left) Experimental setup in a DNA pulling experiment. (Right) Force-extension curve measured by pulling a single 24 kb DNA at standard conditions (T = 298 K and 1 M NaCl Tris buffer). At least three force regimes can be identified. At forces below 10 pN the DNA random coil is extended against thermal fluctuations (entropic regime). At forces between 10 pN and 60 pN the DNA is stretched above its contour length (8.3 µm) (enthalpic regime). Above 65 pN DNA is overextended by ≈ 70% of its contour length (overstretching regime).
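To make the entropic regime concrete, here is a minimal Python sketch (not part of the original work) of the worm-like chain interpolation formula of Marko and Siggia, the model routinely used to fit such force-extension curves; the persistence length is a typical textbook value for double-stranded DNA, and the contour length is taken from Figure 1.

```python
# Worm-like chain (WLC) interpolation formula (Marko & Siggia) for the
# entropic elasticity of double-stranded DNA. Parameter values are
# illustrative, in the typical range for dsDNA.
import math

kBT = 4.11e-21   # thermal energy at 298 K, J
P = 50e-9        # persistence length, m (~50 nm for dsDNA)
L = 8.3e-6       # contour length, m (24 kb DNA, as in Figure 1)

def wlc_force(x):
    """Force (N) needed to hold the chain at end-to-end extension x (m)."""
    t = x / L  # relative extension, must be < 1
    return (kBT / P) * (1 / (4 * (1 - t) ** 2) - 0.25 + t)

for frac in (0.5, 0.9, 0.95):
    print(f"x/L = {frac:.2f}  ->  F = {wlc_force(frac * L) * 1e12:.2f} pN")
```

The printed forces (roughly 0.1, 2, and 8 pN) stay below 10 pN, consistent with the entropic regime identified in Figure 1.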
Biological matter is intrinsically out of equilibrium, with intra- and intermolecular forces determining its thermodynamic and kinetic stability. Typical energies involved in remodeling processes fall in the low-energy kBT scale (1 kBT ≈ 4 × 10⁻²¹ J at standard conditions, T = 298 K). This is also the level of the thermal or Brownian noise determined by the characteristic kinetic energy of water molecules freely moving in the aqueous environment (Box 2). In recent years there have been many breakthroughs in the physical study of nonequilibrium small systems, i.e., systems where the magnitudes of the weak interacting forces and of the Brownian forces present are comparable [5,6]. This has led to a fruitful transfer and cross-fertilization of theoretical concepts and experimental methods between physics and biology. Scores of physicists have embarked on the study of the most astonishing nonequilibrium state in nature: living matter. It is in living matter that scientists recognize energy, entropy, and information as the three main driving forces of nature. While thermodynamic processes in ordinary matter are driven by free-energy minimization (i.e., the competition between energy and entropy), living matter seems to be predominantly governed by information flows across all organizational and stratification levels (from molecules to cells, tissues, organs, organisms…). This paper discusses the abovementioned aspects of living matter from a physicist’s perspective. I argue that a breakthrough in novel physical concepts is required to reach a satisfactory understanding of living matter and life in general.
In 1827 Robert Brown, a botanist well known for his detailed descriptions of the nucleus and cytoplasm of the cell and for his contributions to the taxonomy of plants, made an important discovery during his pollination studies. While examining under the microscope the motion of pollen grains suspended in water, he observed the motion to be erratic and unpredictable, as if the grains were alive. After the atomic theory was developed at the beginning of the 20th century, it became clear that what Brown had observed was the effect of the stochastic or random collisions of water molecules against the pollen grains. Kicked by water molecules coming from all directions, the much bigger pollen grains jiggle erratically in the water solution. Such erratic motion has received the name Brownian motion and is key to all molecular reactions (Figure 2). At the dawn of the 20th century the theory of Brownian motion was developed by M. Smoluchowski in Krakow and A. Einstein in Bern, nearly independently of each other. Later experiments conducted by J.-B. Perrin on diffusing colloidal particles provided the final proof. Perrin was also able to obtain an estimate of the Avogadro number using physical methods alone that agreed with those obtained by chemists. The study of Brownian motion has recently expanded to include endogenous Brownian-like forces, the so-called self-propelled Brownian particles or active matter, an inspiring and broad research field that raises exciting new questions about nonequilibrium phenomena [7,8].
Figure 2.
(Left) Illustration of Brownian motion. A pollen grain (central circle, yellow) embedded in water (water molecules represented as tiny black dots). (Center) Confocal image of a colloidal solution. Colloidal particles are subject to Brownian motion in water solution (grey background). Water molecules are unobservable. (Right) Trajectories drawn by Perrin in his original experiments on latex particles a century ago. By measuring the mean squared displacement from the data it is possible to extract the diffusion constant D. From the water viscosity η and the expression for the friction coefficient (sphere case) one can extract the value of the Boltzmann constant kB. From there the Avogadro number follows: NA = R/kB, with R the ideal gas constant.
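As a minimal numerical sketch of the logic behind Perrin’s measurement (a toy illustration with illustrative particle size and viscosity, not his actual data), one can simulate Brownian trajectories, extract D from the mean squared displacement, and recover kB and the Avogadro number:

```python
# Toy version of Perrin's program: simulate 2D Brownian trajectories,
# estimate the diffusion constant D from the mean squared displacement (MSD),
# and recover kB (hence Avogadro's number) via the Stokes-Einstein relation.
import math, random

random.seed(0)
kB_true = 1.380649e-23  # J/K, used only to generate the synthetic "data"
T = 298.0               # temperature, K
eta = 1.0e-3            # water viscosity, Pa*s
r = 0.5e-6              # particle radius, m (illustrative)
gamma = 6 * math.pi * eta * r       # Stokes friction coefficient (sphere)
D_true = kB_true * T / gamma        # Stokes-Einstein diffusion constant
dt, nsteps, ntraj = 0.1, 100, 2000  # time step (s), steps, trajectories

# In 2D, MSD(t) = 4*D*t; each step is Gaussian with variance 2*D*dt per axis
msd = 0.0
for _ in range(ntraj):
    x = y = 0.0
    for _ in range(nsteps):
        x += random.gauss(0, math.sqrt(2 * D_true * dt))
        y += random.gauss(0, math.sqrt(2 * D_true * dt))
    msd += x * x + y * y
msd /= ntraj

D_est = msd / (4 * nsteps * dt)  # estimate D from the MSD
kB_est = D_est * gamma / T       # invert Stokes-Einstein
NA_est = 8.314 / kB_est          # Avogadro's number NA = R/kB
print(f"D  = {D_est:.3e} m^2/s (true {D_true:.3e})")
print(f"NA = {NA_est:.3e} /mol")
```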
2. Living Matter is Heterogeneous and Plastic
Biological matter is intrinsically soft, with weak molecular forces (electrostatic, hydrophobic,…) providing thermodynamic stability. Moreover, typical energies involved in remodeling processes fall in the kBT energy range, at the level of thermal noise fluctuations. This means that living matter is subject to strong fluctuations due to the comparable magnitudes of the weak interacting forces and the Brownian forces present in cellular environments. This feature distinguishes biological matter from ordinary matter, making the former an ideal playground for investigating nonequilibrium phenomena.
A main feature of living matter as compared to inanimate matter is its high complexity. A cell can be seen as a tiny bag crowded with different types of molecules interacting through a myriad of regulatory pathways. Such complexity differs from what is observed, for example, in a water droplet. There are two fundamental aspects of living matter that make it unique to the physicist: biological individuals and populations are fundamentally heterogeneous and plastic. What do these words mean? Let us first digress on heterogeneity. Phenotypic and genotypic variations across individuals of a given species population are the rule. A population of cells of the same strain is intrinsically heterogeneous, making experiments not quite reproducible: the same strain, the same environmental conditions, the same “everything” often produce different results. At a larger biological scale, the unpredictable course of evolutionary diseases in multicellular organisms (e.g., cancer) is a prominent example that exhibits the major role of heterogeneity at the level of cells and tissues. The consequence of heterogeneity is tragic for cancer treatment: there is no single magic bullet to defeat cancer [9].
Heterogeneity is not just a feature of cell populations; it is present throughout all biological scales. At the structural molecular level, myoglobin, the oxygen-carrier protein in the muscle tissue of vertebrates, is known to fold into a heterogeneous set of different native structures, all of them functional for binding oxygen [10]. The recent discovery of a multiplicity of native states for RNA enzymes [11] and DNA-unwinding helicases [12], and the large variability observed in evolved polyclonal antibodies in the immune system [13], are just manifestations of the same fundamental fact.
The second basic feature of living matter is plasticity. Plasticity is Janus-faced, with two apparently opposite features: adaptability (the capacity to change) and resilience (the capacity to resist change). Life and living matter adapt to intermittent environmental changes and resist continued aggressions. Subject to the continued action of remodeling forces, biological structures must be stable enough to maintain their structural integrity (resilience) and, at the same time, malleable enough to adapt to important environmental changes (adaptability). Failing to do so impairs biological function. Plasticity is also essential in an evolutionary context, where mutations (adaptability) and the amplification of the fittest individuals (resilience) lead to diversified and better-adapted populations. Resilience and adaptation are forces of opposite character and essential ingredients of the plasticity of living matter. Too much resilience impairs adaptation, and too much adaptation inhibits resilience. In the evolutionary context a balance between these two counteractive forces is needed to guarantee sufficiently long-lived organisms that reproduce at high enough rates to avoid the extinction of species.
Plasticity is present in biology throughout all scales of life: from molecules and cells to tissues, organs, individuals, and even populations, societies, and communities. Interestingly, these two types of counteractive forces (resilience and adaptation) are present even in the inanimate physical world. Embedded in noisy aqueous environments, the forces that hold biological matter together must be strong enough for molecules to be stable and functional and, at the same time, weak enough to allow for remodeling and adaptation. The coexistence of these two features is possible in the presence of Brownian motion, the noise background due to the erratic motion of water molecules in an aqueous environment (Box 2). Two counteractive, opposite forces operate in the observed Brownian motion and cancel out on average: the collisions of water molecules against the pollen grain and the frictional drag force experienced by the grain when moving through the water. The energy exchange between the grain and the environment follows the rule of what you get equals what you give: the average kinetic energy delivered to the grain by the water collisions is lost as heat to the environment through friction with the water. In physics this energy balance is known as the Stokes–Einstein relation or, in more general and technical terms, the fluctuation-dissipation theorem [14]. The Stokes–Einstein relation states that the diffusion constant of the grain, D, equals the thermal energy unit kBT divided by the friction coefficient γ, i.e., D = kBT/γ. The diffusion constant D is a measure of how much the pollen grain jiggles in all directions. It is therefore the equivalent of adaptability in biology, the ease of changing configuration or state. In comparison, the friction coefficient γ is the equivalent of resilience in biology, the resistance to motion induced by the viscous collective forces exerted by the colliding water molecules.
3. Life at the Edge of Chaos
The balanced equilibrium between adaptive and resilient forces has a visible consequence in living matter at the molecular scale: the fundamental biological forces regulating intra- and intermolecular interactions operate at the edge of chaos. This means that the thermal noise level and the stabilizing energies of macromolecules in tissues are comparable, typically a few kcal/mol or kBT (1 kBT ≈ 0.6 kcal/mol at 298 K). This balance can only be accomplished by a fine compensation between enthalpic and entropic forces, often known as enthalpy-entropy compensation. In thermodynamics, enthalpy (H) and entropy (S) are the two contributions to the free energy (G): G = H − TS. The free energy G quantifies the amount of work a system can exert at specified conditions of temperature (T) and pressure. Most intramolecular and intermolecular interactions result from the combined action of several weak forces (hydrogen bonding, electrostatic, hydrophobic…). The typical enthalpy of a single hydrogen bond in a water molecule is about 7 kcal/mol, so the enthalpy of formation ΔH of most biomolecular complexes involving at least several hydrogen bonds (e.g., the native state of a protein) can easily reach a few hundred kcal/mol. However, the overall stability of such a complex, measured by its free energy of formation ΔG, is not larger than a few tens of kcal/mol, i.e., ten times smaller (this explains why proteins typically melt at temperatures well below the boiling point of water, 100 °C). This can only be achieved if the entropic contribution to the formation of the complex, ΔS, is comparable to ΔH and of the same sign, making the difference, ΔG = ΔH − TΔS, smaller in magnitude than the two terms ΔH and TΔS. Pictorially, one could speak of proteins as rocks that nevertheless melt at moderate temperatures.
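A minimal numeric sketch of enthalpy-entropy compensation (the numbers are illustrative placeholders in the ballpark quoted above, not measured values):

```python
# Enthalpy-entropy compensation: large dH and T*dS nearly cancel,
# leaving a folding free energy of only a few tens of kcal/mol.
dH = -300.0  # enthalpy of formation, kcal/mol (illustrative)
dS = -0.90   # entropy of formation, kcal/(mol K) (illustrative)
T = 298.0    # temperature, K

dG = dH - T * dS
print(f"dG = {dG:.1f} kcal/mol")  # ~ -31.8: ten times smaller than dH

# The melting temperature is where dG vanishes: Tm = dH/dS
Tm = dH / dS
print(f"Tm = {Tm:.0f} K = {Tm - 273.15:.0f} C")  # ~333 K, well below 100 C
```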
A remarkable feature of molecular plasticity is molecular folding. Under appropriate conditions, nucleic acids (DNA and RNA), proteins, and other biopolymers spontaneously fold into their stable native structures (i.e., the properly folded and biologically functional three-dimensional structure). Upon heating above the melting temperature TM, the multiplicity of interactions stabilizing nucleic acids and proteins is disrupted, generating a random coiled polymer. The reverse process, molecular folding, is obtained upon cooling the sample below TM. It is remarkable that heating and cooling processes are often quasi-reversible, meaning that biomolecules smoothly transit between the folded and unfolded conformations, avoiding being trapped in misfolded or partially folded states. Molecular plasticity is observed in single-molecule unzipping experiments, i.e., the disruption of the native structure of a biomolecule by applying mechanical forces. For example, in DNA unzipping the two strands of a DNA molecule are pulled apart until the double helix dissociates into its single strands. The re-zipping process is quasi-reversible, with the two strands smoothly re-annealing into a double helix (Box 3).
Box 3. Molecular unzipping.
A beautiful and simple experiment that demonstrates the plasticity of biomolecules is DNA unzipping, the physical process by which the double helix is mechanically disrupted by pulling the two strands apart. Such experiments can be carried out with optical tweezers (Box 1) by attaching each of the two strands at one extremity of a DNA hairpin to micron-sized polystyrene or silica beads via flexible DNA linkers (Figure 3, left). One of the beads is then captured in a steerable optical trap that acts as a force sensor. By moving the optical trap away from the pipette it is possible to exert gradually increasing forces, first stretching the linkers and then, upon reaching 15 pN, breaking the bonds (base pairing and stacking) that stabilize the double helix. The measured force-extension curves display a characteristic sawtooth pattern indicative of the force-induced melting of the double helix; a given unzipping pattern is characteristic of a particular DNA sequence. The plasticity of DNA molecules is revealed upon reversing the movement of the optical trap: the double helix can then be reversibly re-annealed (i.e., without exhibiting hysteresis), providing a measurement of the thermodynamic force-extension curve (Figure 3, right). Fitting such curves to polynucleotide models of DNA duplex formation (such as the nearest-neighbor model) allows us to extract improved energies for the hybridization of complementary nearest-neighbor motifs, useful for predicting melting temperatures of DNA duplexes of arbitrary sequence [15,16]. The unzipping assay can also be used for DNA footprinting, i.e., determining the positions at which small ligands bind DNA, with one-basepair resolution [17].
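To illustrate how the nearest-neighbor model assigns a hybridization free energy to a duplex, here is a minimal Python sketch; the motif energies are illustrative placeholders, not the fitted values reported in [15,16].

```python
# Nearest-neighbor (NN) model sketch: the duplex free energy is the sum of
# the free energies of adjacent base-pair motifs plus an initiation term.
# All energies below (kcal/mol) are illustrative placeholders.
NN_DG = {
    "AA": -1.0, "AT": -0.9, "TA": -0.6, "CA": -1.4, "GT": -1.4,
    "CT": -1.3, "GA": -1.3, "CG": -2.2, "GC": -2.2, "GG": -1.8,
}
# Motifs read on the complementary strand map onto the same ten energies:
EQUIV = {"TT": "AA", "AG": "CT", "TC": "GA", "AC": "GT", "TG": "CA", "CC": "GG"}
INIT = 2.0  # duplex initiation penalty, kcal/mol (illustrative)

def duplex_dG(seq):
    """Hybridization free energy of seq with its complement, kcal/mol."""
    motifs = (seq[i:i + 2] for i in range(len(seq) - 1))
    return INIT + sum(NN_DG[EQUIV.get(m, m)] for m in motifs)

print(f"{duplex_dG('GATTACA'):.1f} kcal/mol")  # -4.6: six NN motifs + initiation
```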
Figure 3.
(Left) Experimental setup of unzipping experiments. Figure not to scale. (Right) Force versus optical-trap position measured in an unzipping (black curve) and re-zipping (red curve) experiment on a 2.2 kb DNA hairpin at 1 kHz acquisition frequency. The sawtooth-like force pattern around 15 pN shows the progressive disruption of base pairs along the sequence. The rightmost part of the curve corresponds to the elastic response of the single-stranded DNA. Note the force fluctuations along the curve due to thermal noise and the low hysteresis between unzipping and re-zipping curves (black and red data superimpose). The inset below shows the same data filtered to 1 Hz bandwidth. Data from Huguet et al. [15].
Non-native states are troublesome for biological function: misfolding often impairs molecular reactions and regulatory processes in general. Nucleic acids and proteins are disordered polymers that convey biological information, both at the level of their monomer sequence (primary structure) and in their three-dimensional folded structure (secondary and tertiary structure). In the kinetic process of folding, proteins might be expected either to remain trapped in one among a multiplicity of states of comparable thermodynamic stability or not to fold at all. However, they often manage to reproducibly fold into a highly stable native structure uniquely determined by the sequence (the Anfinsen hypothesis; see, however, the recent developments mentioned in Section 2). Designing nucleic acid (single-stranded) sequences or polypeptide chains that fold into specific structures is a hard mathematical problem (belonging to the class of NP-complete optimization problems [18]). Therefore molecular folding (the problem of the existence of well-defined and unique native folds in disordered polypeptide chains) is a key topic in biophysics that remains poorly understood. The clue, as in most problems in biology, lies in the evolutionary context: to understand molecular folding one must comprehend primary sequences and tertiary folds from a co-evolutionary perspective. This is the classical chicken-and-egg dilemma that pervades biology.
Most reaction pathways and regulatory processes in the cell suffer from the same problem: the thermodynamic and kinetic parameters are finely tuned to operate in a narrow range of conditions, in yet another manifestation of life at the edge of chaos. Unfortunately, precisely determining the parameters of most molecular pathways in the cell (under ex vivo or in vivo conditions) is a tedious and time-consuming task, owing to the utmost complexity of the myriad contributing endogenous and exogenous factors, with heterogeneity remaining the ultimate roadblock limiting accurate measurements. In this new era of information and biology we may witness a new fundamental indeterminism in biology, reminiscent of the role the theory of chaos played in mathematics and physics at the beginning of the past century.
The current trend in biology of classifying collective characterization and quantification tools using the neologism omics (genomics, proteomics, metabolomics, interactomics, etc.) may ultimately prove insufficient to better understand biological complexity and to make reliable predictions in the life sciences. As in the mathematical theory of chaos, the fact that our knowledge of biological details may never be sufficiently complete will prevent reliable predictions of the agenda of living beings: teleonomy (Section 1) will remain inaccessible. In a related fashion, progress in the medical prognosis of most evolutionary and degenerative diseases (at the moment possibly the greatest challenge in medicine) will prove frustrating despite massive financial investments and efforts. Evolutionary diseases, such as cancer and many ageing-related neurological disorders, will remain a challenge. In this regard, inferential reasoning rather than deductive knowledge emerges as a disruptive strategy to deal with the new challenges in biology.
4. Energy, Entropy, and Information
Entropy is one of the most relevant quantities in physics. It is a relative of energy, the other extensive quantity in physics, which measures the capability of a physical system to do work [19]. However, entropy is endowed with some features that make it special, sometimes elusive and, why not, even mysterious. Energy was introduced in the late 17th century by the father-and-son duo Johann and Daniel Bernoulli and by the inventor of calculus, Leibniz, to quantify the vis viva, the ability of a system to be alive and kicking, i.e., to generate motion. Entropy was introduced much later, in the mid-nineteenth century, by Clausius, who sought to quantify the higher quality of work as compared to heat in thermal processes. According to the first law of thermodynamics, heat and work are two fully exchangeable kinds of energy. However, although work can be fully converted into heat, the reverse is not true. Entropy governs the fate of thermodynamic transformations as per the second law: the total entropy of the universe always increases. Thermodynamics is probably the only discipline in physics whose laws have stood firm despite all the revolutionary advances in physics witnessed during the 20th century. Even the birth of quantum mechanics was spurred by thermodynamics: the studies by Boltzmann, Wien, and others on black-body radiation were the testing ground for Planck’s radiation law and the many developments that came after [20].
Energy and entropy, however, exhibit striking differences. First, energy is conserved (first law) whereas entropy is not (second law). This makes the second law even stranger, because it is probably the only physical law described by a mathematical inequality, while the other laws describe the conservation of physical quantities (energy-mass, linear and angular momentum, electric charge, and so on). Were it not for the fact that the second law is so firmly established, a suspicious scientist might think that something is missing in the entropy balance that would redeem the second law into a full equality. Second, energy is a deterministic quantity that is assigned to a state characterized by a set of probabilities pi for the different available configurations i. While it is possible to take a snapshot of a system and define its energy content at a given time, it is not possible to define its entropy content. The Gibbs statistical entropy is usually defined by the mathematical relation S = −kB ∑i pi log pi, where the sum runs over all possible experimental outcomes; it equals the thermodynamic entropy in equilibrium conditions. Moreover, the Gibbs statistical entropy coincides with the mathematical definition of information. But what is information? In science, and in physics in particular, a given quantity has true meaning only if it is measurable. Information, a widely used term in the most diverse human ambits, saw its most fruitful development in the work of C. Shannon in 1948 [21], which set the basis of modern information theory [22]. According to Shannon, “information is the resolution of uncertainty”, and its quantitative measure is the so-called uncertainty function H, equal to the Gibbs statistical entropy. The following quote has been attributed to John von Neumann, the father of the modern computer: “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage” [23]. Entropy and information are therefore two sides of the same coin. It is natural to expect that, if we know how to measure energy and entropy, then we should know how to measure information too. Hints of this were discovered a century ago by chemists and physicists during the fierce discussions about irreversibility in statistical mechanics (e.g., the Loschmidt and Maxwell demon paradoxes) unleashed by the molecular hypothesis (Box 4).
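A minimal sketch of Shannon’s uncertainty function (in bits; the Gibbs entropy is the same functional with natural logarithms and a factor kB):

```python
import math

def uncertainty(p):
    """Shannon's uncertainty function H = -sum_i p_i log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(uncertainty([0.5, 0.5]))  # fair coin: 1.0 bit (maximal uncertainty)
print(uncertainty([0.9, 0.1]))  # biased coin: ~0.47 bits
print(uncertainty([1.0]))       # certain outcome: 0.0 bits, nothing to resolve
```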
Box 4. Entropy and Information.
The relation between entropy and information dates back to J. C. Maxwell, who in 1867 proposed a thought experiment to violate the second law of thermodynamics. Maxwell imagined “a very small intelligent being endowed with free will, and fine enough tactile and perceptive organization to give him the faculty of observing and influencing individual molecules of matter” [24,25]. By observing the speed of molecules in an isolated vessel made of two compartments separated by a wall but connected through a small gate, the demon could, effortlessly and without any expenditure of work, open and close the gate to separate cold (slow-moving) from hot (fast-moving) molecules (Figure 4). In doing so, the purposeful demon generates a temperature gradient in an otherwise temperature-uniform isolated system, thereby decreasing the total entropy, against the second law. A variant of the Maxwell demon is the Szilard engine, where the demon operates in an isothermal (rather than isolated) system and uses the measurement to extract heat from the bath and fully convert it into work, reaching the maximum of kBT log(2) for a single-bit (two-state) Szilard engine, also called the Landauer limit. The resolution of this paradox (colloquially, exorcising the demon) came from the thermodynamics of data processing. In 1961 R. Landauer demonstrated that any logically irreversible operation is also thermodynamically irreversible, requiring heat dissipation typically of the order of kBT [26]. Next, C. Bennett demonstrated that bit erasure is an intrinsically irreversible logical operation needed to restore the initial state of the demon [27]. This explicit connection rescues the second law, resolving the long-standing centennial paradox. One cannot avoid drawing comparisons between the feedback action of the Maxwell demon and the many regulatory processes in living beings carried out by molecular machines. How far can we push the analogy? Whether information is a relevant and measurable quantity in biological processes remains an open problem.
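A quick numerical check of the Landauer limit quoted above:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 298.0          # standard conditions, K

# Minimum heat dissipated when erasing one bit; equivalently, the maximum
# work extractable per cycle by a single-bit Szilard engine.
landauer = kB * T * math.log(2)
print(f"{landauer:.2e} J")               # ~2.85e-21 J
print(f"{landauer / (kB * T):.3f} kBT")  # log(2) ~ 0.693 kBT
```

This is consistent with the kBT ≈ 4 × 10⁻²¹ J scale of thermal noise quoted in Section 1.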
Figure 4.
(Left) The demon (green) observes the moving molecules in a gas and separates those moving fast (red dots) from those moving slow (blue dots) by the effortless opening of a small gate. (Right) A temperature gradient is established, against the second law. The image “Maxwell’s demon”, by Htkym, is licensed under CC BY-SA 3.0, Wikimedia Commons.
How to measure information [21,22] in arbitrary systems and conditions remains unclear, although some clues have been obtained from recent developments on fluctuation theorems. These are recently discovered mathematical relations in statistical mechanics that extend results such as the fluctuation–dissipation theorem and the Stokes–Einstein relation (Section 2) to far-from-equilibrium systems [6,28,29]. Key to measuring information is measuring free energy differences through work measurements. In the same way that entropies correspond to free energy differences measured at two close temperatures, information corresponds to free energy differences measured at two given experimental conditions. In the case of the Maxwell demon (Box 4) the average work per cycle that can be extracted corresponds to the free energy difference of erasing a single bit. Recent experimental realizations of the Szilard engine using various kinds of single-particle and molecular systems have successfully implemented information-to-energy conversion and tested the validity of the Landauer limit [30,31,32,33]. The generalization of these results to arbitrary nonequilibrium systems and conditions will surely generate fascinating new developments in physics and biology.
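To make the flavor of a fluctuation theorem concrete, here is a toy numerical sketch of the Jarzynski equality, one of the best-known such relations, which recovers an equilibrium free energy difference ΔF from irreversible work measurements; the Gaussian work distribution below is an illustrative assumption, not data from the experiments cited above.

```python
import math, random

random.seed(1)
kBT = 1.0    # measure work in units of kBT
dF = 2.0     # free energy difference between final and initial states (illustrative)
sigma = 1.5  # spread of the work distribution (illustrative)

# For Gaussian work, the Jarzynski equality <exp(-W/kBT)> = exp(-dF/kBT)
# forces the mean work to exceed dF by the dissipation sigma^2/(2 kBT):
W = [random.gauss(dF + sigma**2 / (2 * kBT), sigma) for _ in range(200000)]

mean_W = sum(W) / len(W)
dF_est = -kBT * math.log(sum(math.exp(-w / kBT) for w in W) / len(W))

print(f"<W>    = {mean_W:.3f} kBT (exceeds dF: dissipated work)")
print(f"dF est = {dF_est:.3f} kBT (Jarzynski recovers dF = {dF})")
```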
5. Concluding Remarks
In biology, living beings and individuals (from molecules to organisms) are not alone; they are always part of a population or an ensemble of individuals. Living beings, populations, and life in general are the result of natural evolution. These populations evolve under the rules of Darwinian selection, where the individuals best fitted to the environmental pressure outcompete the rest. Darwinian evolution rests on dynamics of a very special kind, where mutations and the selective amplification of the fittest species determine the evolving phenotypes. From a physicist’s point of view, evolving populations produce a startling non-stationary state where basic thermodynamic concepts such as energy, matter, entropy, and information appear intertwined in a complex and undecipherable way [34]. Elucidating how to define and measure information in biological systems seems a crucial step towards further advancing our understanding of biological complexity. Plasticity, the most salient feature of living matter, comes from the interplay between energy (resilience) and entropy (adaptability). However, it is unclear how these two driving forces alone (energy and entropy) drive living matter into such marvelous apparent complexity. There is no evidence that the living state is in conflict with any fundamental law of physics and chemistry, yet we have no clue how to explain teleonomy (the fact that living beings have agendas, to quote Pross [2] again) from physical principles alone. Physical information might be the missing link needed to make teleonomy a universal aspect of living matter, and maybe of inanimate matter too. One might speculate that information, the sibling of entropy, is a physically measurable quantity governed by laws that, despite the great advances in biology, have passed unnoticed by scientists and are waiting to be discovered. For sure, this is pure speculation. What researchers grappling every day with living matter do agree upon is that information is out there, pervading all niches of the living world [35]. One might even dare more and claim that information is the missing contribution that would turn the strangest physical law (the second law) into a mathematical equality (rather than an inequality). Theories that raise information to the same scientific level that energy and entropy currently enjoy are most needed to successfully pursue this line of thought. Without dedicated and concomitant theoretical and experimental work it will not be possible to unravel the threads of this mystery.
Many discoveries in science can be attributed to experimental tests of theories that report discrepancies with the expected predictions. The crucial experiment or observation that revolutionizes science by completely changing our view of the world (Kuhn’s change of paradigm [36]) is a recurrent theme in science. Examples of such experiments are the Michelson–Morley experiment on the Earth’s motion relative to the ether (which gave birth to relativity theory) and the photoelectric effect (which spurred the development of quantum theory). To date, none of the theoretical and experimental developments aiming to better understand living matter has contested a single fundamental law of physics and chemistry. But a time may come when, in the course of a new experimental observation or test, something stands in stark disagreement with theoretical predictions. That day may spur a change of paradigm, unleashing a new revolution in science that unifies physics and biology (as quantum theory did for physics and chemistry a century ago). Maybe information is the crucial element of the next revolution in science.