Research Article

Information-based fitness and the emergence of criticality in living systems

Jorge Hidalgo, Jacopo Grilli, Samir Suweis, Miguel A. Muñoz, Jayanth R. Banavar, and Amos Maritan


    PNAS July 15, 2014 111 (28) 10095-10100; first published June 30, 2014; https://doi.org/10.1073/pnas.1319166111
    Edited* by William Bialek, Princeton University, Princeton, NJ, and approved May 27, 2014 (received for review October 12, 2013)


    Significance

    Recently, evidence has been mounting that biological systems might operate at the borderline between order and disorder, i.e., near a critical point. A general mathematical framework for understanding this common pattern, explaining the possible origin and role of criticality in living adaptive and evolutionary systems, is still missing. We rationalize this apparently ubiquitous criticality in terms of adaptive and evolutionary functional advantages. We provide an analytical framework, which demonstrates that the optimal response to broadly different changing environments occurs in systems organizing spontaneously—through adaptation or evolution—to the vicinity of a critical point. Furthermore, criticality turns out to be the evolutionary stable outcome of a community of individuals aimed at communicating with each other to create a collective entity.

    Abstract

    Empirical evidence suggesting that living systems might operate in the vicinity of critical points, at the borderline between order and disorder, has proliferated in recent years, with examples ranging from spontaneous brain activity to flock dynamics. However, a well-founded theory for understanding how and why interacting living systems could dynamically tune themselves to be poised in the vicinity of a critical point is lacking. Here we use tools from statistical mechanics and information theory to show that complex adaptive or evolutionary systems can be much more efficient in coping with diverse heterogeneous environmental conditions when operating at criticality. Analytical as well as computational evolutionary and adaptive models vividly illustrate that a community of such systems dynamically self-tunes close to a critical state as the complexity of the environment increases while they remain noncritical for simple and predictable environments. A more robust convergence to criticality emerges in coevolutionary and coadaptive setups in which individuals aim to represent other agents in the community with fidelity, thereby creating a collective critical ensemble and providing the best possible tradeoff between accuracy and flexibility. Our approach provides a parsimonious and general mechanism for the emergence of critical-like behavior in living systems needing to cope with complex environments or trying to efficiently coordinate themselves as an ensemble.

    • evolution
    • adaptation
    • self-organization

    Physical systems undergo phase transitions from ordered to disordered states upon changing control parameters (1, 2). Critical points, with all their remarkable properties (1, 2), are observed only upon parameter fine-tuning. This is in sharp contrast to the ubiquity of critical-like behavior in complex living matter. Indeed, empirical evidence has proliferated that living systems might operate at criticality (3)—i.e., at the borderline between order and disorder—with examples ranging from spontaneous brain behavior (4) to gene expression patterns (5), cell growth (6), morphogenesis (7), bacterial clustering (8), and flock dynamics (9). Even if none of these examples is fully conclusive and even if the meaning of “criticality” varies across these works, the criticality hypothesis—as a general strategy for the organization of living matter—is a tantalizing idea worthy of further investigation.

    Here we present a framework for understanding how self-tuning to criticality can arise in living systems. Unlike models of self-organized criticality in which some inanimate systems are found to become critical in a mechanistic way (10), our focus here is on general adaptive or evolutionary mechanisms, specific to biological systems. We suggest that the drive to criticality arises from functional advantages of being poised in the vicinity of a critical point.

    However, why is a living system fitter when it is critical? Living systems need to perceive and respond to environmental cues and to interact with other similar entities. Indeed, biological systems constantly try to encapsulate the essential features of the huge variety of detailed information from their surrounding complex and changing environment into manageable internal representations, and they use these as a basis for their actions and responses. The successful construction of these representations, which extract, summarize, and integrate relevant information (11), provides a crucial competitive advantage, which can eventually make the difference between survival and extinction. We suggest here that criticality is an optimal strategy to effectively represent the intrinsically complex and variable external world in a parsimonious manner. This is in line with the hypothesis that living systems benefit from having attributes akin to criticality—either statistical or dynamical (3)—such as a large repertoire of dynamical responses, optimal transmission and storage of information, and exquisite sensitivity to environmental changes (2, 5, 12–16).

    As conjectured long ago, the capability to perform complex computations, which turns out to be the fingerprint of living systems, is enhanced in “machines” operating near a critical point (17–19), i.e., at the border between two distinct phases: a disordered phase, in which perturbations and noise propagate unboundedly—thereby corrupting information transmission and storage—and an ordered phase, where changes are rapidly erased, hindering flexibility and plasticity. The marginal, critical situation provides a delicate compromise between these two impractical tendencies, an excellent tradeoff between reproducibility and flexibility (12, 13, 16) and, on larger time scales, between robustness and evolvability (20). A specific example of this general framework is genetic regulatory networks (19, 21). Cells ranging from those in complex organisms to single-celled microbes such as bacteria respond to signals in the environment by modifying the expression of their genes. Any given genetic regulatory network, formed by the genes (nodes) and their interactions (edges) (22), can be tightly controlled to robustly converge to a fixed, almost-deterministic attractor—i.e., a fixed “phenotype”—or it can be configured to be highly sensitive to tiny fluctuations in input signals, leading to many different attractors, i.e., to large phenotypic variability (23). These two situations correspond to the ordered and disordered phases, respectively. The optimal way for genetic regulatory networks to reconcile controllability and sensitivity to environmental cues is to operate somewhere in between these two impractical limits (19), as has been confirmed in different experimental setups (5, 7, 24). Still, it is not clear how such tuning to criticality comes about.

    Our goal here is to exploit general ideas from statistical mechanics and information theory to construct a quantitative framework showing that self-tuning to criticality is a convenient strategy adopted by living systems to effectively cope with the intrinsically complex external world in an efficient manner, thereby providing an excellent compromise between accuracy and flexibility. To provide some further intuition, we use genetic regulatory networks as a convenient guiding example, but one could equally well consider neural networks, models for the immune response, groups of animals exhibiting collective behavior, etc., with each specific realization requiring a more detailed modeling of its special attributes.

    We uncover coevolutionary and coadaptive mechanisms by which communities of living systems, even in the absence of other forms of environmental complexity, converge to be almost critical in the process of understanding each other and creating a “collective entity.” The main result is that criticality is an evolutionary/adaptive stable solution reached by living systems in their striving to cope with complex heterogeneous environments or when trying to efficiently coordinate themselves as an ensemble.

    Results

    Mathematical Framework.

    The external environment in which living systems operate is highly variable, largely unpredictable, and describable in terms of probability distribution functions. Living systems need to modify their internal state to cope with external conditions, and they do so in a probabilistic manner. To be specific, but without loss of generality, we represent an environmental cue “perceived” and processed by a living system as a string of N (binary) variables, s = (s1, s2, …, sN). A specific environmental source is modeled by the probability distribution Psrc with which it produces each of the 2^N possible states. For concreteness, this distribution is assumed to depend on a set of parameters, α = (α1, α2, …), accounting for environmental variability. We turn now to an individual living system, or “agent,” which seeks to adapt itself to cope with the perceived stimuli/signals emanating from a given environmental source. This is accomplished by changing its internal state, encapsulated in a second probability distribution function, Pint, specified by a different—in principle smaller—parameter set β = (β1, β2, …) aimed at capturing the essential features of Psrc in the most efficient—although in general imperfect—way (see Fig. 1). Henceforth we denote the external source and its internal representation by Psrc(s|α) and Pint(s|β), respectively.
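    To make this setup concrete in code, the following is a minimal Python sketch of such a parameterized family over N binary variables. It assumes the simple all-to-all, single-parameter form P(s|β) ∝ exp{β (N/2)(∑_{k=1}^{N} s_k/N)²} used later in the figures; the function names (`log_pk`, `kl`) are ours, and states are grouped by the number k of active units because the probability depends on s only through that count.

```python
import numpy as np
from scipy.special import gammaln, logsumexp

def log_pk(beta, N):
    """Log-probability of observing k active units (k = 0..N) under
    P(s|beta) ∝ exp{beta * (N/2) * (sum_k s_k / N)**2}, s_k in {0, 1}.
    The binomial factor C(N, k) counts the microstates sharing the same k."""
    k = np.arange(N + 1)
    log_mult = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)
    log_w = log_mult + 0.5 * beta * N * (k / N) ** 2
    return log_w - logsumexp(log_w)

def kl(alpha, beta, N):
    """D(alpha||beta) between two members of the family. Because P(s|.)
    depends on s only through k, the sum over 2**N states collapses to
    a sum over the N + 1 values of k."""
    lp, lq = log_pk(alpha, N), log_pk(beta, N)
    return float(np.sum(np.exp(lp) * (lp - lq)))

if __name__ == "__main__":
    N = 100
    print(kl(1.0, 1.0, N))   # 0.0: identical distributions
    print(kl(1.0, 3.0, N))   # > 0: information lost by the mismatch
```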

    Fig. 1.

    Living systems coping with the environment. A illustrates a living system responding to an environmental source (e.g., a bacterium responding to external conditions such as the presence/absence of some nutrients, pH, or temperature). A given source, labeled by the set of parameters α, can only be probabilistically gauged by the system. Psrc(s|α) is the most accurate representation that the system can potentially generate in terms of the Boolean variables (or bits) s. However, such a representation might not be accessible to the system by merely changing its internal state parameters, β, and the actual internal state, Pint(s|β) (e.g., the probability of a gene expression pattern), is usually an imperfect proxy for Psrc(s|α). The optimal choice of parameters β—aiming at capturing the most relevant features of the environment—is obtained by minimizing the KL divergence of Pint(s|β) from Psrc(s|α). In genetic networks, changing internal parameters is equivalent to changing the interactions between the different (Boolean) variables (nodes of the networks in the figure). B shows a more complex scenario, in which the system has to cope with multiple and diverse sources. The internal state has to be able to accommodate each of them. In C, the environment is not imposed ad hoc; instead, it is composed of other individuals, and every agent needs to cope with (“understand”) the states of the others. Each agent evolves similarly to the others in the community, trying to exhibit the same kind of state, generating in this way a self-organized environment. In the case of sufficiently heterogeneous externally imposed sources, as well as in the self-organized case, we find that evolutionary/adaptive dynamics drive the systems to operate close to criticality.

    In our guiding example, the external cues could be, for instance, the environmental conditions (temperature, pH, …), which are variable and can only be probabilistically gauged by a cell/bacterium. The binary vector s = (s1, s2, …, sN) can be thought of as the on/off states of the N different genes in its (Boolean) genetic regulatory network (19, 21, 22). In this way, Psrc(s|α) can be interpreted as the probability that s is the most convenient state for the system to cope with a given environmental condition, while Pint(s|β) is the actual probability for the genetic network state (attractor) of a given individual—with its limitations—to be s. Without loss of generality, we consider that there is (at least) one control parameter, say β1, such that—other parameters being fixed—it determines in which phase the network is operating.

    Our thesis is that the capacity of living systems to tune their internal states to efficiently cope with variable external conditions provides them with a strong competitive advantage. Thus, the internal state Pint(s|β) should resemble as closely as possible the one most in accord with the environmental signal Psrc(s|α); in other words, one seeks the distribution that the system should express to best respond to the external conditions. Information theory provides us with a robust measure of the “closeness” between the target (source) and actual (internal) probability distribution functions. Indeed, the Kullback−Leibler (KL) divergence (25), D(α|β), quantifies the information loss when the internal state is used to approximate the source (see Materials and Methods). The KL divergence is asymmetric in the two involved probability distributions, it is never negative, and it vanishes if and only if the two distributions are identical (SI Appendix, section S2). Minimizing the KL divergence with respect to the internal state parameters, β, generates the optimal, although in general imperfect, internal state aimed at representing or coping with a given source (see Fig. 1A).

    More generally, in an ever-changing world, the requirement for an individual is not just to reproduce a single source with utmost fidelity but rather to be able to successfully cope with a group of highly diverse sources (see Fig. 1B). A particularly interesting example of this would comprise a community of similar individuals who together strive to establish some kind of a common collective language (see Fig. 1C). In any of these complex situations, our working hypothesis is that an individual has a larger “fitness” when a characteristic measure, e.g., the mean, of its KL divergences from the set of diverse sources is small, i.e., fit agents are those whose internal states are close to those required by existing external conditions.

    As an illustrative example, consider two individual agents A and B—the source for A is B and vice versa—each of them with its own probabilistic gene network. The relative fitnesses of A and B are determined by how well the set of cues (described by the probability distribution Psrc) of one organism is captured by the other with minimum information loss, and vice versa [for the sake of simplicity, we could assume that the distributions associated with A and B correspond to equilibrium distributions of an Ising model (1, 2) at similar inverse temperatures βA and βB]. If βA = βB, the two distributions would be identical and the KL divergence would vanish. However, this is not a stable solution. Indeed, if the two parameters are not identical but close, the difference between the two KL divergences, one from each agent to the other, is (see Materials and Methods):

    D(βA + δβ|βA) − D(βA|βA + δβ) ≃ (1/6) ∇χ(βA) δβ³,   [1]

    where χ is the generalized susceptibility, also known as “Fisher information” (defined in Materials and Methods). This implies that the individual whose parameters correspond to the state with larger χ has a smaller KL divergence and is thus fitter. However, it is well known that χ peaks at the critical point, and thus our key finding is that, for a family of individuals with similar parameters, the fittest possible agent sits exactly at criticality and is best able to encapsulate a wide variety of distributions. As we illustrate in what follows with a number of examples, the optimal encoding parameters of stable solutions always lie around the peak of the generalized susceptibility χ, which is the region of maximal variability, where different complex sources can be best accounted for through small parameter changes (see Materials and Methods). This is in line with the recent finding—based on concepts of information geometry—that many more distinguishable outputs can be reproduced by models poised at the peak of χ, i.e., at criticality (26).
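    Eq. 1 can be checked numerically within the single-parameter family sketched above; this is an illustrative test under our assumed parametrization (not the authors' code), with χ(β) computed as the variance of ϕ(s) = (N/2)(∑_{k=1}^{N} s_k/N)² and ∇χ estimated by a central finite difference.

```python
import numpy as np
from scipy.special import gammaln, logsumexp

N = 100
k = np.arange(N + 1)
phi = 0.5 * N * (k / N) ** 2                   # the observable phi(s)
log_mult = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)

def log_pk(beta):
    lw = log_mult + beta * phi
    return lw - logsumexp(lw)

def kl(a, b):                                   # D(a||b), source first
    lp, lq = log_pk(a), log_pk(b)
    return float(np.sum(np.exp(lp) * (lp - lq)))

def chi(beta):                                  # Fisher information =
    p = np.exp(log_pk(beta))                    # variance of phi
    m = np.sum(p * phi)
    return float(np.sum(p * phi ** 2) - m ** 2)

beta, db, eps = 2.0, 0.05, 1e-4
lhs = kl(beta + db, beta) - kl(beta, beta + db)
rhs = (chi(beta + eps) - chi(beta - eps)) / (2 * eps) * db ** 3 / 6
print(lhs, rhs)                                 # the two should nearly agree
```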

    Computational Experiments

    We have developed diverse computational evolutionary and adaptive models exploiting the ideas above. The dynamical rules used in these models are not meant to, necessarily, mimic the actual dynamics of living systems; rather, they are efficient ways to optimize fitness. In the evolutionary models, inspired by the genetic algorithm (21, 27), a community of M individuals—each one characterized by its own set of internal parameters β—evolves in time through the processes of death, birth, and mutation (see Materials and Methods). Individuals with larger fitness, i.e., with a smaller mean KL divergence from the rest of sources, have a larger probability to produce an offspring, which—apart from small random mutations—inherits its parameters from its ancestor. On the other hand, agents with low fitness are more likely to die and be removed from the community. In the adaptive models, individuals can change their internal parameters if the attempted variation implies an increase of their corresponding fitnesses (see Materials and Methods). These evolutionary/adaptive rules result in the ensemble of agents converging to a steady state distribution, which we aim at characterizing. We obtain similar results in two families of models, which differ in the way in which the environment is treated. In the first, the environment is self-generated by a community of coevolving/coadapting individuals, while, in the second, the variable external world is defined ad hoc.

    Coevolutionary Model.

    The environment perceived by each individual consists of the other M − 1 systems in the community, which it aims at “understanding” and coping with. In the simplest computational implementation of this idea (see Materials and Methods), a pair of individual agents is randomly selected from the community at each time step, and each of these two individuals constitutes the environmental source for the other. Given that the KL divergence is not symmetric (see Materials and Methods), one of the two agents has a larger fitness and thus a greater probability of generating progeny, while the less fit system is more likely to die. This corresponds to a fitness function of agent i that is a decreasing function of its KL divergence from the other. In this case, as illustrated in Fig. 2 (and in Movies S1 and S2), the coevolution of M = 100 agents—which in their turn are sources—leads to a very robust evolutionarily stable steady-state distribution. Indeed, Fig. 2 Left shows that for three substantially different initial parameter distributions (very broad, and localized in the ordered and in the disordered phases, respectively), the community coevolves in time to a unique localized steady-state distribution, which turns out to be peaked at the critical point (i.e., where the Fisher information peaks; see Fig. 2 Right and SI Appendix, section S4). This conclusion is robust against model details and computational implementations: the solution peaked at criticality is an evolutionarily stable attractor of the dynamics. The same conclusions hold for an analogous coadaptive model in which the systems adapt rather than dying and replicating (see SI Appendix, section S6).

    Fig. 2.

    Coevolutionary model leads self-consistently to criticality: A community of M living systems (or agents) evolves according to a genetic algorithm dynamics (27). Each agent i (i = 1, …, M) is characterized by a two-parameter (β1^i, β2^i) internal state distribution Pint(s|β1^i, β2^i), and the rest of the community acts as the external environment it has to cope with, i.e., the agents try to “understand” each other. At each simulation step, two individuals are randomly chosen and their respective relative fitnesses are computed in terms of the KL divergence from each other’s internal state probability distribution. One of the two agents is removed from the community with a probability that is smaller for the fitter agent; the winner produces an offspring, which (except for small variations/mutations) inherits its parameters. (Left) These coevolutionary rules drive the community very close to a unique localized steady state. As shown (Right), this is localized precisely at the critical point, i.e., where the generalized susceptibility or Fisher information of the internal state distribution exhibits a sharp peak (as shown by the contour plots and heat maps). The internal state distributions are parameterized as Pint(s|β1, β2) ∝ exp{β1 (N/2)(∑_{k=1}^{N} s_k/N)² + β2 ∑_{k=1}^{N} s_k}, representing a global (all-to-all) coupling of the internal nodes (see Materials and Methods). Much more complex probability distributions in which all units are not coupled to all other units—i.e., more complex networked topologies—are discussed in SI Appendix, section S4.


    Movie S1. Simulation of the coevolutionary model leading self-consistently to criticality. A community of agents or cognitive systems coevolves according to a genetic algorithm. Different colors represent different initial conditions, and individuals of different colors do not interact with each other. Each agent has an internal representation of the rest of the community, symbolized by a dot in the two-parameter space (β1 and β2). Individuals with better representations are more likely to reproduce, and offspring inherit the parameters from their parents with small mutations. The simulation shows how, independently of the initial condition, the individuals self-tune to the maximum of the Fisher information, i.e., the critical point.



    Movie S2. Simulation of the coevolutionary model for small systems. As in Movie S1, a community of agents coevolves to understand each other. Each agent constructs a representation of the environment, symbolized by a dot in the two-parameter space (β1 and β2). The information is encoded in strings of N binary variables, with N = 10 in the upper panel and N = 100 in the lower one. The Fisher information over the two-parameter space is plotted in the background. After iterating the genetic algorithm, the agents localize at the maximum of the Fisher information. As N increases, the peak approaches the critical point.


    Evolutionary Model.

    An ensemble of M agents is exposed at each particular time to a heterogeneous complex environment consisting of S independent environmental sources, each one with a different Psrc and thus parametrized by diverse values of α (see Fig. 3). The set of S sources is randomly extracted from a broadly distributed pool of possible sources occurring with different probabilities, ρsrc(α). The fitness of an individual with parameters β with respect to any given environment is taken to be a decreasing function of the average KL divergence from the diverse external stimuli: d(ρsrc|β) := ∫dα ρsrc(α) D(α|β). In the special case of just one internal state parameter, β1, we find that upon iterating the genetic algorithm (see Materials and Methods), the distribution evolves toward a stable steady-state distribution. Computer simulations show that in the case of very homogeneous environments, occurring when all sources in the pool are similar—ρsrc(α) sufficiently narrow—the optimal β strongly depends on the specific sources, resulting in detail-specific internal states (see Fig. 3 Bottom). On the other hand, if the external world is sufficiently heterogeneous (see SI Appendix, section S5), the optimal internal state becomes peaked near the critical point (see Fig. 3 and Movie S3 illustrating the evolution of agents toward the vicinity of the critical point). We note that the approach to criticality is less precise in this model than in the coevolutionary one, in which the environment changes progressively as the agents coevolve, allowing the system to approach the critical point systematically and with precision. Similar conclusions hold for an analogous “adaptive model” (see SI Appendix, section S6). Finally, one might wonder whether the resulting closeness to criticality in these models is not just a byproduct of the environment itself being critical in some way. In fact, it has been recently shown that complex environments, when hidden variables are averaged out, can be effectively described by Zipf’s law (28), a signature of criticality (3). This observation applies to some of the heterogeneous environments analyzed here, which indeed turn out to be Zipfian; however, as shown in SI Appendix, section S5, there are simple yet heterogeneous environments that are not Zipfian but nevertheless produce the same behavior.

    Fig. 3.

    Evolutionary model leading to near-criticality in complex environments. A community of M agents undergoes a genetic algorithm dynamics (27). Each agent is simultaneously exposed to diverse stimuli s provided by S different sources, each one characterized by a probability Psrc(s|αu) with u = 1, …, S, fully specified by parameters αu. At each time step, S sources are randomly drawn with probability ρsrc(αu) (in this case, a uniform distribution with support in the colored region). Each agent i (i = 1, …, M) has an internal state Pint(s|βi) aimed at representing—or coping with—the environment. Agents’ fitness increases as the mean KL divergence from the set of sources to which they are exposed decreases. The higher the fitness of an individual, the lower its probability of dying. An agent that is killed is replaced by a new individual with a parameter β inherited from one of the other agents (and, with some probability, a small variation/mutation). The community dynamically evolves and eventually reaches a steady-state distribution of parameters, p(β). The six panels in the figure correspond to different supports (colored regions) for uniform source distributions, ρsrc(αu). The dashed line is the generalized susceptibility (Fisher information) of the internal probability distribution, which exhibits a peak at the critical point separating an ordered from a disordered phase. Heterogeneous source pools (Top and Middle) lead to distributions peaked at criticality, whereas for homogeneous sources (Bottom), the communities are not critical but specialized. Stimuli distributions are parametrized in a rather simplistic way as Psrc(s|αu) ∝ exp{αu (N/2)(∑_{k=1}^{N} s_k/N)²}, while internal states take the identical form with αu replaced by βi (see Materials and Methods). In the guiding example of genetic regulatory networks, this corresponds to an extremely simple fully connected network in which the state of each gene is equally determined by all of the other genes, and hence the probability of a given state depends only on the total number of on/off genes, controlled by a single parameter.


    Movie S3. Simulation of the evolutionary model leading to criticality in complex environments. A community of agents in different environments evolves according to a genetic algorithm. Every agent is represented by a black dot on the vertical axis. At each time step, different sources are generated from the colored region, each one characterized by its own parameter. The agents build an internal representation of the sources, encoded in their internal parameter β. Individuals with better representations are more likely to reproduce, and offspring inherit the parameter β from their parents with a small mutation. When the source pools are heterogeneous, as in the left and right panels, the community evolves near the maximum of the Fisher information, i.e., the critical point. However, when the sources are very specific, as in the central panels, the agents do not become critical.


    Analytical Results for the Dynamical Models

    A generic probability distribution can be rewritten to parallel the standard notation in statistical physics, P(s|γ) = exp(−H(s|γ))/Z(γ), where the factor Z(γ) is fixed through normalization. The function H can be generically written as H(s|γ) = ∑_μ γ_μ ϕ_μ(s), where the ϕ_μ(s) are suitable functions (“observables”) of the variables s. For a specific set of parameters α characterizing an environmental source, the best possible internal state—minimizing the KL divergence—can be shown to obey ⟨ϕ_μ^int⟩_α = ⟨ϕ_μ^int⟩_β, where the index μ runs over the whole set of parameters, ⟨ϕ_μ^int⟩_α := ∑_s ϕ_μ^int(s) Psrc(s|α), and ⟨ϕ_μ^int⟩_β := ∑_s ϕ_μ^int(s) Pint(s|β). This result implies that the optimal internal state is the one that best reproduces the lowest moments of the original source distribution it seeks to cope with (the number of moments coinciding with—or being limited by—the number of free parameters). By evaluating the second derivatives (Hessian matrix), it is easy to verify that, if a solution exists, it actually corresponds to a minimum of the KL divergence (see SI Appendix, section S3).
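    As a numerical sanity check of this moment-matching condition—again within our assumed binary, all-to-all family, and with a hypothetical richer source that adds a field term the one-parameter internal model cannot represent—the β that directly minimizes the KL divergence should coincide with the β that matches ⟨ϕ⟩:

```python
import numpy as np
from scipy.special import gammaln, logsumexp
from scipy.optimize import brentq, minimize_scalar

N = 100
k = np.arange(N + 1)
phi = 0.5 * N * (k / N) ** 2                    # internal observable
log_mult = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)

def normalize(lw):
    return lw - logsumexp(lw)

# Hypothetical richer source: quadratic term plus a field term (0.8*k)
# that the one-parameter internal model lacks.
log_src = normalize(log_mult + 1.5 * phi + 0.8 * k)
p_src = np.exp(log_src)

def log_int(beta):                               # one-parameter internal state
    return normalize(log_mult + beta * phi)

def kl_from_src(beta):                           # D(src||int)
    return float(np.sum(p_src * (log_src - log_int(beta))))

def mean_phi_int(beta):
    return float(np.sum(np.exp(log_int(beta)) * phi))

target = float(np.sum(p_src * phi))              # <phi> under the source
beta_match = brentq(lambda b: mean_phi_int(b) - target, -20.0, 20.0)
beta_min = minimize_scalar(kl_from_src, bounds=(-20.0, 20.0),
                           method="bounded").x
print(beta_match, beta_min)                      # agree: moments are matched
```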

    To proceed further, we need to compute the internal state distribution in the presence of diverse sources distributed with ρsrc(α). In this case, we compute the value of β that minimizes the average KL divergence from the sources α as written above (an alternative possibility—discussed in SI Appendix, section S3—is to identify the optimal β for each specific source and then average over the source distribution), leading to the condition ⟨ϕ_μ^int⟩_β = ∫dα ρsrc(α) ⟨ϕ_μ^int⟩_α. We consider the simple example in which both the sources and the system are characterized by a single parameter, and we assume that a phase transition occurs at some parameter value α = αc; i.e., ⟨ϕ⟩_α has a sigmoid shape (which becomes steeper as N increases) with an inflection point at α = αc. (Our analysis can be extended to more general cases in which there is no built-in phase transition in the source distributions but they are merely sufficiently heterogeneous.) The two plateaus of the sigmoid function correspond to the so-called disordered and ordered phases, respectively. When ρsrc(α) has support on both sides of the sigmoid function, i.e., when it is “heterogeneous,” solving the equation for the optimal β shows that the moment to be reproduced lies somewhere between the two asymptotic values of the sigmoid, and the values of β yielding such intermediate moments are concentrated near the inflection, or critical, point αc. Indeed, as χ = −d⟨ϕ⟩_β/dβ, the critical region, where the generalized susceptibility χ has a peak, is the region of maximal variability in which different complex sources can be best accounted for through small parameter changes, in agreement with the finding that many more distinguishable outputs can be reproduced by models poised close to criticality (26).
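    The same argument can be probed numerically: under our assumed single-parameter family, solving the averaged moment condition for a broad uniform pool of sources yields a β close to the peak of the susceptibility (the match is approximate, as the text notes for the evolutionary model).

```python
import numpy as np
from scipy.special import gammaln, logsumexp
from scipy.optimize import brentq

N = 100
k = np.arange(N + 1)
phi = 0.5 * N * (k / N) ** 2
log_mult = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)

def mean_phi(g):
    lw = log_mult + g * phi
    p = np.exp(lw - logsumexp(lw))
    return float(np.sum(p * phi))

def chi(g, eps=1e-4):                           # generalized susceptibility
    return (mean_phi(g + eps) - mean_phi(g - eps)) / (2 * eps)

alphas = np.linspace(0.0, 8.0, 201)             # broad (heterogeneous) pool
target = np.mean([mean_phi(a) for a in alphas]) # averaged moment condition
beta_opt = brentq(lambda b: mean_phi(b) - target, 0.0, 8.0)

grid = np.linspace(0.0, 8.0, 401)
beta_peak = grid[int(np.argmax([chi(g) for g in grid]))]
print(beta_opt, beta_peak)                      # expected to lie close together
```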

    Discussion and Conclusions

    Under the mild assumptions that living systems need to construct good, although approximate, internal representations of the outer complex world and that such representations are encoded in terms of probability distributions, we have shown—by using concepts from statistical mechanics and information theory—that the encoding probability distributions necessarily lie where the generalized susceptibility or Fisher information exhibits a peak (25), i.e., in the vicinity of a critical point, providing the best possible compromise to accommodate both regular and noisy signals.

    In the presence of broadly different ever-changing heterogeneous environments, computational evolutionary and adaptive models vividly illustrate how a collection of living systems eventually clusters near the critical state. A more accurate convergence to criticality is found in a coevolutionary/coadaptive setup in which individuals evolve/adapt to represent with fidelity other agents in the community, thereby creating a collective “language,” which turns out to be critical.

    These ideas apply straightforwardly to genetic and neural networks—where they could contribute to a better understanding of why neural activity seems to be tuned to criticality—but have a broader range of implications for general complex adaptive systems (21). For example, our framework could be applicable to some bacterial communities for which a huge phenotypic (internal state) variability has been empirically observed (29). Such a large phenotypic diversification can be seen as a form of “bet hedging,” an adaptive survival strategy analogous to stock market portfolio management (30), which turns out to be a straightforward consequence of individuals in the community being critical. From this point of view, genetic networks diversify their “assets” among multiple phenotypes to minimize the long-term risk of extinction and maximize the long-term expected growth rate in the presence of environmental uncertainty (30). Similar bet-hedging strategies have been detected in viral populations and could be explained as a consequence of their respective communities having converged to a critical state, maximizing the hedging effect. Similarly, criticality has recently been shown to emerge through adaptive information processing in machine learning, where networks are trained to produce a desired output from a given input in a noisy environment; when tasks of very different complexity need to be learned simultaneously, networks adapt to a critical state to enhance their performance (31). In summary, criticality in some living systems could result from the interplay between their need to produce accurate representations of the world, their need to cope with many widely diverse environmental conditions, and their well-honed ability to react to external changes in an efficient way. Evolution and adaptation might drive living systems to criticality in response to this smart cartography.

    Materials and Methods

    Kullback–Leibler Divergence.

    Given two probability distributions P(s) and Q(s) for variables s, the KL divergence of Q(s) from P(s),

    D(P|Q) := ∑_s P(s) log[P(s)/Q(s)],   [2]

    quantifies the loss of information when Q(s) is used to approximate P(s) (25). Indeed, in the large-T limit, the probability ℒ that the model Q(s) generates a sequence of T observations compatible with P(s) can be computed, to leading order, as ℒ ∼ exp(−T D(P|Q)) (see SI Appendix, section S2). Therefore, maximizing the likelihood of a trial probability distribution function Q is equivalent to minimizing its KL divergence from the original one, P. In Results we use the notation D(α|β) when P(s) = Psrc(s|α) and Q(s) = Pint(s|β).
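    A direct transcription of Eq. 2, together with its likelihood reading, might look as follows in Python (the function name is ours):

```python
import numpy as np

def kl_divergence(p, q):
    """D(P|Q) = sum_s P(s) log(P(s)/Q(s)): information lost when Q is
    used to approximate P. Assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Likelihood reading: the probability that the model Q reproduces T
# observations drawn from P decays as exp(-T * D(P|Q)) to leading order.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
T = 1000
print(kl_divergence(p, q), np.exp(-T * kl_divergence(p, q)))
```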

    Fisher Information and Criticality.

    Given a probability distribution P(s|γ)—where γ can stand either for α or β—the Fisher Information is defined as

    χ_{μν}(γ) := ⟨ ∂log P(⋅|γ)/∂γ_μ ⋅ ∂log P(⋅|γ)/∂γ_ν ⟩_γ,   [3]

    where μ and ν are parameter labels and the average ⟨⋅⟩_γ is performed with respect to P(⋅|γ). It measures the amount of information encoded in the states s about the parameters γ (25). This follows from the Cramér−Rao inequality, which states that the error made when we estimate γ from one state s is, on average, greater than (or at least equal to) the inverse of the Fisher information (25). In particular, if χ happens to diverge at some point, it is possible to specify the associated parameters with maximal precision (26). With the parametrization used in the main text, the Fisher information is the generalized susceptibility in the statistical mechanics terminology and measures the response of the system to parameter variations: χ_{μν}(γ) = −∂⟨ϕ_μ⟩_γ/∂γ_ν = ⟨ϕ_μ ϕ_ν⟩_γ − ⟨ϕ_μ⟩_γ⟨ϕ_ν⟩_γ, and it is well known to peak at critical points (1, 2).
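    For the exponential-family parametrization used in the main text, this identity makes the Fisher information straightforward to compute as a variance; a sketch under our assumed single-parameter family, locating the peak numerically:

```python
import numpy as np
from scipy.special import gammaln, logsumexp

N = 100
k = np.arange(N + 1)
phi = 0.5 * N * (k / N) ** 2
log_mult = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)

def probs(beta):
    lw = log_mult + beta * phi
    return np.exp(lw - logsumexp(lw))

def fisher(beta):
    """For this one-parameter family, chi(beta) equals the variance of
    the observable phi, i.e. <phi^2> - <phi>^2."""
    p = probs(beta)
    m = np.sum(p * phi)
    return float(np.sum(p * phi ** 2) - m ** 2)

grid = np.linspace(0.0, 6.0, 301)
chis = [fisher(b) for b in grid]
print("chi peaks at beta ≈", grid[int(np.argmax(chis))])
```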

    Coevolutionary Model.

    The kth agent of the community is described by a probability distribution Pint(s|β^k) ∝ exp{−Hint(s|β^k)}, with Hint(s|β^k) = ∑_{μ=1}^{I} β_μ^k ϕ_μ^int(s), depending on parameters β^k. Starting with an ensemble of M agents whose internal parameters are extracted from an arbitrary distribution, p(β), two individuals, i and j, are randomly selected at each time step. Their relative fitnesses f_i^(j) and f_j^(i) are computed as f_i^(j) = 1 − D(β^j|β^i)/[D(β^j|β^i) + D(β^i|β^j)], and similarly for f_j^(i) (as the KL divergence is not symmetric, f_i^(j) ≠ f_j^(i) unless β^i = β^j). One of the two individuals—selected with probability equal to its relative fitness—creates an offspring, while the other one is removed from the community. The offspring inherits its parameters from its ancestor (with probability 1 − ν) or mutates (with probability ν), modifying its parameters as β → β + ξ, where ξ is a multivariate Gaussian random vector with uncorrelated components, zero mean, and standard deviation σ. Time is updated to t → t + 1/M, another couple of individuals i′ and j′ is picked, and the process is iterated. Variants of this model are described in SI Appendix, section S4.
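    A minimal sketch of these coevolutionary dynamics, simplified to the single-parameter family used in our earlier snippets (the paper's agents carry two parameters), with illustrative values for M, ν, and σ:

```python
import numpy as np
from scipy.special import gammaln, logsumexp

rng = np.random.default_rng(0)
N, M, nu, sigma, steps = 100, 100, 0.1, 0.05, 20000
k = np.arange(N + 1)
phi = 0.5 * N * (k / N) ** 2
log_mult = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)

def log_pk(beta):
    lw = log_mult + beta * phi
    return lw - logsumexp(lw)

def kl(a, b):                                   # D(a||b), source first
    lp, lq = log_pk(a), log_pk(b)
    return float(np.sum(np.exp(lp) * (lp - lq)))

betas = rng.uniform(0.0, 6.0, M)                # arbitrary initial community
for _ in range(steps):
    i, j = rng.choice(M, size=2, replace=False)
    d_ji, d_ij = kl(betas[j], betas[i]), kl(betas[i], betas[j])
    tot = d_ji + d_ij
    f_i = 0.5 if tot == 0.0 else 1.0 - d_ji / tot  # relative fitness of i
    winner, loser = (i, j) if rng.random() < f_i else (j, i)
    betas[loser] = betas[winner]                # offspring replaces the loser
    if rng.random() < nu:                       # mutate with probability nu
        betas[loser] += rng.normal(0.0, sigma)
print("community mean beta:", betas.mean())     # expected near the chi peak
```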

    Evolutionary Model.

    A community of agents receiving external stimuli from an outer, heterogeneous environment is modeled as follows. Every specific environmental source corresponds to a probability distribution Psrc(s|α) ∝ exp(−Hsrc(s|α)), with Hsrc(s|α) = ∑_{μ=1}^{E} α_μ ϕ_μ^src(s), where the parameters α are drawn from the distribution ρsrc(α). The kth agent in the community constructs an internal representation of the observed source, described by Pint(s|β^k) ∝ exp(−Hint(s|β^k)) with Hint(s|β^k) = ∑_{μ=1}^{I} β_μ^k ϕ_μ^int(s), with parameters β^k. We start with M individuals, each one equipped with an initial parameter set extracted from some arbitrary distribution p(β). At every time step, we generate S external sources, {α^u}_{u=1,…,S}, from the source pool ρsrc(α). Then we compute the average KL divergence of every individual’s internal state distribution from the external sources, d({α^u}|β^k) := ∑_{u=1}^{S} D(α^u|β^k)/S. The kth individual of the community is removed with a probability proportional to its average KL divergence (or any increasing function of it), Pkill(k) = d({α^u}|β^k)/∑_l d({α^u}|β^l), and it is replaced by an offspring of another individual randomly selected from the rest of the community. The offspring inherits its parameters from the parent, and time is updated as in the coevolutionary model.
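    Again as an illustrative sketch rather than the authors' code: the loop below implements these rules for the single-parameter family, vectorizes the KL computation over sources and agents, and applies a small mutation to every offspring (a slight simplification of the inheritance rule).

```python
import numpy as np
from scipy.special import gammaln, logsumexp

rng = np.random.default_rng(1)
N, M, S, sigma, steps = 100, 100, 10, 0.05, 5000
k = np.arange(N + 1)
phi = 0.5 * N * (k / N) ** 2
log_mult = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)

def log_pk(g):
    """Row-wise log P(k|g) for an array of parameters g."""
    lw = log_mult + np.outer(np.atleast_1d(g), phi)
    return lw - logsumexp(lw, axis=1, keepdims=True)

betas = rng.uniform(0.0, 6.0, M)
for _ in range(steps):
    alphas = rng.uniform(0.0, 6.0, S)           # heterogeneous source pool
    lp_a, lp_b = log_pk(alphas), log_pk(betas)
    p_a = np.exp(lp_a)
    # D[u, k] = KL divergence of agent k's state from source u
    D = np.sum(p_a * lp_a, axis=1, keepdims=True) - p_a @ lp_b.T
    d = D.mean(axis=0)                          # mean divergence per agent
    victim = rng.choice(M, p=d / d.sum())       # unfit agents die more often
    parent = rng.choice(np.delete(np.arange(M), victim))
    betas[victim] = betas[parent] + rng.normal(0.0, sigma)
print("population mean beta:", betas.mean())    # expected near the chi peak
```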

    Acknowledgments

    We are indebted to T. Hoang, D. Pfaff, J. Uriagereka, S. Vassanelli, and M. Zamparo for useful discussions and to W. Bialek and two anonymous referees for many insightful suggestions. A.M., J.G., and S.S. acknowledge Cariparo Foundation for financial support. M.A.M. and J.H. acknowledge support from J. de Andalucia P09-FQM-4682 and the Spanish MINECO FIS2009-08451.

    Footnotes

    • ¹J.H. and J.G. contributed equally to this work.

    • ²To whom correspondence may be addressed. Email: mamunoz@onsager.ugr.es or amos.maritan@pd.infn.it.
    • Author contributions: J.H., J.G., S.S., M.A.M., J.R.B., and A.M. designed research; J.H., J.G., and S.S. performed research; and J.H., J.G., S.S., M.A.M., J.R.B., and A.M. wrote the paper.

    • The authors declare no conflict of interest.

    • *This Direct Submission article had a prearranged editor.

    • This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1319166111/-/DCSupplemental.

    References

    1. Stanley HE (1987) Introduction to Phase Transitions and Critical Phenomena (Oxford Univ Press, London).
    2. Binney J, Dowrick N, Fisher A, Newman M (1993) The Theory of Critical Phenomena (Oxford Univ Press, Oxford).
    3. Mora T, Bialek W (2011) Are biological systems poised at criticality? J Stat Phys 144(2):268–302.
    4. Beggs JM, Plenz D (2003) Neuronal avalanches in neocortical circuits. J Neurosci 23(35):11167–11177.
    5. Nykter M, et al. (2008) Gene expression dynamics in the macrophage exhibit criticality. Proc Natl Acad Sci USA 105(6):1897–1900.
    6. Furusawa C, Kaneko K (2012) Adaptation to optimal cell growth through self-organized criticality. Phys Rev Lett 108(20):208103.
    7. Krotov D, Dubuis JO, Gregor T, Bialek W (2014) Morphogenesis at criticality. Proc Natl Acad Sci USA 111(10):3683–3688.
    8. Chen X, Dong X, Be’er A, Swinney HL, Zhang HP (2012) Scale-invariant correlations in dynamic bacterial clusters. Phys Rev Lett 108(14):148101.
    9. Bialek W, et al. (2012) Statistical mechanics for natural flocks of birds. Proc Natl Acad Sci USA 109(13):4786–4791.
    10. Jensen HJ (1998) Self-Organized Criticality: Emergent Complex Behavior in Physical and Biological Systems (Cambridge Univ Press, Cambridge, UK).
    11. Edlund JA, et al. (2011) Integrated information increases with fitness in the evolution of animats. PLOS Comput Biol 7(10):e1002236.
    12. Chialvo DR (2010) Emergent complex neural dynamics. Nat Phys 6(10):744–750.
    13. Beggs JM (2008) The criticality hypothesis: How local cortical networks might optimize information processing. Philos Trans A Math Phys Eng Sci 366(1864):329–343.
    14. Kinouchi O, Copelli M (2006) Optimal dynamical range of excitable networks at criticality. Nat Phys 2(5):348–351.
    15. Mora T, Walczak AM, Bialek W, Callan CG Jr. (2010) Maximum entropy models for antibody diversity. Proc Natl Acad Sci USA 107(12):5405–5410.
    16. Shew WL, Plenz D (2013) The functional benefits of criticality in the cortex. Neuroscientist 19(1):88–100.
    17. Langton C (1990) Computation at the edge of chaos: Phase transitions and emergent computation. Physica D 42(1-3):12–37.
    18. Bertschinger N, Natschläger T (2004) Real-time computation at the edge of chaos in recurrent neural networks. Neural Comput 16(7):1413–1436.
    19. Kauffman S (1993) The Origins of Order: Self-Organization and Selection in Evolution (Oxford Univ Press, New York).
    20. Wagner A (2005) Robustness and Evolvability in Living Systems (Princeton Univ Press, Princeton, NJ).
    21. Gros C (2008) Complex and Adaptive Dynamical Systems: A Primer (Springer, New York).
    22. de Jong H (2002) Modeling and simulation of genetic regulatory systems: A literature review. J Comput Biol 9(1):67–103.
    23. Huang S (2010) Cell lineage determination in state space: A systems view brings flexibility to dogmatic canonical rules. PLoS Biol 8(5):e1000380.
    24. Balleza E, et al. (2008) Critical dynamics in genetic regulatory networks: Examples from four kingdoms. PLoS ONE 3(6):e2456.
    25. Cover TM, Thomas J (1991) Elements of Information Theory (Wiley, New York).
    26. Mastromatteo I, Marsili M (2011) On the criticality of inferred models. J Stat Mech 2011:P10012.
    27. Goldberg DE (1989) Genetic Algorithms in Search, Optimization, and Machine Learning (Addison−Wesley Professional, Reading, MA).
    28. Schwab DJ, Nemenman I, Mehta P (2013) Zipf’s law and criticality in multivariate data without fine-tuning. arXiv:1310.0448.
    29. Kussell E, Leibler S (2005) Phenotypic diversity, population growth, and information in fluctuating environments. Science 309(5743):2075–2078.
    30. Wolf DM, Vazirani VV, Arkin AP (2005) Diversity in times of adversity: Probabilistic strategies in microbial survival games. J Theor Biol 234(2):227–253.
    31. Goudarzi A, Teuscher C, Gulbahce N, Rohlf T (2012) Emergent criticality through adaptive information processing in Boolean networks. Phys Rev Lett 108(12):128702.