Biological cybernetics: from biology to electronics via statistical mechanics
This research branch stems from (interconnected) investigations in biological networks, namely from comparing neural with immune networks, and arises with the aim of filling the descriptive gap between genuine "theoretical-physics-oriented" theories (described by physical order parameters, e.g. the magnetization) and modern (i.e. Lab-on-Chip-like) experiments "driven by modular molecular biology" (described by chemical order parameters, e.g. the concentration).
The gap to fill can be summarized in a nutshell as follows: in neural networks each single agent, that is the neuron, interacts with the others via spike emission (which ultimately means electric current) flowing from the axon of the "sender" neuron (the emitter) to the dendrites of the receivers (the two being coupled by positive or negative synapses). Despite the purely biological flavor, by a lucky coincidence the information carrier in neural networks is a "quantum" (a spike) of electric current; since the latter acts both as an information transporter in biological matter and as a key observable of physics, this allowed physicists to move from "standard physics" to research in neural networks with relatively little difficulty. Unfortunately, the same analogy does not hold for immune networks, and more generally the largest part of biological computation does not involve electric currents.
Indeed these systems (focusing on the immune response to fix ideas, hence where each single agent is a lymphocyte and no longer a neuron) clearly perform pattern recognition, information storage and retrieval (information processing in general) and share many similarities with neural networks and artificial automata; however, the key mechanisms at the basis of information processing are not handled by electric current flows but by suitable chemical concentrations. This makes the cybernetic equivalence much more challenging (but also much more interesting).
Recently we have developed a one-to-one mapping among players in artificial intelligence (flip-flops, operational amplifiers, latches), statistical mechanical models (respectively antiferromagnets, ferromagnets above the critical noise level and ferromagnets below the critical noise level) and biochemical kinetics (respectively of anticooperative, cooperative and ultra-sensitive flavor). The whole is summarized in the next diagram.
It is crucial to understand that for modeling biological complexity, even at this very elementary level, we need both positive interactions (cooperativity, that is ferromagnetism, needed to amplify signals and contribute to information storage) and negative interactions (competition, that is antiferromagnetism, fundamental to accomplish information storage) and, while largely verified experimentally, this is crucial at the theoretical level because ensembles of ferromagnets and antiferromagnets are spin glasses, and spin glasses are the harmonic oscillators of complex systems, among whose research branches we find neural networks too.
This figure summarizes the behavioral analogies among non-organic matter, organic matter and electronic components. In the first row, pictures of three biological systems exhibiting cooperativity are shown, namely Mitogen-activated protein kinase 14 (positive cooperativity, panel a), Ca2+/calmodulin-dependent protein kinase II (ultra-sensitive cooperativity, panel b) and synaptic glutamate receptors (negative cooperativity, panel c).
The related saturation curves (binding isotherms) are shown in the second row (panels d, e and f, respectively), where symbols with their error bars stand for real data and lines are best fits performed through the analytical expression obtained from statistical mechanics (readers versed in statistical mechanics will immediately notice the hyperbolic tangent at work). Note further that in panel d it is possible to see the "saturation" phenomenon, as the first and last experimental points lie far from the linear fit (red line) while they are perfectly accounted for by the hyperbolic tangent predicted by statistical mechanics (green line), whose correspondence with saturation in electronics is represented in panel l. Notice further that in panel e we compare the ultra-sensitive fit (solid line) with a simple cooperative fit (dashed line): at small substrate concentration the latter does not match, within its variance, the data points (measured so accurately that error bars are not reported), while the former is in perfect agreement with the data.
In the third row we sketch the cybernetic counterparts, i.e., the operational amplifier (panel g), represented as an inverted flip-flop mirroring the symmetry by which we presented the statistical mechanics framework (the standard amplifier is shown in fig. 3), the analog-to-digital converter (panel h) and the flip-flop (panel i). The (theoretical) transfer functions corresponding to these circuits are finally shown in the fourth row (panels l, m and n, respectively) for visual comparison with the second row.
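The tanh-shaped fits mentioned above can be illustrated with a small self-contained sketch. The data, the parametrization Y = (1 + tanh(a(log c - b)))/2 and the grid-search fitter below are all illustrative stand-ins (the actual fits in the figure use the full statistical mechanical expression and a proper optimizer), but they show how a hyperbolic tangent recovers a sigmoidal binding isotherm:

```python
import math

def tanh_isotherm(log_c, a, b):
    """Saturation fraction Y = (1 + tanh(a*(log_c - b))) / 2.
    'a' plays the role of an effective slope (Hill-like steepness),
    'b' of the half-saturation point on the log-concentration axis."""
    return 0.5 * (1.0 + math.tanh(a * (log_c - b)))

# Synthetic "measurements" from a known sigmoid (not the paper's data)
true_a, true_b = 1.5, 0.0
xs = [-3 + 0.5 * k for k in range(13)]          # log-concentrations
ys = [tanh_isotherm(x, true_a, true_b) for x in xs]

# Crude grid-search least squares (stands in for a real optimizer)
best = None
for a10 in range(1, 41):            # a in 0.1 .. 4.0
    for b10 in range(-20, 21):      # b in -2.0 .. 2.0
        a, b = a10 / 10.0, b10 / 10.0
        sse = sum((tanh_isotherm(x, a, b) - y) ** 2 for x, y in zip(xs, ys))
        if best is None or sse < best[0]:
            best = (sse, a, b)

print(best[1], best[2])   # recovers a = 1.5, b = 0.0 on noiseless data
```

On noiseless synthetic data the grid search recovers the generating parameters exactly; with real (noisy) isotherms one would of course use a proper least-squares routine.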
(A PARENTHESIS: UNIVERSALITY RELOADED)
From the 1960s to the 1990s, "universality" was a keyword in the statistical mechanical literature on phase transitions, meant to highlight the deep structural analogies that several (actually very disparate) systems share "close to criticality".
In recent years, with the extension of the range of applicability of statistical mechanics (nowadays covering subjects as widespread as biological networks, economic problems, materials science, etc.), we are discovering a novel class of universal behavior: the ways (i.e. the main schemes, the fundamental "bricks" building the wall of computation) through which systems perform information processing seem to be very similar. In this toy plot I show the response curve for four different systems: (a) an operational amplifier, where the x axis is the input voltage and the y axis the output voltage (the link between them being called the "transfer function"); (b) a ferromagnet, where the x axis is the external field and the y axis the magnetization (their link being called the "self-consistency"); (c) a cortical neuron, where the x axis is the sum of the afferent currents and the y axis the spike intensity (their link being called the "response function"); (d) a chemical reaction, where the x axis is the concentration of the substrate (on a logarithmic scale) and the y axis the fraction of sites binding the substrate (their relation being known as the "saturation curve").
Thus we have given some evidence that there is a structural (formal, logical and analytical) equivalence among transfer functions in electronics, saturation curves in reaction kinetics and self-consistencies in statistical mechanics: we are building the first concrete bridge that allows one to move among different disciplines while still keeping "universal glasses" for decoding their inner complexity.
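One piece of this bridge can be verified in a few lines: the classical Hill saturation curve of reaction kinetics is algebraically identical to a shifted hyperbolic tangent of the log-concentration, i.e. exactly the sigmoid of a transfer function or a mean-field self-consistency (the symbols K and n below are the usual half-saturation constant and Hill coefficient):

```python
import math

def hill(c, K, n):
    """Hill saturation curve: fraction of occupied sites at concentration c."""
    return c**n / (K**n + c**n)

def tanh_form(c, K, n):
    """The same curve written in 'statistical mechanics' form:
    Y = (1 + tanh((n/2) * ln(c/K))) / 2."""
    return 0.5 * (1.0 + math.tanh(0.5 * n * math.log(c / K)))

# The two expressions agree to machine precision at any concentration
for c in [0.01, 0.1, 1.0, 10.0, 100.0]:
    assert abs(hill(c, K=2.0, n=3.0) - tanh_form(c, K=2.0, n=3.0)) < 1e-12
print("Hill saturation curve == tanh sigmoid of log-concentration")
```

The identity follows from tanh(x) = (e^{2x} - 1)/(e^{2x} + 1) with x = (n/2) ln(c/K), which is why log-scale binding isotherms and magnetization-versus-field curves look the same.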
One step forward, we enlarged this scheme to also cover allosteric reactions à la Monod (not accounted for in the previous scheme) and we have been able to build a whole theory of biological stochastic gates, showing, through statistical mechanics (which plays here the role of a unifying language), how enzymes (or cascades of their interactions) may account for a natural version of the stochastic (bio-)logical gates, namely the YES, NOT, OR, NOR, AND, NAND logic gates, as shown in the next summarizing table.
These plots have been obtained considering only Monod-Wyman-Changeux (MWC) allosteric kinetics (the kind usually involved in enzymatic catalysis). It should be pointed out that MWC cooperativity is not sharply equivalent to classical Hill cooperativity (which gave rise to the previous summary table).
The divergence between MWC cooperativity (which is one-body, with an activator as an extra degree of freedom) and cooperativity à la Hill or Adair (which is two-body) also implies differences in the outcomes of these reaction kinetics.
By merging statistical mechanics and mathematical logic, we can finally contribute to understanding why, and in what respects, they differ.
Often, in the literature, scientists refer to the allosteric AND gate in cooperative terms: while the intuitive meaning of this statement is obvious (because, roughly speaking, both ligands must be sufficiently present for the activation of the gate), strictly speaking it does not straightforwardly agree with the corresponding statistical mechanical definition of cooperativity, since in the latter cooperativity is strictly related to the presence of a two-body coupling (i.e. a contribution to the Hamiltonian) in the form of a term H = \sum_{ij} J_{ij} \sigma_i \sigma_j, such that if 0 < J < 1 we speak of positive cooperativity, if J > 1 we deal with ultra-sensitive kinetics, while for J < 0 we speak of negative cooperativity and, clearly, for J = 0 Michaelis-Menten kinetics is recovered.
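The J-regimes above can be sketched numerically. In the sketch below (an illustrative mean-field parametrization, not the full model of the paper) the bound fraction Y is obtained from the magnetization-like variable m = 2Y - 1 satisfying m = tanh(J m + h), with h = (1/2) ln(c/K) playing the role of the external field; one can check that J = 0 collapses exactly onto the Michaelis-Menten hyperbola:

```python
import math

def saturation(c, K, J, iters=200):
    """Mean-field self-consistency for the bound fraction Y:
    set m = 2Y - 1 and h = (1/2) ln(c/K), then iterate m = tanh(J*m + h).
    J = 0: non-interacting sites (Michaelis-Menten);
    0 < J < 1: positive cooperativity (steeper sigmoid);
    J < 0: negative cooperativity (flatter curve);
    J > 1: ultra-sensitive regime (discontinuous in the limit)."""
    h = 0.5 * math.log(c / K)
    m = 0.0
    for _ in range(iters):
        m = math.tanh(J * m + h)
    return 0.5 * (1.0 + m)

# J = 0 must reduce to the Michaelis-Menten hyperbola Y = c / (c + K)
for c in [0.1, 1.0, 5.0, 50.0]:
    assert abs(saturation(c, K=1.0, J=0.0) - c / (c + 1.0)) < 1e-12
# A positive coupling enhances binding above the half-saturation point
assert saturation(2.0, K=1.0, J=0.8) > saturation(2.0, K=1.0, J=0.0)
print("J = 0 recovers Michaelis-Menten; 0 < J < 1 steepens the isotherm")
```

The check works because tanh((1/2) ln(c/K)) = (c - K)/(c + K), so Y = (1 + m)/2 = c/(c + K) when J = 0.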
The AND gate is thought to simulate a cooperative interaction among its inputs but this is not strictly true: as discussed above, in a truly cooperative system there is a mutual enhancement of the activation probability, which grows continuously as the ligand concentration is increased; conversely, in the AND counterpart such mutual enhancement is lacking, and the presence of both ligands is simply necessary for activation. In other words, it is possible to realize (biochemically) an AND gate only when a significant (that is, high-concentration) amount of both ligands is present, independently of the relative percent concentration of either of them. This is evident when comparing the plots of the AND gate and the OR gate (assuming the latter to behave effectively in a cooperative way, as we have also proved): the logical duality of AND and OR would impose that these two plots be symmetrical, but this is not the case, because the AND plot presents a cut in the (0, 0) quadrant while the OR one does not show this pathology. As we do not want to burden the reader with too many insights into biochemical logic, and as a logical basis for computing has in any case been achieved, both in terms of amplifiers and flip-flops and in the more abstract terms of logic gates, we move on to analyzing the agreement between this theory/perspective and the available phenomenology.
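The AND/OR asymmetry can be made concrete with a toy probabilistic model (purely illustrative: Hill-type single-ligand activation probabilities combined independently, not the actual enzymatic gates of the paper). The AND output needs both ligands at high concentration, while the OR output saturates as soon as either one is abundant:

```python
def Y(c, K=1.0, n=2.0):
    """Toy Hill-type activation probability for a single ligand."""
    return c**n / (K**n + c**n)

def p_and(cA, cB):
    """Toy AND gate: activation requires both ligands to be bound."""
    return Y(cA) * Y(cB)

def p_or(cA, cB):
    """Toy OR gate: activation if at least one ligand is bound."""
    return 1.0 - (1.0 - Y(cA)) * (1.0 - Y(cB))

hi, lo = 10.0, 0.01
print(p_and(hi, hi), p_and(hi, lo))   # high only when BOTH inputs are high
print(p_or(hi, lo), p_or(lo, hi))     # high as soon as EITHER input is high
```

Scanning (cA, cB) over a grid with these functions reproduces the qualitative difference described above: p_and stays flat (the "cut") near the origin along either axis, whereas p_or rises along both axes separately.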
One of the main goals of this research branch is to develop robust bridges among the "order parameters" of different disciplines in models whose behavior is "structurally similar"; in particular, here, we emphasize the equivalence between mean-field ferromagnetism and classical cooperative chemical kinetics.
In the former the role of order parameter is played by the magnetization (namely, in a system of N spins whose values can be +1 or -1 only, their arithmetic average), while in the latter this role is played by the saturation fraction, that is, the fraction of the sites (over the N possible ones) where ligands are bound.
In these plots Y represents the "saturation curve", that is, the saturation fraction versus the logarithm of the ligand concentration. In the left panel, examples of cooperative kinetics are shown. In particular, dots represent real data (blue for Ca2+/calmodulin-dependent protein kinase, green and violet for two glycine-dependent riboswitches), while the best fits are achieved using the bridge above on the Curie-Weiss model for ferromagnetic transitions, hence reporting the magnetization versus the external field (at a temperature above the critical one).
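The role of the critical temperature in these fits can be sketched numerically. Below criticality (reduced coupling J > 1 in the toy parametrization used here, which is an illustration rather than the paper's fitted model) the Curie-Weiss self-consistency m = tanh(J m + h) admits two stable branches at zero field, and the order parameter jumps discontinuously as the field (i.e. the log-concentration) crosses zero:

```python
import math

def fixed_point(J, h, m0, iters=500):
    """Iterate the Curie-Weiss self-consistency m = tanh(J*m + h)
    from initial condition m0. In reduced units, J > 1 corresponds
    to a temperature below the critical one."""
    m = m0
    for _ in range(iters):
        m = math.tanh(J * m + h)
    return m

J = 1.5                                # below criticality
m_up = fixed_point(J, 0.0, +0.1)       # branch reached from above
m_dn = fixed_point(J, 0.0, -0.1)       # branch reached from below
print(m_up, m_dn)                      # two symmetric nonzero solutions

# Mapped back to chemistry via Y = (1 + m) / 2, the saturation
# fraction jumps discontinuously at the half-saturation concentration,
# which is the first-order transition exploited for ultra-sensitive fits.
```

Above criticality (J < 1) the same iteration has a single solution m = 0 at h = 0, giving the smooth sigmoids of the left panel.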
In the right plot an example of "ultrasensitive kinetics" (with a high Hill value) is shown separately, because its fit exploits the low-temperature discontinuity of the order parameter of the model (a first-order phase transition), which would spoil fitting procedures implicitly assuming continuity and which is not available in chain (one-dimensional) models of proteins.
One step forward: in these plots on the right of the screen we show the remarkable agreement between anti-cooperative reaction kinetics (of azide, red; thiocyanate, blue; cyanide, yellow; and imidazole, green) and the behavior predicted by a statistical mechanical model of a bipartite (mean-field) antiferromagnet.
Note further that, as summarized in the first plot at the beginning of this page, these systems, from a cybernetic perspective, behave as flip-flops; thus, while minimizing their free energy to obey thermodynamics (tacitly accounted for in the chemical kinetic reactions), these systems are also processing information and, through a constellation of small steps forward, we are finally becoming able to decrypt the complexity of the information processing hidden in biological matter.
This scheme summarizes the kinds of systems we are considering here: Mono-receptor/Mono-ligand (a), Mono-receptor/Double-ligand (b) and Double-receptor/Double-ligand (c). In this cartoon all molecules are shown as dimeric, but cases a and b also work with monomeric structures. In the Mono-receptor/Mono-ligand case only one kind of receptor and one kind of ligand (compatible with the receptor) are considered; in the Mono-receptor/Double-ligand case we still have one kind of receptor but two different ligands, both compatible with the receptor; in the Double-receptor/Double-ligand case we consider molecules displaying two different receptors in the presence of two different ligands, each compatible with only one receptor. Their logical equivalence is also shown at the end.
Example of our biological AND and OR stochastic gates made of enzyme cascades
The present gate realizations involve two enzymatic processes and a buffering step. The first enzyme is esterase. Here esterase reacts with ethyl butyrate (concentration A(t), Input 1) or methyl butyrate (concentration B(t), Input 2), or both, biocatalyzing the production of ethanol and methanol, respectively. Butyric acid (concentration U(t)) is a byproduct of the process, and its production lowers the pH of the system from its initial value of pH(0) = 9.0. The pH of the system varies from 9.0 down to as low as 4.2, depending on the input concentrations and on whether the buffer is present and in what initial quantity.
Tris buffer (initially added as T0 moles per unit volume) was introduced for “filtering” as explained shortly. The experiments were carried out with no buffering: T0 = 0 mM, with buffering yielding a high-quality OR gate, T0 = 4 mM, and with excessive buffering which impairs the OR gate realization, T0 = 8 mM. The added buffer instantaneously equilibrates to produce a protonated form TH+. This hinders the reduction of the pH. However, this mild alkaline buffering effect persists only in the range of pH from 9 to slightly under 7.2, or until Tris is fully in
the protonated form.
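The "filtering" role of the buffer can be caricatured in a few lines. The sketch below is a deliberately crude cartoon, not the kinetic model of the experiment: it treats the produced acid as fully dissociating, uses the textbook Henderson-Hasselbalch relation for the Tris/TrisH+ couple (pKa of Tris is roughly 8.1 at room temperature), and collapses the unbuffered or exhausted-buffer case onto the observed pH floor of 4.2:

```python
import math

PKA_TRIS = 8.1   # approximate pKa of Tris at room temperature

def ph_sketch(U, T0, ph0=9.0, ph_floor=4.2):
    """Cartoon of the buffered pH response.
    U  = acid produced so far (mM, treated as a strong acid: crude!),
    T0 = initial Tris (mM).
    While free Tris remains, pH follows Henderson-Hasselbalch for
    Tris/TrisH+; with no buffer, or once it is fully protonated,
    the pH collapses toward the unbuffered floor."""
    if T0 > 0 and U < T0:
        protonated = U               # moles of Tris converted to TrisH+
        free = T0 - U                # Tris still able to absorb protons
        if protonated == 0:
            return ph0               # no acid yet: initial pH
        return PKA_TRIS + math.log10(free / protonated)
    return ph_floor

print(ph_sketch(U=2.0, T0=4.0))   # buffered: pH pinned near the Tris pKa
print(ph_sketch(U=2.0, T0=0.0))   # unbuffered: pH crashes to the floor
```

Even this cartoon reproduces the qualitative behavior described above: with T0 = 4 mM the pH sits near the pKa plateau (the sigmoid "filter" that cleans up the OR gate), while with T0 = 0 the same acid production sends the pH straight down.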