Mutual information between in- and output trajectories of biochemical networks

Reading time: 5 minutes

📝 Original Info

  • Title: Mutual information between in- and output trajectories of biochemical networks
  • ArXiv ID: 0901.0280
  • Date: 2009-01-04
  • Authors: Filipe Tostevin, Pieter Rein ten Wolde

📝 Abstract

Biochemical networks can respond to temporal characteristics of time-varying signals. To understand how reliably biochemical networks can transmit information we must consider how an input signal as a function of time--the input trajectory--can be mapped onto an output trajectory. Here we estimate the mutual information between in- and output trajectories using a Gaussian model. We study how reliably the chemotaxis network of E. coli can transmit information on the ligand concentration to the flagellar motor, and find the input power spectrum that maximizes the information transmission rate.

📄 Full Content

Cells continually have to respond to a wide range of intra- and extracellular signals. These signals have to be detected, encoded, transmitted and decoded by biochemical networks. In the absence of biochemical noise, a particular input signal will lead to a unique output signal, allowing the cell to respond appropriately. Recent experiments, however, have vividly demonstrated that biochemical networks can be highly stochastic [1], and a key question is therefore how reliably biochemical networks can transmit information in the presence of noise.

To address this question, we must recognize that the message may be contained in the temporal dynamics of the input signal. A well-known example is bacterial chemotaxis, where the concentration of the intracellular messenger protein depends not on the current ligand concentration, but rather on whether this concentration has changed in the recent past [2]; the response of the network thus depends on the history of the input signal. Moreover, the input signal may be encoded in the temporal dynamics of the signal transduction pathway. For example, stimulation of the rat PC-12 system with a neuronal growth factor gives rise to a sustained response of the Raf-Mek-Erk pathway, while stimulation with an epidermal growth factor leads to a transient response [3]. In all these cases, the message is encoded not in the concentration of some chemical species at a specific moment in time, but rather in its concentration as a function of time. Importantly, whether the processing network can reliably respond to a signal depends not only on the instantaneous value of the signal, but also on the time scale over which it changes. In general, the in- and output signals of biochemical networks are time-continuous signals with non-zero correlation times. To understand how reliably biochemical networks can transmit information, we need to know how accurately an input signal as a function of time, the input trajectory, can be mapped onto an output trajectory. In this article, we take an information theoretic approach to this question.

A natural measure for the quality of information transmission is the mutual information between the input signal I and the network response O, given by M(I, O) = H(O) - H(O|I) [4], where H(O) is the entropy of the output and H(O|I) is the conditional entropy of the output given the input.
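For a discrete channel, this definition can be evaluated directly from the joint distribution of input and output. A minimal sketch in Python; the binary channels below are illustrative toys, not examples from the paper:

```python
import numpy as np

def mutual_information(p_joint):
    """M(I, O) = H(O) - H(O|I), in bits, from a joint distribution p(i, o).

    Assumes every row of p_joint (every input value) has nonzero probability.
    """
    p_joint = np.asarray(p_joint, dtype=float)
    p_i = p_joint.sum(axis=1)                      # marginal p(i)
    p_o = p_joint.sum(axis=0)                      # marginal p(o)
    # H(O) = -sum_o p(o) log2 p(o), skipping zero-probability outputs
    h_o = -np.sum(p_o[p_o > 0] * np.log2(p_o[p_o > 0]))
    # H(O|I) = -sum_{i,o} p(i,o) log2 p(o|i)
    p_o_given_i = p_joint / p_i[:, None]
    mask = p_joint > 0
    h_o_given_i = -np.sum(p_joint[mask] * np.log2(p_o_given_i[mask]))
    return h_o - h_o_given_i

# Noiseless binary channel: output copies input, so M = H(O) = 1 bit.
p_copy = np.array([[0.5, 0.0],
                   [0.0, 0.5]])
print(mutual_information(p_copy))  # 1.0
```

With a fully noisy channel (output independent of input, e.g. all entries 0.25) the same function returns 0 bits, since then H(O|I) = H(O).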

The mutual information has previously been computed for biochemical networks [5,6], although in these studies the temporal correlations in the input signals were ignored. Here we investigate the mutual information between in- and output trajectories.

Mutual information between trajectories. We consider a biochemical network in steady state which has one input species S with copy number S and one output species X with copy number X. The mutual information between in- and output trajectories is found by taking the possible input and output signals I and O to be the possible trajectories S(t) and X(t):

M(S, X) = \int \mathcal{D}S(t) \int \mathcal{D}X(t) \, p(S(t), X(t)) \log \frac{p(S(t), X(t))}{p(S(t)) \, p(X(t))}.   (1)

Calculating the mutual information between trajectories is in general a formidable task, given the high dimensionality of the trajectory space. However, for a Gaussian model, which we will employ here, the mutual information can be obtained analytically.

In this Gaussian model, it is assumed that the input signal consists of small temporal variations around some steady-state value, obeying Gaussian statistics. This limits our approach, but seems a reasonable simplification given that the input statistics have not been measured for most, if not all, biological systems. Moreover, we assume that the coupling between the components can be linearized and that the intrinsic noise is small and Gaussian, according to the linear-noise approximation [7]; recent modeling studies have shown this gives a good description of the noise properties of a large class of biochemical networks, even when the copy numbers are as low as ten [6,8]. Under these assumptions the joint probability distribution of the in- and output signals is described by a multivariate Gaussian,

P(v) = \frac{1}{(2\pi)^N |Z|^{1/2}} \exp\left( -\frac{1}{2} v^T Z^{-1} v \right).   (2)
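Under the linear-noise approximation, small deviations from steady state relax as a linear Gaussian (Ornstein-Uhlenbeck-type) process. A minimal simulation sketch; the relaxation time, noise strength, and step size below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

# Ornstein-Uhlenbeck process: dx = -(x/tau) dt + sigma dW.
# Stationary statistics: variance sigma^2 * tau / 2, and
# autocorrelation exp(-lag/tau) -- the Gaussian picture assumed in the text.
rng = np.random.default_rng(0)
tau, sigma, dt, n_steps = 1.0, 1.0, 0.01, 200_000

a = np.exp(-dt / tau)                                  # exact decay per step
noise_std = np.sqrt(0.5 * sigma**2 * tau * (1.0 - a**2))
x = np.empty(n_steps)
x[0] = 0.0
for n in range(n_steps - 1):
    x[n + 1] = a * x[n] + noise_std * rng.standard_normal()

print(x.var())                                  # ~ sigma^2 * tau / 2 = 0.5
print(np.corrcoef(x[:-100], x[100:])[0, 1])     # ~ exp(-1) at lag tau
```

The sampled variance and autocorrelation should agree with the analytic Gaussian predictions to within sampling error, which is the sense in which the linear-noise approximation specifies the full joint distribution of the fluctuations.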

The vector v ≡ (s, x), with s = (s(t_1), s(t_2), \ldots, s(t_N)) constructed from the input signal sampled at times t = t_1, \ldots, t_N, and x = (x(t_1), x(t_2), \ldots, x(t_N)); s(t) and x(t) are the deviations of S and X away from their steady-state values, ⟨S⟩ and ⟨X⟩, respectively. The 2N × 2N covariance matrix Z has the form

Z = \begin{pmatrix} C_{ss} & C_{sx} \\ C_{xs} & C_{xx} \end{pmatrix},   (3)

where (C_{\alpha\beta})_{ij} = C_{\alpha\beta}(t_i - t_j) = ⟨\alpha(t_i)\beta(t_j)⟩ are the (cross-)correlation functions of the in- and output signals.

In the limit that the in- and output signals are time-continuous, the mutual information rate between the in- and output trajectories becomes

R(s, x) = -\frac{1}{4\pi} \int_{-\infty}^{\infty} d\omega \, \log\left[ 1 - \frac{|S_{sx}(\omega)|^2}{S_{ss}(\omega) S_{xx}(\omega)} \right],   (4)

where the power spectrum S_{\alpha\beta}(\omega) is the Fourier transform of C_{\alpha\beta}(t). When the output signal is measured as a function of time, R(s, x) is the rate at which the information on the input trajectory increases with time; importantly, R(s, x) takes into account temporal correlations in the in- and output signals. We emphasize that Eq. 4 is exact only for linear systems with Gaussian statistics. Importantly, however, Eq. 4 can also be applied to systems which do not obey Gaussian statistics and to non-linear systems; in these cases it provides a lower bound on the channel capacity.
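The rate can be evaluated numerically once the power spectra are known. A sketch, assuming the Gaussian-channel form R = -(1/4π) ∫ dω ln[1 - |S_sx(ω)|²/(S_ss(ω) S_xx(ω))] stated above; the Lorentzian input spectrum, first-order gain, and flat noise floor are illustrative assumptions, not the chemotaxis network of the paper:

```python
import numpy as np

def info_rate(omega, s_ss, s_sx_sq, s_xx):
    """R = -(1/4pi) * integral dω ln[1 - |S_sx|^2/(S_ss S_xx)], nats per unit time.

    omega is a uniform grid; the integral is a simple Riemann sum.
    """
    integrand = -np.log(1.0 - s_sx_sq / (s_ss * s_xx))
    return np.sum(integrand) * (omega[1] - omega[0]) / (4.0 * np.pi)

# Illustrative linear channel x = G*s + intrinsic noise:
omega = np.linspace(-200.0, 200.0, 200_001)
tau, sigma2 = 1.0, 1.0
s_ss = 2.0 * sigma2 * tau / (1.0 + (omega * tau) ** 2)   # Lorentzian input spectrum
gain_sq = 1.0 / (1.0 + omega**2)                          # first-order response |G|^2
noise = 0.1                                               # flat intrinsic-noise spectrum
s_xx = gain_sq * s_ss + noise                             # output spectrum
s_sx_sq = gain_sq * s_ss**2                               # |S_sx|^2 = |G|^2 S_ss^2

rate = info_rate(omega, s_ss, s_sx_sq, s_xx)
print(rate)   # information transmission rate, nats per unit time
```

For this linear channel the bracket reduces to N(ω)/(|G|²S_ss + N), so the integrand is ln(1 + |G|²S_ss/N): frequencies where the signal stands above the noise floor dominate the rate, which is the frequency-domain picture behind optimizing the input power spectrum.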

…(Full text truncated)…

Reference

This content is AI-processed based on ArXiv data.
