The effect of feedback on the fidelity of information transmission of time-varying signals

Reading time: 6 minutes

📝 Original Info

  • Title: The effect of feedback on the fidelity of information transmission of time-varying signals
  • ArXiv ID: 1002.2595
  • Date: 2010-02-12
  • Authors: Wiet de Ronde, Filipe Tostevin, Pieter Rein ten Wolde

📝 Abstract

Living cells are continually exposed to environmental signals that vary in time. These signals are detected and processed by biochemical networks, which are often highly stochastic. To understand how cells cope with a fluctuating environment, we therefore have to understand how reliably biochemical networks can transmit time-varying signals. To this end, we must understand both the noise characteristics and the amplification properties of networks. In this manuscript, we use information theory to study how reliably signalling cascades employing autoregulation and feedback can transmit time-varying signals. We calculate the frequency-dependence of the gain-to-noise ratio, which reflects how reliably a network transmits signals at different frequencies. We find that the gain-to-noise ratio may differ qualitatively from the power spectrum of the output, showing that the latter does not directly reflect signaling performance. Moreover, we find that auto-activation and auto-repression increase and decrease the gain-to-noise ratio at all frequencies, respectively. Positive feedback specifically enhances information transmission at low frequencies, while negative feedback increases signal fidelity at high frequencies. Our analysis not only elucidates the role of autoregulation and feedback in naturally-occurring biological networks, but also reveals design principles that can be used for the reliable transmission of time-varying signals in synthetic gene circuits.

💡 Deep Analysis

Deep Dive into The effect of feedback on the fidelity of information transmission of time-varying signals.

📄 Full Content

Living cells constantly have to respond and adapt to a changing environment. In some cases, such as in response to a changing sugar concentration [1], a cell may wish to integrate out rapid variations and only respond to slow variations of the environmental signal, while in other cases, such as osmoadaptation [2] or bacterial chemotaxis [3], the cell needs to do the opposite: respond to rapid but not slow variations (adaptation). Indeed, to understand how cells cope with a fluctuating environment, we have to understand how cells transduce time-varying signals. Cells detect, process, and transduce signals via biochemical networks, which are the information processing devices of life. However, experiments in recent years have demonstrated that biochemical networks are often highly stochastic [4,5]. This raises the question of how reliably biochemical networks can transmit time-varying signals in the presence of noise.

Interestingly, biochemical networks exploit commonly recurring architectures [6,7], such as autoregulation, cascades, and feedback, to process signals. These network motifs often implement signal amplification in order to raise the level of the input signal relative to the noise. Amplification can be characterised by the gain, the fold-change in the signal amplitude. However, it is important to recognise that such amplification can not only increase the levels of the desired signal, but can also amplify the noise itself. Therefore, to understand the possibilities and limitations of different network motifs for enhancing the fidelity of signal transduction, we need to understand how both the signal and the noise are propagated through these motifs. Specifically, information theory indicates that the reliability of signal transmission is determined by the ratio of the gain of the network to the total noise in the output signal: the gain-to-noise ratio. Moreover, to assess how reliably signals of different temporal characteristics are transduced, we have to understand the frequency dependence of the gain and the noise. Importantly, we expect that different network architectures will affect the frequency-dependence of the gain and the noise differently, which means that we have to study both these quantities. In this manuscript, we study the frequency-dependence of the gain-to-noise ratio for simple cascades, and for cascades employing autoregulation and feedback. This allows us to elucidate how autoregulation and feedback can shape the frequency range over which signals can be transduced reliably.
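As a rough illustration of why gain and noise must be studied together, the sketch below works out the frequency-dependent squared gain and intrinsic noise spectrum for the simplest possible motif: a one-step birth-death cascade treated under the linear-noise approximation. The parameter values (`k`, `mu`, `s_mean`) are illustrative choices, not numbers from the paper.

```python
import numpy as np

# One-step cascade: X is produced at rate k*S and degraded at rate mu,
# so dx/dt = k*s - mu*x + noise (linear-noise approximation).
# Illustrative parameters, not taken from the paper:
k, mu, s_mean = 2.0, 1.0, 10.0

omega = np.logspace(-2, 2, 200)          # angular frequencies

# Squared gain: |k / (mu + i*omega)|^2, a Lorentzian low-pass filter.
gain2 = k**2 / (mu**2 + omega**2)

# Intrinsic noise spectrum: production + degradation shot noise,
# filtered by the same Lorentzian (k*s_mean = mu*x_mean at steady state,
# so the two shot-noise contributions sum to 2*k*s_mean).
noise = 2 * k * s_mean / (mu**2 + omega**2)

ratio = gain2 / noise                    # gain-to-noise ratio

# The Lorentzian cancels: the ratio is flat, equal to k / (2 * s_mean).
print(ratio[0], ratio[-1], k / (2 * s_mean))
```

For this minimal linear motif the frequency dependence cancels exactly between gain and noise, so the gain-to-noise ratio is flat even though the output power spectrum is not; the motifs studied in the paper (autoregulation, feedback) reshape precisely this frequency dependence.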

Information theory provides a formalism for quantifying the reliability of information transmission in the presence of noise [8]. A natural measure for the fidelity of signal transmission from an input signal S to an output signal X (the network response) is the mutual information between S and X, which is defined as

I(S, X) = H(S) − H(S|X) = ∑_{S,X} p(S, X) log₂ [ p(S, X) / ( p(S) p(X) ) ].

Here, p(S) and p(X) are the probability distributions of possible input and output signals respectively, and p(S|X) is the conditional probability of S once X is specified. The mutual information quantifies the reduction in entropy of (or uncertainty about) the signal after one obtains knowledge of the network response, averaged over all possible responses. In other words, I(S, X) is how much we learn (on average) about S by measuring X. For a deterministic system, every S leads to a unique X (we assume no degeneracy). Measuring X thus precisely specifies S, such that the uncertainty in S after a measurement of X is H(S|X) = 0 and I(S, X) = H(S). However, in the presence of noise in the network, each input S will lead to a distribution of possible outputs X. As a result, an observed X can correspond to multiple values of S, and I(S, X) ≤ H(S). For completely uncorrelated S and X, I(S, X) = 0. By construction, the mutual information is symmetric, such that I(S, X) = I(X, S).
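The discrete form of the mutual information can be computed directly from a joint distribution; a minimal sketch (the distributions below are illustrative, not from the paper) showing the deterministic, noisy, and symmetric cases described above:

```python
import numpy as np

def mutual_information(p_joint):
    """Mutual information I(S, X) in bits from a joint distribution
    p_joint[s, x]. Generic discrete-case formula, not code from the paper."""
    p_s = p_joint.sum(axis=1, keepdims=True)   # marginal p(S)
    p_x = p_joint.sum(axis=0, keepdims=True)   # marginal p(X)
    mask = p_joint > 0                          # skip zero-probability terms
    return float(np.sum(p_joint[mask] *
                        np.log2(p_joint[mask] / (p_s * p_x)[mask])))

# Deterministic channel: each S maps to a unique X, so I(S, X) = H(S) = 1 bit.
p_det = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(p_det))            # 1.0

# Noisy channel: each S spreads over both outputs, so I(S, X) < H(S).
p_noisy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
print(mutual_information(p_noisy))

# Symmetry: I(S, X) = I(X, S), i.e. transposing the joint changes nothing.
print(mutual_information(p_noisy) == mutual_information(p_noisy.T))
```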

Recently, the mutual information has been used to study the reliability of information transmission in biochemical networks [9][10][11][12]. However, these studies considered only the steady-state response of a network to a distribution of constant input signals, which do not change on the timescale of the network response. Yet, in many biological systems, it cannot be assumed that the input signal is constant on the timescale of the network response.

Indeed, in many systems the message is encoded in the temporal dynamics of the input signal. A well-known example is bacterial chemotaxis, where the concentration of the intracellular messenger protein depends not on the steady-state ligand concentration, but rather on the change of this concentration in the recent past [13] -the response of the network thus depends on the history of the input signal. Moreover, the extracellular signal may be encoded in the temporal dynamics of the intracellular signal transduction pathway. An interesting example is provided by the rat PC-12 system: while stimulation with a neuronal growth factor gives rise to a sustaine

…(Full text truncated)…


Reference

This content is AI-processed based on ArXiv data.
