Determination of signal-to-noise ratio on the base of information-entropic analysis


📝 Original Info

  • Title: Determination of signal-to-noise ratio on the base of information-entropic analysis
  • ArXiv ID: 1609.09212
  • Date: 2016-09-30
  • Authors: Z. Zh. Zhanabaev, S. N. Akhtanov, E. T. Kozhagulov, B. A. Karibayev

📝 Abstract

In this paper we suggest a new algorithm for determination of the signal-to-noise ratio (SNR). SNR is a quantitative measure widely used in science and engineering. Generally, methods for determining SNR are based on an experimentally defined noise power level, or on some conditional noise criterion that can be specified for signal processing. In the present work we describe a method for determining the SNR of chaotic and stochastic signals at unknown power levels of signal and noise. For this aim we use information as the difference between unconditional and conditional entropy. Our theoretical results are confirmed by the analysis of signals which can be described by nonlinear maps and presented as superpositions of harmonic and stochastic signals.

💡 Deep Analysis

Figure 1 (see the original paper; image not reproduced here).

📄 Full Content

The main characteristics of communication electronic systems are the SNR and the bit error rate (BER), which is defined via the SNR [1].

As usual, SNR is defined as the ratio of signal power to noise power [2][3][4][5][6]. This approach has been developed in many works. For example, [7] describes the use of multiple linear regression, with coefficients chosen for different types of noise, for defining SNR.
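As a concrete baseline, the conventional power-ratio definition can be sketched in a few lines of Python with NumPy. The function name and the example signal below are illustrative, not from the paper; note that this baseline assumes the clean signal and the noise are available separately, which is exactly the limitation the paper sets out to remove:

```python
import numpy as np

def snr_db(signal, noise):
    """Standard SNR: ratio of mean signal power to mean noise power, in dB."""
    p_signal = np.mean(np.asarray(signal, dtype=float) ** 2)
    p_noise = np.mean(np.asarray(noise, dtype=float) ** 2)
    return 10.0 * np.log10(p_signal / p_noise)

# Example: a unit-amplitude sine (power 0.5) plus noise of variance 0.05
t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
s = np.sin(2 * np.pi * 50 * t)
n = np.random.default_rng(0).normal(scale=np.sqrt(0.05), size=t.size)
snr = snr_db(s, n)  # close to 10 dB, since 10*log10(0.5/0.05) = 10
```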

SNR can also be estimated via the relation between the autocorrelation functions of a signal and of noise shifted in time [8], but the noise and signal levels take arbitrarily chosen values. The authors of [9] use wavelet filtering with certain chosen coefficients and define SNR as the ratio of the signal variance to the noise variance. It was noted that in this case the SNR calculation is relatively fast; however, due to the limited bandwidth of the wavelet function spectrum, this method is not applicable to very noisy signals. An effective SNR estimate, defined as the difference between the standard SNR and a signal quality indicator which also must be given in advance, is described in [10]. The single distinction between the modified segmental SNR and the standard SNR is that the modified SNR is obtained by summing the SNR over separate time intervals [11].

SNR is used in many areas of science and engineering, such as wireless telecommunication systems [12][13][14][15], medicine [16][17], nuclear physics [18], neuroscience [19], audio engineering [20][21][22], optoelectronics [23][24], nanotechnology [25][26], astrophysics [27][28], etc.

However, the generally accepted methods and original algorithms for calculating SNR used in the papers mentioned above have the following limitations:

- the necessity to set the noise level according to experimental details or conditional criteria;
- the absence of standard algorithms for defining SNR;
- the absence of a universal theoretical approach for defining the SNR of signals with unknown noise level.

So, the following problem arises from the above: is it possible to define SNR as a ratio between information and entropy? We formulate the problem this way because information is a universal measure of the determinacy of a signal, and entropy is a measure of its uncertainty (noise). To solve this problem we can accept that information is not a local characteristic but an averaged value, defined via the difference between unconditional and conditional entropy [29]. Information entropy is widely used in research. For example, a comparison of composite, refined, and multi-scale cross-sample entropy of complex signals is described in [30]. Entropic analysis can be applied to the description of such complex signals as multi-fractal signals, financial time series, etc. Informational entropy can be used for signal filtering [31]. Entropy can also be applied to the classification of infrasound signals [32]. Determination of the cognitive state of human subjects on the basis of entropic analysis of physiological signals is described in [33].

A relationship between SNR and entropy is specified in [34]; in that work the maximum entropy of the probability density function of convolutional noise is used for the description of modulation and SNR. Image filtering based on the calculation of entropy is suggested in [35]. Seismic signal filtering based on the wavelet transform and the application of Shannon and Tsallis entropy for determination of SNR is described in [36]. Entropy analysis can also be useful for the description of dynamical systems with chaotic behavior [37][38]. Unfortunately, although the relation between entropy and SNR has been described in these works, a ratio between information and entropy has not been used, and the limitations described above for the definition of SNR remain valid.

The aim of this work is to define the value of SNR as the information to entropy ratio (IER) for various signals (mixtures of a harmonic signal and noise, chaotic signals from dynamical systems [39][40][41]). Information can be defined as the difference between unconditional and conditional entropy:

I(x, y) = S(x) - S(x|y),    (1)

where y(t) is the characteristic of a receiver. The unconditional Shannon entropy can be defined as

S(x) = -Σ_i p_i ln(p_i),

where p_i is the probability of detecting the variable x in the i-th cell of size δ, and S(x|y) is the conditional entropy, given as

S(x|y) = -Σ_{i,j} P(x_i, y_j) ln P(x_i|y_j).

Here P(x_i|y_j) is the conditional probability. Information can be determined according to Eq. (1) if we have an empirical set of probabilities for the time series x(t) and y(t). For the description of dynamical systems we can accept y(t) = ẋ(t); that is, we consider the derivative of x(t) as the second variable. Instead of the one-dimensional Shannon entropy S(x) we use the two-dimensional full entropy S(x, y). So, we can rewrite Eq. (1) as

I(x, y) = S(x, y) - S(x|y),    S(x, y) = -Σ_{i,j} P_ij ln(P_ij),

where P_ij is the probability of detecting a point in a cell of the phase space (x, y).
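As an illustration of this construction, here is a minimal sketch in Python with NumPy of estimating the two-dimensional full entropy from a histogram of the (x, ẋ) phase plane. The grid size and the test signals are illustrative assumptions, not values from the paper:

```python
import numpy as np

def full_entropy_2d(x, n_cells=32):
    """Estimate the full entropy S(x, y) of the phase portrait (x, dx/dt).

    The (x, y) plane is partitioned into n_cells x n_cells equal cells,
    P_ij is the empirical probability of a point falling into cell (i, j),
    and S(x, y) = -sum_ij P_ij ln P_ij (empty cells contribute zero).
    """
    x = np.asarray(x, dtype=float)
    y = np.gradient(x)                       # y(t) = x'(t) via finite differences
    hist, _, _ = np.histogram2d(x, y, bins=n_cells)
    p = hist.ravel() / hist.sum()            # empirical cell probabilities P_ij
    p = p[p > 0]                             # treat 0 * ln 0 as 0
    return -np.sum(p * np.log(p))

# A periodic signal occupies far fewer phase-space cells than noise,
# so its full entropy comes out lower:
t = np.linspace(0.0, 1.0, 5000, endpoint=False)
s_sine = full_entropy_2d(np.sin(2 * np.pi * 5 * t))
s_noise = full_entropy_2d(np.random.default_rng(1).normal(size=5000))
```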

The values of the two-dimensional information and the conditional entropy can be normalized to the full entropy according to the following relations:

I'(x, y) = I(x, y) / S(x, y),    S'(x|y) = S(x|y) / S(x, y).

We determine the value of IER as the ratio of the normalized information to the normalized conditional entropy.
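Putting the pieces together, the scheme above can be sketched end to end in Python with NumPy: the full entropy S(x, y) and conditional entropy S(x|y) are estimated from a histogram of the (x, ẋ) phase plane, information is I = S(x, y) - S(x|y), both are normalized to S(x, y), and IER = I'/S'. The grid size, test signals, and noise level are illustrative choices, not values from the paper:

```python
import numpy as np

def ier(x, n_cells=32):
    """Information-to-entropy ratio of a scalar time series (sketch)."""
    x = np.asarray(x, dtype=float)
    y = np.gradient(x)                              # y(t) = x'(t)
    hist, _, _ = np.histogram2d(x, y, bins=n_cells)
    p_xy = hist / hist.sum()                        # joint probabilities P_ij
    p_y = p_xy.sum(axis=0)                          # marginal over x: P_j
    nz = p_xy > 0                                   # empty cells contribute 0
    # Full entropy S(x, y) = -sum_ij P_ij ln P_ij
    s_full = -np.sum(p_xy[nz] * np.log(p_xy[nz]))
    # Conditional entropy S(x|y) = -sum_ij P_ij ln P(x_i|y_j), P(x_i|y_j) = P_ij / P_j
    cond = p_xy / np.where(p_y > 0, p_y, 1.0)
    s_cond = -np.sum(p_xy[nz] * np.log(cond[nz]))
    info = s_full - s_cond                          # I = S(x,y) - S(x|y)
    return (info / s_full) / (s_cond / s_full)      # IER = I'/S'

# The cleaner the signal, the larger the information share, so the
# IER drops when heavy noise is added:
t = np.linspace(0.0, 1.0, 2000, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + np.random.default_rng(0).normal(scale=2.0, size=t.size)
```

Note that the normalization by S(x, y) cancels in the ratio, so IER reduces to I / S(x|y); the normalized quantities are kept explicit here to mirror the relations above.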

Reference

This content is AI-processed based on open access ArXiv data.
