Shannon entropies of the distributions of various electroencephalograms from epileptic humans


📝 Original Info

  • Title: Shannon entropies of the distributions of various electroencephalograms from epileptic humans
  • ArXiv ID: 0911.1423
  • Date: 2009-11-10
  • Author: Çağlar Tuncay (Department of Physics, Middle East Technical University, Ankara, Turkey)

📝 Abstract

1) Harmonic oscillations (HO) in numerous electroencephalograms (EEG) from different humans are introduced. 2) The probability density functions (PDF, p(X)) of the EEG voltages (X) are normal (Gaussian) for OO, whereas the distributions of pure HO are convex. The Gaussians for OO may turn convex as HO become dominant in MO, or vice versa. However, the distributions of most of the data are found to be normal, which means that most of the EEG oscillations consist of OO (or MO). 3) Shannon entropies (information measures) of the distributions of the data from different brain regions in the ictal or inter-ictal intervals are calculated for each individual recording and compared. The averages of the Shannon entropies over the individual recordings during the ictal intervals come out larger than those from the inter-ictal intervals. These averages are also found to be larger for data from epileptogenic brain areas than for data recorded from non-epileptogenic ones in the different intervals.

📄 Full Content

Shannon entropies of the distributions of various electroencephalograms from epileptic humans

Çağlar Tuncay Department of Physics, Middle East Technical University 06531 Ankara, Turkey caglart@metu.edu.tr

Abstract: In this letter, nearly 700 million data points recorded from nearly 20 epileptic humans with different brain origins of epilepsy, ages, or sexes are analyzed, and:

  1. Harmonic oscillations (HO) in numerous electroencephalograms (EEG) from different humans are introduced. Inspection of the data shows that HO may appear besides the ordinary oscillations (OO), for several seconds, hours, or longer, in several simultaneous individual recordings from different brain sites in an inter-ictal or ictal interval. HO are deformed in certain time intervals (epochs) when the cyclic behavior is altered or the wave amplitude is time dependent; the individual oscillations then become mixed (MO). Thus, the EEG oscillations can be categorized mainly into three groups: HO, OO, or MO.
  2. The probability density functions (PDF, p(X)) of the EEG voltages (X) are normal (Gaussian) for OO, whereas the distributions of pure HO are convex. The Gaussians for OO may turn convex as HO become dominant in MO, or vice versa. However, the distributions of most of the data are found to be normal, which means that most of the EEG oscillations consist of OO (or MO).
  3. Shannon entropies (information measures) of the distributions of the data from different brain regions in the ictal or inter-ictal intervals are calculated for each individual recording and compared. The averages of the Shannon entropies over the individual recordings during the ictal intervals come out larger than those from the inter-ictal intervals. These averages are also found to be larger for data from epileptogenic brain areas than for data recorded from non-epileptogenic ones in the different intervals.

Key words: Harmonic oscillation, Distribution, Shannon entropy, Stationarity, Randomness

PACS: 87.19.Nn, 87.15.Aa, 05.90.−y, 05.90.+m, 87.10.+e, 87.59.Bh

Introduction: Patterns of EEG signals are widely studied in various linear or nonlinear approaches [1]. These analyses may be valuable for the detection or prediction of epileptic seizures, as pointed out in [2]. Another suggestion is that EEG voltages with big absolute values (amplitudes) [3] may be precursors of epileptic seizure onsets, where the number of big amplitudes may also be important. Thus, entropies of the EEG distributions can be useful for characterizing the EEG data [4]. With this aim, EEG distributions and their Shannon entropies (S) are considered in this letter [5].

EEG distributions and Shannon entropies: Several statistical properties of EEG data have been investigated in terms of their distributions from the 1950s on [6]. A recent treatment of human EEG distributions may be found in [7]. Entropies of the distributions have also been studied in various contexts (for direct applications, see [8]).

Entropy is known as a thermodynamic quantity describing the amount of disorder in a system. It can also be taken as a measure of uncertainty in the information content. Shannon entropy is the measure used to analyze the human EEG signals in this letter. (For entropies in EEG data from animals see [8] or the references given therein.) If P(X) is a normalized distribution of the brain voltages (X, which are integers in microvolts (μV) here), then S is

S = −kB ∑i Pi ln(Pi)

(1)
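As a minimal sketch (not from the letter), Eq. (1) with kB = 1 can be evaluated directly from a histogram of an integer-valued voltage trace; the trace below is synthetic Gaussian noise standing in for real EEG data:

```python
import numpy as np

def shannon_entropy(samples):
    """Eq. (1) with k_B = 1: S = -sum_i P_i ln(P_i), where P_i is the
    relative frequency of each integer voltage value (in uV)."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()        # normalized distribution P_i
    return -np.sum(p * np.log(p))    # natural logarithm, as in the text

# Hypothetical stand-in for an EEG trace: zero-mean Gaussian noise,
# sigma = 30 uV, rounded to integer microvolts.
rng = np.random.default_rng(0)
x = np.rint(rng.normal(0.0, 30.0, 100_000)).astype(int)
print(shannon_entropy(x))   # close to ln(30) + 1.4189, cf. Eq. (3) below
```

Because the voltages are integers (bin width 1 μV), the discrete sum closely tracks the Gaussian result of Eq. (3).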

where the summation is over the states (i) that are accessible with probability Pi, ln is the natural logarithm, and kB is the Boltzmann constant, which is treated as unity here; the equality sign is accordingly replaced by ∝. S in Eq. (1) can be related to the standard deviation (σ) or the height (pmax) of a normal distribution p(X) about a mean (λ);

p(X) = (2πσ²)^(−½) exp(−(X − λ)²/2σ²) ;

(2)

S ∝ ½(ln(2πσ²) + 1) = ln σ + 1.4189

(3)

or

S ∝ ½ - ln(pmax) ,

(4)

respectively. The summation in Eq. (1) is approximated by an integration for the results given in Eqs. (3) and (4) (and (6), below). Normalized distributions (pHO) of HO about a mean (λ) follow;

pHO(X) = π⁻¹(Q² − (X − λ)²)^(−½)
for −Q < X − λ < Q

(5)

where Q is the wave amplitude, which may be constant or time dependent in different epochs of the recordings of one person or of different persons. Eq. (5) is the arcsine law: a sinusoid X(t) = λ + Q sin(ωt) spends time near a voltage X in proportion to 1/|dX/dt| ∝ (Q² − (X − λ)²)^(−½). HO can be shown to have the following entropies for various Q values:

SHO ∝ ln Q + 0.45158

(6)
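As a quick numerical cross-check (an illustrative sketch, not part of the letter), both entropy results can be reproduced by histogramming synthetic signals: Gaussian noise for OO (Eq. (3)) and sinusoid values sampled at random phases for pure HO (Eq. (6)):

```python
import numpy as np

def differential_entropy(x, bin_width=0.01):
    """Plug-in estimate of -integral p ln p from a histogram; adding
    ln(bin_width) converts the discrete sum into the integral form
    used in Eqs. (3) and (6)."""
    edges = np.arange(x.min(), x.max() + bin_width, bin_width)
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p)) + np.log(bin_width)

rng = np.random.default_rng(1)
sigma, Q = 1.0, 1.0

oo = rng.normal(0.0, sigma, 1_000_000)              # OO: Gaussian, Eq. (2)
ho = Q * np.sin(2 * np.pi * rng.random(1_000_000))  # pure HO at random phases, Eq. (5)

print(differential_entropy(oo), np.log(sigma) + 1.4189)   # cf. Eq. (3)
print(differential_entropy(ho), np.log(Q) + 0.45158)      # cf. Eq. (6)
```

The HO histogram also exhibits the convex, edge-peaked (arcsine) shape that the letter contrasts with the concave Gaussian of OO.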

Note that the PDF in Eqs. (2) and (5) are concave for OO and convex for HO, respectively. If the oscillations are mixed, then the tops of the distribution peaks may come out concave or flat, depending on the relative amount of HO in the data. Secondly, entropies (Eq. (1)) are big for normal distributions (Eq. (2)) with big standard deviations (Eq. (3)) or small heights (Eq. (4)), and similarly for convex distributions

…(Full text truncated)…
