In this article we exploit the Bhattacharyya statistical divergence to determine the similarity of probability distributions of quantum observables. After a brief review of useful properties of the Bhattacharyya divergence, we apply it to determine the similarity of the probability distributions of two non-commuting observables. An explicit expression for the Bhattacharyya statistical divergence is found for the case of two observables which are the x- and z-components of the angular momentum of a spin-1/2 system. Finally, we note an application of the considered statistical divergence to a specific physical measurement.
One of the important problems in probability theory is to find an appropriate measure of the difference, or statistical divergence, of two probability distributions P and P′. This measure quantifies the degree of similarity between them. In mathematical statistics, the divergence of two probability distributions is introduced as follows: if [X, P] and [X, P′] are two probability spaces, then the so-called (Csiszár's) f-divergence of the probability distributions P and P′ is given as

$$D_f(P; P') = \sum_{x \in X} P'(x)\, f\!\left(\frac{P(x)}{P'(x)}\right),$$
where f(u) is a convex function on the interval (0, ∞) that is strictly convex at u = 1 [1]. Among the existing divergence measures of two discrete probability distributions, P ≡ [p_1, p_2, . . . , p_n] and Q ≡ [q_1, q_2, . . . , q_n], the Kullback-Leibler statistical divergence [3]

$$D_K(P : Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}$$

is perhaps the best known and most widely used. This is because it has several desirable properties, such as nonnegativity and additivity, which are crucial in its applications. D_K(P; Q) is not symmetric with respect to the exchange of P and Q. For D_K(P; Q), the inequality D_K(P; Q) ≥ 0 holds.
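The definition above can be illustrated with a minimal Python sketch (the function name `kl_divergence` is ours, not from the article); it also exhibits the asymmetry in P and Q noted above:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_K(P : Q) = sum_i p_i * log(p_i / q_i).

    Requires q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute 0
    (by the convention 0 * log 0 = 0).
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            total += pi * math.log(pi / qi)
    return total

# The divergence vanishes iff the distributions coincide, and it is
# asymmetric: D_K(P : Q) generally differs from D_K(Q : P).
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0
print(kl_divergence(p, q))  # differs from kl_divergence(q, p)
```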
The minimum of D_K(P; Q) is obtained iff p_i = q_i (see, e.g., [14]). Apart from the Kullback-Leibler statistical divergence, a number of other divergence measures, depending on certain parameters, have been proposed and intensively studied by Rényi [5], Kapur [2], Kullback and Leibler [4], Havrda and Charvát [6], and Tsallis [7], [8]. Some of them satisfy the convexity condition only for restricted values of the corresponding parameters.
However, the Kullback-Leibler, Rényi, Havrda-Charvát, Tsallis and trigonometric [9] statistical divergences generally require that p_i = 0 whenever q_i = 0. From the point of view of their application, this is not a desirable property, because we often encounter just such situations in theoretical physics, especially in statistical and quantum physics.
In the next sections, we exploit one of the first statistical divergence measures proposed in the literature, the Bhattacharyya divergence of P and Q, which is symmetric with respect to the exchange of P and Q and does not suffer from the above-mentioned shortcoming. We attempt to apply the Bhattacharyya statistical divergence to quantify the degree of similarity of two quantum observables.
The Bhattacharyya statistical divergence of the discrete probability distributions P = p_1, p_2, . . . , p_n and Q = q_1, q_2, . . . , q_n is defined as [10]

$$S(P, Q) = \sum_{i=1}^{n} \sqrt{p_i q_i}. \qquad (1)$$
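As a minimal Python sketch of this definition (the function name is ours), note in particular that zero components pose no difficulty, unlike for the Kullback-Leibler divergence:

```python
import math

def bhattacharyya(p, q):
    """Bhattacharyya similarity S(P, Q) = sum_i sqrt(p_i * q_i)."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

# Identical distributions give the maximal value 1 ...
print(bhattacharyya([0.25, 0.75], [0.25, 0.75]))  # 1.0
# ... non-overlapping supports give the minimal value 0 ...
print(bhattacharyya([1.0, 0.0], [0.0, 1.0]))      # 0.0
# ... and zero components in only one distribution are harmless:
print(bhattacharyya([0.5, 0.5, 0.0], [0.5, 0.0, 0.5]))  # 0.5
```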
This divergence has the following properties: (i) It attains its maximal value, equal to 1, when the probability distributions P and Q are identical.
(ii) Its minimal value is zero when the components of P and Q do not overlap.
(iii) Its value lies in the interval [0, 1] and expresses the degree to which the probability distributions P and Q are similar.
(iv) It is symmetrical regarding the exchange of P and Q.
(v) S(P, Q) satisfies the properties of nonnegativity, finiteness and boundedness.
(vi) It can be straightforwardly extended to more than two probability distributions [15].

The Bhattacharyya statistical divergence of P and Q has a simple geometrical interpretation. Consider the vectors

$$\mathbf{P} = (\sqrt{p_1}, \sqrt{p_2}, \ldots, \sqrt{p_n}), \qquad \mathbf{Q} = (\sqrt{q_1}, \sqrt{q_2}, \ldots, \sqrt{q_n}).$$

According to Eq. (1), the similarity measure of P and Q is simply the scalar product of these vectors in R^(n)_+. Since they are unit vectors in R^(n)_+, their scalar product is equal to the cosine of the angle between them, which, of course, has the properties (i)-(iv).
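A short numerical check of this geometrical interpretation (variable names are ours): mapping each distribution to the vector of square roots of its components yields a unit vector, and the scalar product of two such vectors reproduces S(P, Q):

```python
import math

p, q = [0.2, 0.3, 0.5], [0.1, 0.6, 0.3]

# Map each distribution to the vector of square roots of its components:
u = [math.sqrt(pi) for pi in p]
v = [math.sqrt(qi) for qi in q]

# The image of a distribution is a unit vector in R^n_+ (up to rounding):
norm_u = math.sqrt(sum(ui * ui for ui in u))
print(abs(norm_u - 1.0) < 1e-12)  # True

# The scalar product equals S(P, Q) of Eq. (1), i.e. the cosine of the angle:
cos_angle = sum(ui * vi for ui, vi in zip(u, v))
S = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
print(abs(cos_angle - S) < 1e-12)  # True
```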
Consider two observables A and B with Hermitian operators Â, B̂ in an N-dimensional Hilbert space, whose corresponding complete orthonormal sets of eigenvectors {|x_i⟩} and {|y_i⟩} (i = 1, 2, . . . , N) have nondegenerate spectra, and let |φ⟩ be a normalized state vector of this Hilbert space.
Accordingly, the components of the probability distributions P(A) and P(B) associated with the observables A and B are

$$p_i(A) = |\langle x_i|\varphi\rangle|^2, \qquad (2a)$$
$$p_i(B) = |\langle y_i|\varphi\rangle|^2. \qquad (2b)$$

Inserting Eqs. (2a) and (2b) into Eq. (1) we get

$$S(P(A), P(B)) = \sum_{i=1}^{N} |\langle x_i|\varphi\rangle|\,|\langle y_i|\varphi\rangle|.$$
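This construction can be sketched numerically as follows (the function names and NumPy usage are ours; eigenvectors are stored as matrix columns):

```python
import numpy as np

def born_distribution(eigvecs, phi):
    """Born-rule probabilities p_i = |<x_i|phi>|^2 for the eigenbasis
    given by the columns of eigvecs."""
    return np.abs(eigvecs.conj().T @ phi) ** 2

def quantum_divergence(basis_a, basis_b, phi):
    """S(P(A), P(B)) per Eq. (1) with Born-rule probabilities."""
    pa = born_distribution(basis_a, phi)
    pb = born_distribution(basis_b, phi)
    return float(np.sum(np.sqrt(pa * pb)))

# For identical eigenbases (A equivalent to B) the divergence is 1
# for any normalized state:
basis = np.eye(2, dtype=complex)            # e.g. the sigma_z eigenbasis
phi = np.array([0.6, 0.8], dtype=complex)   # a normalized state vector
print(quantum_divergence(basis, basis, phi))  # ≈ 1.0
```

Note that the sum pairs outcome i of A with outcome i of B, so the numerical value depends on the chosen ordering of the eigenvectors in each basis.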
Hence, for A ≡ B it follows that S(P(A), P(B)) = 1. Given the state vector |φ⟩ and the operators Â, B̂, the considered statistical divergence of their probability distributions can be determined in general. To each operator, a ray in the Hilbert space can be assigned. The quantity S(P(A), P(B)) gives the closeness of different rays in Hilbert space. If these rays are identical, then their Bhattacharyya divergence S(P(A), P(B)) is equal to 1. If they are perpendicular to each other, then S(P(A), P(B)) becomes zero. For A ≡ B the corresponding rays of these operators in Hilbert space are identical, i.e., the cosine of the angle between them is equal to 1; therefore, S(P(A), P(B)) = 1.
Next, we consider the case of two non-commuting observables. Consider two observables A and B with non-commuting Hermitian operators Â and B̂ in an N-dimensional Hilbert space, whose corresponding complete orthonormal sets of eigenvectors {|x_i⟩}, {|y_i⟩} (i = 1, 2, . . . , N) are distinct and have nondegenerate spectra. Let |φ⟩ be a normalized state vector of the N-dimensional Hilbert space; then it holds that

$$|\varphi\rangle = \sum_{i=1}^{N} \langle x_i|\varphi\rangle\,|x_i\rangle = \sum_{i=1}^{N} \langle y_i|\varphi\rangle\,|y_i\rangle.$$
According to quantum transformation theory, we can express each eigenvector |y_i⟩ as a superposition of the eigenvectors {|x_j⟩}.
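As a numerical illustration of the spin-1/2 case announced in the abstract (the x- and z-components of the angular momentum), the following sketch, with our own naming, evaluates S for the eigenbases of σ_x and σ_z; it is an illustration under these conventions, not the article's closed-form expression:

```python
import numpy as np

# Eigenbases (stored as columns) of the spin-1/2 operators sigma_z and sigma_x:
basis_z = np.eye(2, dtype=complex)
basis_x = np.array([[1.0, 1.0], [1.0, -1.0]], dtype=complex) / np.sqrt(2.0)

def divergence(basis_a, basis_b, phi):
    """S(P(A), P(B)) with Born-rule probabilities p_i = |<x_i|phi>|^2."""
    pa = np.abs(basis_a.conj().T @ phi) ** 2
    pb = np.abs(basis_b.conj().T @ phi) ** 2
    return float(np.sum(np.sqrt(pa * pb)))

# For the sigma_z eigenstate |0>, measuring sigma_x yields the uniform
# distribution (1/2, 1/2), so S = sqrt(1/2):
phi = np.array([1.0, 0.0], dtype=complex)
print(divergence(basis_x, basis_z, phi))  # ≈ 0.7071
```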