A Combinatorial-Probabilistic
Diagnostic Entropy and Information
Henryk Borowczyk, Member, IEEE
Air Force Institute of Technology, Warsaw, Poland
borowczyk@post.pl
Abstract
A new combinatorial-probabilistic diagnostic entropy has been introduced. It describes the
pair-wise sum of probabilities of system conditions that have to be distinguished during the
diagnosing process. The proposed measure describes the uncertainty of the system conditions,
and at the same time complexity of the diagnosis problem. Treating the assumed combinatorial-
diagnostic entropy as a primary notion, the information delivered by the symptoms has been
defined. The relationships have been derived to facilitate explicit, quantitative assessment of the
information of a single symptom as well as that of a symptoms set. It has been proved that the
combinatorial-probabilistic information shows the property of additivity. The presented measures
are focused on diagnosis problem, but they can be easily applied to other disciplines such as
decision theory and classification.
Index Terms – entropy, fault diagnosis, information, multi-valued model, uncertainty
I. INTRODUCTION
Constructing an optimal set of diagnostic symptoms/tests and an optimal sequence of gathering/executing them is one of the most important problems in engineering systems diagnosis [1] – [7]. The applicable optimization method depends on the form of the diagnostic model and on the optimization criterion [1],[2],[5],[8] – [12].
The diagnostic model describes the relationship between the system condition (the set of faults and the healthy condition) and the diagnostic symptoms [4],[5],[13] – [16].
Most models use binary (good – no-good) conditions and binary (normal – abnormal) symptoms [3],[8],[17]. Better results can be obtained with qualitative, approximate, and multi-valued models [2],[4],[15],[18],[19]. In [2], it is proved that the length of the diagnosis algorithm in the case of multi-valued system conditions and multi-valued symptoms is not larger than that of the binary algorithm.
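The claim above can be illustrated with a simple counting argument: a test with k possible outcomes splits the candidate condition set into at most k parts, so at least ⌈log_k n⌉ tests are needed to separate n conditions, and this bound can only decrease as k grows. A minimal sketch (the condition counts and symptom arities below are hypothetical, chosen for illustration only):

```python
def min_tests(n_conditions: int, symptom_values: int) -> int:
    """Lower bound on the number of tests needed to distinguish
    n_conditions when each symptom/test has symptom_values outcomes.
    Computed by integer counting to avoid floating-point log pitfalls."""
    tests, reachable = 0, 1
    while reachable < n_conditions:
        reachable *= symptom_values  # each extra test multiplies the leaves
        tests += 1
    return tests

# Distinguishing 9 conditions: binary symptoms need at least 4 tests,
# ternary symptoms only 2 -- the multi-valued algorithm is never longer.
print(min_tests(9, 2))  # 4
print(min_tests(9, 3))  # 2
```

The integer loop is equivalent to ⌈log_k n⌉ but immune to the rounding errors of `math.log(n, k)` at exact powers.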
Qualitative and multi-valued models can be applied to determine a diagnosis algorithm [1],[2],[19] and to approximate inference within expert systems [4],[15],[19].
One of the methods of constructing a diagnosis algorithm consists in applying information-based analysis [2],[3],[8],[16],[18],[20] – [24], that is, describing the uncertainty of the system condition and the amount of information delivered by individual symptoms and by sets thereof.
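A generic sketch of such information-based analysis (not the author's method, which appears later in the paper): rank candidate symptoms by the Shannon entropy of the partition their outcomes induce over the conditions, so that the most discriminating symptom is gathered first. Symptom names and the outcome table are hypothetical.

```python
import math
from collections import Counter

def partition_entropy(outcomes) -> float:
    """Shannon entropy (bits) of the partition a symptom induces over the
    conditions, assuming equiprobable conditions for simplicity."""
    counts = Counter(outcomes)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def rank_symptoms(table):
    """table: {symptom_name: tuple of outcomes, one entry per condition}.
    Returns symptom names sorted by decreasing partition entropy."""
    return sorted(table, key=lambda s: partition_entropy(table[s]), reverse=True)

# Four conditions; hypothetical symptoms: s1 is ternary, s2 binary.
table = {"s1": (0, 1, 2, 2), "s2": (0, 0, 1, 1)}
print(rank_symptoms(table))  # ['s1', 's2'] -- s1 splits the set more finely
```

This simple ranking ignores interactions between symptoms; a full algorithm would re-score the remaining symptoms after each test conditioned on the outcomes already observed.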
This aim can be reached with the quantities introduced by Shannon: the entropy and the amount of information [25]. There are other kinds of entropies [27] that can be considered – Rényi's entropy [26],[28], structural α-entropy [29], and entropy functions of type (z, t) [30]. The characterization of information measures (from the information-theory point of view) has been extensively discussed in [31],[32].
The abovementioned entropies were introduced for solving information-theoretic problems (e.g., coding). For other problems, different forms of entropy may be more suitable [26],[28],[33]. It is therefore justified to look for the form of entropy best tailored to the diagnostic requirements.
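To make the contrast between entropy forms concrete, the following sketch evaluates the Shannon entropy and the Rényi entropy of order α on the same distribution; the two measures generally disagree, which is why the choice of form matters for a given application. The distribution is an arbitrary example.

```python
import math

def shannon_entropy(p) -> float:
    """Shannon entropy in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha: float) -> float:
    """Renyi entropy of order alpha (alpha > 0, alpha != 1) in bits.
    Tends to the Shannon entropy as alpha -> 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))     # 1.5 bits
print(renyi_entropy(p, 2.0))  # collision entropy, smaller than Shannon here
```

For α = 2 (collision entropy) the result on this distribution is log2(1/0.375) ≈ 1.415 bits, below the Shannon value of 1.5 bits, illustrating that the measures order uncertainty differently.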
This article deals with a multi-valued diagnostic model that exploits multi-valued system conditions and multi-valued symptoms, where the set of values taken by conditions and symptoms is finite.
The set of desirable properties of the proposed diagnostic entropy is determined from the diagnostic point of view. The problem is formulated in a system condition-set partition framework [33].
The organization of the article is as follows. In Section II, basic assumptions concerning the multi-valued diagnostic model are stated. Section III describes the information-theoretic, set-partition framework for designing the diagnosis algorithm; it is the starting point for establishing the set of postulated properties of diagnostic entropy described in Section IV.
In Section V, the combinatorial-probabilistic diagnostic entropy is introduced and its postulated properties are proved.
The combinatorial-probabilistic diagnostic information of symptoms and of sets thereof is defined in Section VI.
II. ASSUMPTIONS
Further considerations are based on the following assumptions, referring to a multi-valued model of the system under diagnosis.
- A finite set of the system conditions is determined:

$$E = \{e_i\}, \quad i = 1, \ldots, n \qquad (1)$$
Elements $e_i \in E$ can be treated as random events of the type: "a system under diagnosis is in the i-th condition", whereas the whole set $E$ is treated as a certain event. Conditions might be single-fault or multi-fault ones.
- The system can remain in one, and only one, of the conditions $e_i \in E$ with probability $P(e_i)$, and the following holds:

$$P(e_i) > 0, \quad i = 1, \ldots, n; \qquad P(E) = \sum_{i=1}^{n} P(e_i) = 1$$
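The assumptions above are easy to encode as a validity check on a candidate condition distribution. The second function illustrates one natural reading of the "pair-wise sum of probabilities" mentioned in the abstract (the sum of P(e_i)·P(e_j) over unordered pairs of conditions to be distinguished); the paper's exact definition lies in the truncated part, so this is only an illustrative assumption, not the author's formula.

```python
import itertools

def check_condition_probabilities(p, tol: float = 1e-9) -> bool:
    """Verify the stated assumptions: every P(e_i) > 0 and sum P(e_i) = 1."""
    return all(pi > 0 for pi in p) and abs(sum(p) - 1.0) < tol

def pairwise_probability_sum(p) -> float:
    """Illustrative pairwise measure over unordered condition pairs.
    NOTE: a plausible reading of the abstract only; the paper's actual
    combinatorial-probabilistic entropy is defined in the truncated text."""
    return sum(pi * pj for pi, pj in itertools.combinations(p, 2))

p = [0.5, 0.3, 0.2]  # hypothetical three-condition system
assert check_condition_probabilities(p)
print(round(pairwise_probability_sum(p), 6))  # 0.15 + 0.10 + 0.06 = 0.31
```

The pairwise sum is largest for the uniform distribution and shrinks toward zero as one condition becomes near-certain, matching the intuition that uncertainty (and diagnostic effort) peaks when all conditions are equally likely.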
…(Full text truncated)…