Article

A Dual Measure of Uncertainty: The Deng Extropy

by Francesco Buono 1 and Maria Longobardi 2,*
1 Dipartimento di Matematica e Applicazioni “Renato Caccioppoli”, Università degli Studi di Napoli Federico II, I-80126 Naples, Italy
2 Dipartimento di Biologia, Università degli Studi di Napoli Federico II, I-80126 Naples, Italy
* Author to whom correspondence should be addressed.
Entropy 2020, 22(5), 582; https://doi.org/10.3390/e22050582
Submission received: 6 May 2020 / Revised: 17 May 2020 / Accepted: 19 May 2020 / Published: 21 May 2020
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract: The extropy has recently been introduced as the dual concept of entropy. Moreover, in the context of the Dempster–Shafer theory of evidence, Deng studied a new measure of discrimination, named the Deng entropy. In this paper, we define the Deng extropy and study its relation with the Deng entropy; examples are proposed in order to compare them. The behaviour of the Deng extropy is studied under changes of focal elements. A characterization result is given for the maximum Deng extropy and, finally, a numerical example in pattern recognition is discussed in order to highlight the relevance of the new measure.

1. Introduction

Let $X$ be a discrete random variable with support $\{x_1, \ldots, x_n\}$ and with probability mass function vector $\underline{p} = (p_1, \ldots, p_n)$. The Shannon entropy of $X$ is defined as
$$H(X) = H(\underline{p}) = -\sum_{i=1}^{n} p_i \log p_i,$$
where $\log$ is the natural logarithm; see [1]. It is a measure of information and discrimination about the uncertainty related to the random variable $X$. Lad et al. [2] proposed the extropy as the dual of entropy. This is a measure of the uncertainty related to the outside and is defined as
$$J(X) = J(\underline{p}) = -\sum_{i=1}^{n} (1 - p_i) \log(1 - p_i).$$
Lad et al. proved the following property relating the sum of the entropy and the extropy:
$$H(\underline{p}) + J(\underline{p}) = \sum_{i=1}^{n} H(p_i, 1-p_i) = \sum_{i=1}^{n} J(p_i, 1-p_i),$$
where $H(p_i, 1-p_i) = J(p_i, 1-p_i) = -p_i \log p_i - (1-p_i) \log(1-p_i)$ are the entropy and the extropy of a discrete random variable whose support has cardinality two and whose probability mass function vector is $(p_i, 1-p_i)$.
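As a quick numerical check of this identity, here is a minimal Python sketch (the function names are ours, not from the paper):

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i, with the natural logarithm as in Equation (1)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    # J(p) = -sum_i (1 - p_i) log(1 - p_i), Equation (2)
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

p = [0.5, 0.3, 0.2]
lhs = shannon_entropy(p) + extropy(p)
rhs = sum(shannon_entropy([pi, 1 - pi]) for pi in p)
print(lhs, rhs)  # both print the same value, ≈ 1.8044
```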
Dempster [3] and Shafer [4] introduced a method to study uncertainty. Their theory of evidence is a generalization of classical probability theory. In D-S theory, an uncertain event with a finite number of alternatives is considered, and a mass function is defined over the power set of the alternatives, i.e., a degree of confidence is assigned to each of its subsets. If positive mass is given only to singletons, a probability mass function is obtained. D-S theory thus allows us to describe more general situations, in which the available information is less specific.
Here we describe an example studied in [5] to explain how D-S theory extends classical probability theory. Consider two boxes, A and B, such that A contains only red balls, B contains only green balls, and the number of balls in each box is unknown. A ball is picked randomly from one of these two boxes; box A is selected with probability $p_A = 0.6$ and box B with probability $p_B = 0.4$. Hence, the probability of picking a red ball is $P(R) = 0.6$, whereas the probability of picking a green ball is $P(G) = 0.4$. Suppose now that box B contains both green and red balls in unknown proportions. Box A is still selected with probability $p_A = 0.6$ and box B with probability $p_B = 0.4$. In this case, we cannot obtain the probability of picking a red ball. To analyze this problem, we can use D-S theory to express the uncertainty. In particular, we choose a mass function $m$ such that $m(\{R\}) = 0.6$ and $m(\{R, G\}) = 0.4$.
The Dempster–Shafer theory of evidence has several applications due to its advantages in dealing with uncertainty; for example, it is used in decision making [6,7], in risk evaluation [8,9], in reliability analysis [10,11] and so on. In the following, we recall the basic notions of this theory.
Let $X$ be a frame of discernment, i.e., a set of mutually exclusive and collectively exhaustive events, indicated by $X = \{\theta_1, \theta_2, \ldots, \theta_{|X|}\}$. The power set of $X$ is indicated by $2^X$ and has cardinality $2^{|X|}$. A function $m: 2^X \to [0, 1]$ is called a mass function or a basic probability assignment (BPA) if
$$m(\emptyset) = 0 \quad \text{and} \quad \sum_{A \in 2^X} m(A) = 1.$$
If $m(A) \neq 0$ implies $|A| = 1$, then $m$ is also a probability mass function, i.e., BPAs generalize discrete random variables. Moreover, the elements $A$ such that $m(A) > 0$ are called focal elements. Given a BPA, we can evaluate for each focal element the pignistic probability transformation (PPT). Let us recall that the pignistic probability is the probability that a thinking being would assign to that event. It represents a point estimate of belief and can be determined as [12]
$$PPT(A) = \sum_{B :\, A \subseteq B} \frac{m(B)}{|B|}.$$
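For instance, in the two-box example above, a minimal sketch (with BPAs stored as dictionaries keyed by frozensets, a representation of ours) gives $PPT(\{R\}) = 0.6 + 0.4/2 = 0.8$:

```python
def ppt(m, a):
    # pignistic probability of the singleton {a}: sum of m(B)/|B|
    # over the focal elements B containing a, as in Equation (4)
    return sum(mass / len(B) for B, mass in m.items() if a in B)

m = {frozenset({"R"}): 0.6, frozenset({"R", "G"}): 0.4}
print(ppt(m, "R"))  # 0.8
print(ppt(m, "G"))  # 0.2
```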
If we have a weight or reliability of evidence, represented by a coefficient $\alpha \in [0, 1]$, we can use it to generate another BPA $m_\alpha$ in the following way (see [4]):
$$m_\alpha(A) = \begin{cases} \alpha\, m(A), & \text{if } A \neq X \\ \alpha\, m(X) + (1 - \alpha), & \text{if } A = X. \end{cases}$$
If we have two BPAs $m_1, m_2$ for a frame of discernment $X$, we can introduce another BPA $m$ for $X$ using the Dempster rule of combination; see [3]. We define $m(A)$, $A \subseteq X$, in the following way:
$$m(A) = \begin{cases} 0, & \text{if } A = \emptyset \\ \dfrac{1}{1-K} \displaystyle\sum_{B, C \subseteq X :\, B \cap C = A} m_1(B)\, m_2(C), & \text{if } A \neq \emptyset, \end{cases}$$
where $K = \sum_{B, C \subseteq X :\, B \cap C = \emptyset} m_1(B)\, m_2(C)$. We remark that, if $K = 1$, the Dempster rule of combination cannot be applied.
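The rule is straightforward to implement. The sketch below (our helper names) combines two BPAs stored as dictionaries keyed by frozensets and rejects the fully conflicting case $K = 1$:

```python
from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule of combination, Equation (6): K collects the
    # conflicting mass, which is redistributed via the 1/(1-K) factor
    K = sum(p * q for (B, p), (C, q) in product(m1.items(), m2.items())
            if not (B & C))
    if K == 1:
        raise ValueError("total conflict: the rule cannot be applied")
    m = {}
    for (B, p), (C, q) in product(m1.items(), m2.items()):
        A = B & C
        if A:
            m[A] = m.get(A, 0.0) + p * q / (1 - K)
    return m

m1 = {frozenset("a"): 0.7, frozenset("ab"): 0.3}
m2 = {frozenset("b"): 0.5, frozenset("ab"): 0.5}
print(dempster_combine(m1, m2))
# {frozenset({'a'}): 0.538..., frozenset({'b'}): 0.230..., frozenset({'a','b'}): 0.230...}
```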
Recently, several measures of discrimination and uncertainty have been proposed in the literature (see, for instance, [13,14,15,16,17,18,19]). In particular, in the context of the Dempster–Shafer theory of evidence, there are interesting measures of discrimination, such as the Deng entropy; it is this latter concept that suggested to us the introduction of a dual definition.
The Deng entropy was introduced in [5] for a BPA $m$ as
$$E_d(m) = -\sum_{A \subseteq X :\, m(A) > 0} m(A) \log_2 \frac{m(A)}{2^{|A|} - 1}.$$
This entropy is similar to the Shannon entropy, and they coincide if the BPA is also a probability mass function. The term $2^{|A|} - 1$ represents the potential number of states in $A$. For a fixed value of $m(A)$, as the cardinality of $A$ increases, $2^{|A|} - 1$ increases, and so does the Deng entropy.
In the literature, several properties of Deng entropy have been studied (see for instance [20]) and other measures of uncertainty based on Deng entropy have been introduced (see [21,22]). Other relevant measures of uncertainty and information known in the Dempster–Shafer theory of evidence are, for example, Hohle’s confusion measure [23], Yager’s dissonance measure [24] and Klir and Ramer’s discord measure [25].
The aim of this paper is to dualize the Deng entropy by defining a corresponding extropy. We present some examples comparing the Deng entropy and our new extropy, together with their monotonicity properties. We then investigate the relations between these measures and the behaviour of the Deng extropy under changes of focal elements. Moreover, a characterization result is given for the maximum Deng extropy. Finally, an application to pattern recognition is given, in which the Deng extropy makes the right recognition. In the conclusions, the results contained in the paper are summarized.

2. The Deng Extropy

In order to obtain an analogue of Equation (3), we choose the following definition for the Deng extropy:
$$EX_d(m) = -\sum_{A \subset X :\, m(A) > 0} (1 - m(A)) \log_2 \frac{1 - m(A)}{2^{|A^c|} - 1},$$
where $A^c$ is the complement of $A$ in $X$ and $|A^c| = |X| - |A|$. Our purpose is to apply the Deng extropy to measure the uncertainty related to the outside in the context of the Dempster–Shafer theory of evidence. For this reason, $X$ is not involved in the determination of the Deng extropy, even when $m(X) > 0$. The term $2^{|A^c|} - 1$ represents the potential number of states outside of $A$. For a fixed value of $m(A)$, as the cardinality of $A$ increases, $2^{|A^c|} - 1$ decreases, and so does the Deng extropy.
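Both measures are straightforward to compute. The sketch below (our helper names; BPAs are dictionaries keyed by frozensets) implements the two definitions and reproduces Example 2 below:

```python
import math

def deng_entropy(m):
    # E_d(m): sum over focal elements A of -m(A) log2( m(A) / (2^|A| - 1) )
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in m.items() if v > 0)

def deng_extropy(m, frame):
    # EX_d(m): only focal elements strictly contained in the frame X count;
    # the filter v < 1 encodes the convention 0 log 0 = 0
    n = len(frame)
    return -sum((1 - v) * math.log2((1 - v) / (2 ** (n - len(A)) - 1))
                for A, v in m.items() if 0 < v < 1 and len(A) < n)

# Example 2: X = {a, b, c} with uniform mass on the singletons
X = frozenset("abc")
m = {frozenset("a"): 1/3, frozenset("b"): 1/3, frozenset("c"): 1/3}
print(deng_entropy(m))     # log2(3) ≈ 1.585
print(deng_extropy(m, X))  # -2 log2(2/9) ≈ 4.340
```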
Proposition 1.
Let $m$ be a BPA for a frame of discernment $X$. Then
$$E_d(m) + EX_d(m) = \sum_{A \subset X :\, m(A) > 0} E_d(m_A) - m(X) \log_2 \frac{m(X)}{2^{|X|} - 1}$$
$$= \sum_{A \subset X :\, m(A) > 0} EX_d(m_A) - m(X) \log_2 \frac{m(X)}{2^{|X|} - 1},$$
with the convention $0 \log 0 = 0$, where $m_A$ is a BPA on $X$ defined as
$$m_A(B) = \begin{cases} m(A), & \text{if } B = A \\ 1 - m(A), & \text{if } B = A^c \\ 0, & \text{otherwise.} \end{cases}$$
Proof. 
From the definition of the BPA $m_A$, we have
$$E_d(m_A) = -m(A) \log_2 \frac{m(A)}{2^{|A|} - 1} - (1 - m(A)) \log_2 \frac{1 - m(A)}{2^{|A^c|} - 1},$$
$$EX_d(m_A) = -(1 - m(A)) \log_2 \frac{1 - m(A)}{2^{|A^c|} - 1} - m(A) \log_2 \frac{m(A)}{2^{|A|} - 1},$$
i.e., they are equal. Hence, for every $A \subset X$ such that $m(A) > 0$, $E_d(m_A)$ (or $EX_d(m_A)$) gives the corresponding addend of $E_d(m) + EX_d(m)$. The only exception is given by $X$ itself, which contributes to the left-hand side of Equation (9) if $m(X) > 0$; for this reason we have the extra term on the right-hand side of Equations (9) and (10). □
Next, we give some examples of evaluation of the Deng extropy and entropy in different situations.
Example 1.
Given a frame of discernment X, a X and a BPA m such that m ( { a } ) = m ( a ) = 1 , we have
E X d ( m ) = ( 1 1 ) log 2 1 1 2 | X | 1 1 = 0 , E d ( m ) = log 2 1 = 0 .
So, in this case, Deng entropy coincides with Deng extropy and they are equal to 0.
Example 2.
Given a frame of discernment $X = \{a, b, c\}$ and a BPA $m$ such that $m(a) = m(b) = m(c) = \frac{1}{3}$, we have
$$EX_d(m) = -3 \cdot \frac{2}{3} \log_2 \frac{2/3}{3} = -2 \log_2 \frac{2}{9}, \qquad E_d(m) = -3 \cdot \frac{1}{3} \log_2 \frac{1}{3} = \log_2 3.$$
Example 3.
Given a frame of discernment $X = \{a, b, c\}$ and a BPA $m$ such that $m(a) = m(b) = m(c) = m(\{a, b\}) = \frac{1}{4}$, we have
$$EX_d(m) = -3 \cdot \frac{3}{4} \log_2 \frac{1}{4} - \frac{3}{4} \log_2 \frac{3}{4} = 6 - \frac{3}{4} \log_2 3, \qquad E_d(m) = -3 \cdot \frac{1}{4} \log_2 \frac{1}{4} - \frac{1}{4} \log_2 \frac{1}{12} = 2 + \frac{1}{4} \log_2 3.$$
Example 4.
Given a frame of discernment $X$ with cardinality $n$ and a BPA $m$ such that $m(i) = \frac{1}{n}$, for $i = 1, \ldots, n$, we have
$$EX_d(m) = -n \left(1 - \frac{1}{n}\right) \log_2 \frac{1 - \frac{1}{n}}{2^{n-1} - 1} = (n-1) \left[ \log_2 \frac{n}{n-1} + \log_2 (2^{n-1} - 1) \right], \qquad E_d(m) = \log_2 n,$$
which are increasing in $n \in \mathbb{N}$. We plot the values of the Deng extropy for $n \leq 15$ in Figure 1. The Deng entropy is not plotted because its behaviour is trivial.
Example 5.
Given a frame of discernment $X = \{1, 2, \ldots, 15\}$ and a BPA $m$ such that $m(\{3, 4, 5\}) = 0.05$, $m(\{6\}) = 0.05$, $m(A) = 0.8$, $m(X) = 0.1$, the values of the Deng extropy and entropy obtained as $A$ changes are reported in Table 1.
As pointed out before, the results show that the extropy of $m$ decreases monotonically as the size of the subset $A$ grows, while the entropy increases.
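Reusing the deng_entropy and deng_extropy helpers sketched in the previous section, the first row of Table 1 ($A = \{1\}$) can be reproduced as follows:

```python
X = frozenset(range(1, 16))
m = {frozenset({3, 4, 5}): 0.05, frozenset({6}): 0.05,
     frozenset({1}): 0.8, X: 0.1}    # here A = {1}
print(deng_extropy(m, X))  # ≈ 28.10, first row of Table 1 (X is excluded)
print(deng_entropy(m))     # ≈ 2.6623, first row of Table 1 (X contributes)
```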

3. The Maximum Deng Extropy

Kang and Deng [26] studied the problem of the maximum Deng entropy. They found that the maximum Deng entropy on a frame of discernment $X$ with cardinality $|X|$ is attained if and only if the BPA $m$ is defined as
$$m(A_i) = \frac{2^{|A_i|} - 1}{\sum_{j=1}^{2^{|X|} - 1} (2^{|A_j|} - 1)}, \quad i = 1, \ldots, 2^{|X|} - 1,$$
where $A_i$, $i = 1, \ldots, 2^{|X|} - 1$, are all the non-empty elements of $2^X$. Hence, the value of the maximum Deng entropy is given by
$$E_d = -\sum_{i=1}^{2^{|X|} - 1} \frac{2^{|A_i|} - 1}{\sum_{j=1}^{2^{|X|} - 1} (2^{|A_j|} - 1)} \log_2 \frac{\dfrac{2^{|A_i|} - 1}{\sum_{j=1}^{2^{|X|} - 1} (2^{|A_j|} - 1)}}{2^{|A_i|} - 1} = \log_2 \sum_{i=1}^{2^{|X|} - 1} (2^{|A_i|} - 1).$$
In this section, we provide conditions to attain the maximum Deng extropy for a fixed number of focal elements and a fixed value of $m(X)$.
Theorem 1.
Let $m$ be a BPA for a frame of discernment $X$. The maximum Deng extropy, for fixed values of $m(X)$ and of the number $N$ of focal elements different from $X$, $N = |\mathcal{N}|$ with $\mathcal{N} = \{A \subset X : m(A) > 0\}$, is attained if and only if
$$m(A) = 1 - \frac{N - (1 - m(X))}{\sum_{B \in \mathcal{N}} (2^{|B^c|} - 1)}\, (2^{|A^c|} - 1), \quad A \in \mathcal{N}.$$
In this case, the value of the maximum Deng extropy is
$$EX_d = -\left[ N - (1 - m(X)) \right] \log_2 \frac{N - (1 - m(X))}{\sum_{A \in \mathcal{N}} (2^{|A^c|} - 1)}.$$
Proof. 
Suppose $m(X) = 0$. We will prove that in this case the maximum Deng extropy is
$$EX_d = -(N - 1) \log_2 \frac{N - 1}{\sum_{A \in \mathcal{N}} (2^{|A^c|} - 1)},$$
and that it is attained if and only if the BPA is defined by
$$m(A) = 1 - \frac{N - 1}{\sum_{B \in \mathcal{N}} (2^{|B^c|} - 1)}\, (2^{|A^c|} - 1), \quad A \in \mathcal{N}.$$
We have to maximize
$$EX_d = -\sum_{A \in \mathcal{N}} (1 - m(A)) \log_2 \frac{1 - m(A)}{2^{|A^c|} - 1}$$
subject to the condition
$$\sum_{A \in \mathcal{N}} m(A) = 1.$$
Then, the Lagrange function can be defined as
$$(EX_d)_0 = -\sum_{A \in \mathcal{N}} (1 - m(A)) \log_2 \frac{1 - m(A)}{2^{|A^c|} - 1} + \lambda \left( \sum_{A \in \mathcal{N}} m(A) - 1 \right).$$
Thus the gradient can be computed, and for $A \in \mathcal{N}$ we have
$$\frac{\partial (EX_d)_0}{\partial m(A)} = \log_2 \frac{1 - m(A)}{2^{|A^c|} - 1} + \log_2 e + \lambda = 0,$$
where $\log_2 e + \lambda$ does not depend on $m(A)$. Setting all the partial derivatives to zero, we obtain
$$\frac{1 - m(A)}{2^{|A^c|} - 1} = K, \quad A \in \mathcal{N},$$
where $K$ is a constant. It follows that
$$m(A) = 1 - K (2^{|A^c|} - 1), \quad A \in \mathcal{N}.$$
Summing over $A \in \mathcal{N}$, we get
$$1 = \sum_{A \in \mathcal{N}} \left[ 1 - K (2^{|A^c|} - 1) \right] = N - K \sum_{A \in \mathcal{N}} (2^{|A^c|} - 1),$$
and then
$$K = \frac{N - 1}{\sum_{A \in \mathcal{N}} (2^{|A^c|} - 1)}.$$
Therefore, from Equation (12) we obtain
$$m(A) = 1 - \frac{N - 1}{\sum_{B \in \mathcal{N}} (2^{|B^c|} - 1)}\, (2^{|A^c|} - 1), \quad A \in \mathcal{N},$$
i.e., Equation (11). Finally, for the Deng extropy related to this BPA, we get
$$EX_d(m) = -\sum_{A \in \mathcal{N}} \frac{N - 1}{\sum_{B \in \mathcal{N}} (2^{|B^c|} - 1)}\, (2^{|A^c|} - 1) \log_2 \frac{N - 1}{\sum_{B \in \mathcal{N}} (2^{|B^c|} - 1)} = -(N - 1) \log_2 \frac{N - 1}{\sum_{A \in \mathcal{N}} (2^{|A^c|} - 1)},$$
which is the claimed maximum. For $m(X) > 0$, the same argument applies with the constraint $\sum_{A \in \mathcal{N}} m(A) = 1 - m(X)$, which yields $K = \left[ N - (1 - m(X)) \right] / \sum_{A \in \mathcal{N}} (2^{|A^c|} - 1)$, and the proof is completed. □
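As a numerical sanity check of Theorem 1 (a sketch under our representation; deng_extropy is the helper from Section 2), we can build the extremal BPA for a small frame with $m(X) = 0$ and compare its extropy with the stated maximum:

```python
import math

def deng_extropy(m, frame):  # as in the Section 2 sketch
    n = len(frame)
    return -sum((1 - v) * math.log2((1 - v) / (2 ** (n - len(A)) - 1))
                for A, v in m.items() if 0 < v < 1 and len(A) < n)

# Frame X = {a,b,c,d}; focal elements {a}, {b}, {a,b}; m(X) = 0
X = frozenset("abcd")
focal = [frozenset("a"), frozenset("b"), frozenset("ab")]
n, N = len(X), len(focal)
S = sum(2 ** (n - len(A)) - 1 for A in focal)  # sum of 2^|A^c| - 1, here 17
K = (N - 1) / S
m = {A: 1 - K * (2 ** (n - len(A)) - 1) for A in focal}
print(sum(m.values()))                    # 1.0, so m is a valid BPA
print(deng_extropy(m, X))                 # ≈ 6.1749
print(-(N - 1) * math.log2((N - 1) / S))  # the stated maximum, same value
```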

4. Application to Pattern Recognition

In this section, we investigate an application of the Deng extropy to pattern recognition using the Iris dataset given in [27]. This example was already studied in [28] to analyze the applications of another belief entropy defined in the Dempster–Shafer theory of evidence. We compare a method proposed by Kang [29] with a method based on the Deng extropy. The Iris dataset illustrates how BPAs generated via the Deng extropy can be used to classify the kind of flower. The dataset is composed of 150 samples. For each one, we have the sepal length in cm (SL), the sepal width in cm (SW), the petal length in cm (PL), the petal width in cm (PW) and the class, which is exactly one of Iris Setosa (Se), Iris Versicolour (Ve) and Iris Virginica (Vi). The samples are equally distributed among the classes. We select 40 samples for each kind of Iris and use the max–min values of each attribute to generate a model of interval numbers, as shown in Table 2. Each element of the dataset can be regarded as an unknown test sample. Suppose the selected sample is [6.1, 3.0, 4.9, 1.8, Iris Virginica].
Four BPAs are generated with a method proposed by Kang et al. based on the similarity of interval numbers [29]. Given two intervals $A = [a_1, a_2]$ and $B = [b_1, b_2]$, their similarity $S(A, B)$ is defined as
$$S(A, B) = \frac{1}{1 + \alpha D(A, B)},$$
where $\alpha > 0$ is the coefficient of support (we choose $\alpha = 5$) and $D(A, B)$ is the distance between the intervals $A$ and $B$ defined in [30] as
$$D^2(A, B) = \left( \frac{a_1 + a_2}{2} - \frac{b_1 + b_2}{2} \right)^2 + \frac{1}{3} \left[ \left( \frac{a_2 - a_1}{2} \right)^2 + \left( \frac{b_2 - b_1}{2} \right)^2 \right].$$
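A minimal sketch of the similarity computation (our function names; a sample value $x$ is treated as the degenerate interval $[x, x]$):

```python
import math

def interval_distance(a, b):
    # D(A, B) of Tran and Duckstein [30], for a = (a1, a2), b = (b1, b2)
    (a1, a2), (b1, b2) = a, b
    d2 = ((a1 + a2) / 2 - (b1 + b2) / 2) ** 2 \
         + (((a2 - a1) / 2) ** 2 + ((b2 - b1) / 2) ** 2) / 3
    return math.sqrt(d2)

def similarity(a, b, alpha=5):
    # S(A, B) = 1 / (1 + alpha * D(A, B)); alpha = 5 as in the paper
    return 1 / (1 + alpha * interval_distance(a, b))

# similarity between the Vi interval for SL (Table 2) and the sample SL = 6.1
print(similarity((4.9, 7.9), (6.1, 6.1)))  # ≈ 0.179, before normalization
```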
In order to generate the BPAs, the intervals given in Table 2 are used as interval $A$, while interval $B$ is the degenerate interval given by the corresponding attribute of the selected sample. For each of the four attributes, we get seven values of similarity, and a BPA is obtained by normalizing them (see Table 3). Hence, we evaluate the Deng extropy of these BPAs, as shown in the bottom row of Table 3. We obtain a combined BPA by using the Dempster rule of combination (6). The type of the unknown sample is determined by the combined BPA: from Equation (4), we select the class with the maximum value of the PPT. Hence, Kang's method assigns to the sample the type Iris Versicolour, and it does not make the right decision.
Next, we use the Deng extropies given in Table 3 to generate other BPAs. We refer to these extropies as $EX_d(SL)$, $EX_d(SW)$, $EX_d(PL)$, $EX_d(PW)$, and we use them to measure the significance of each sample attribute and to evaluate its weight. For the attribute sepal length we have
$$\omega(SL) = \frac{e^{-EX_d(SL)}}{e^{-EX_d(SL)} + e^{-EX_d(SW)} + e^{-EX_d(PL)} + e^{-EX_d(PW)}},$$
and similarly for the other attributes, so that a lower extropy yields a larger weight.
We divide each weight by the maximum of the weights and use the resulting values as discounting coefficients to generate new BPAs via Equation (5); see Table 4. Again, a combined BPA is obtained by using the Dempster rule of combination, and the type of the unknown sample is determined by the combined BPA. This time, the method based on the Deng extropy makes the right recognition.
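The step from Table 3 to Table 4 can be sketched as follows (our helper names). Note that normalizing the weights by their sum and then dividing by the maximum is equivalent to rescaling $e^{-EX_d}$ directly, and the resulting coefficients are consistent with the discounting implied by Table 4:

```python
import math

def discount_coefficients(extropies):
    # weights proportional to exp(-EX_d); dividing by the maximum weight
    # makes the sum-normalization cancel out
    w = {k: math.exp(-v) for k, v in extropies.items()}
    mx = max(w.values())
    return {k: v / mx for k, v in w.items()}

def discount(m, alpha, frame):
    # discounting as in Equation (5): the mass removed from each A != X
    # is reassigned to the whole frame X
    out = {A: alpha * v for A, v in m.items() if A != frame}
    out[frame] = alpha * m.get(frame, 0.0) + (1 - alpha)
    return out

# Deng extropies from the bottom row of Table 3
alphas = discount_coefficients({"SL": 5.2548, "SW": 5.2806,
                                "PL": 5.1636, "PW": 4.9477})
print(alphas)  # SL ≈ 0.736, SW ≈ 0.717, PL ≈ 0.806, PW = 1.0
```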
We tested all 150 samples: the global recognition rate of Kang's method is 93.33%, whereas that of the method based on the Deng extropy is 94%. The results are shown in Table 5.

5. Conclusions

In this paper, the Deng extropy has been defined as the dual measure of the Deng entropy. Its relation with the analogous entropy has been analyzed, and the two measures have been compared in order to understand in which cases one is preferable to the other. Moreover, some examples have been proposed. The behaviour of the Deng extropy has been studied under changes of focal elements. We have given a characterization result for the maximum Deng extropy and, finally, a numerical example has been discussed in order to highlight the relevance of this dual measure in pattern recognition.

Author Contributions

These authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

Francesco Buono and Maria Longobardi are partially supported by the GNAMPA research group of INdAM (Istituto Nazionale di Alta Matematica) and MIUR-PRIN 2017, Project “Stochastic Models for Complex Systems” (No. 2017 JFFHSH).

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BPA   Basic probability assignment
PPT   Pignistic probability transformation
SL    Sepal length in cm
SW    Sepal width in cm
PL    Petal length in cm
PW    Petal width in cm
Se    Iris Setosa
Ve    Iris Versicolour
Vi    Iris Virginica

References

1. Shannon, C.E. A mathematical theory of communication. Bell Labs Tech. J. 1948, 27, 379–423.
2. Lad, F.; Sanfilippo, G.; Agrò, G. Extropy: Complementary dual of entropy. Stat. Sci. 2015, 30, 40–58.
3. Dempster, A.P. Upper and lower probabilities induced by a multivalued mapping. Ann. Math. Stat. 1967, 38, 325–339.
4. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976.
5. Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553.
6. Fu, C.; Yang, J.B.; Yang, S.L. A group evidential reasoning approach based on expert reliability. Eur. J. Oper. Res. 2015, 246, 886–893.
7. Yang, J.B.; Xu, D.L. Evidential reasoning rule for evidence combination. Artif. Intell. 2013, 205, 1–29.
8. Kabir, G.; Tesfamariam, S.; Francisque, A.; Sadiq, R. Evaluating risk of water mains failure using a Bayesian belief network model. Eur. J. Oper. Res. 2015, 240, 220–234.
9. Liu, H.C.; You, J.X.; Fan, X.J.; Lin, Q.L. Failure mode and effects analysis using D numbers and grey relational projection method. Expert Syst. Appl. 2014, 41, 4670–4679.
10. Han, Y.; Deng, Y. An enhanced fuzzy evidential DEMATEL method with its application to identify critical success factors. Soft Comput. 2018, 22, 5073–5090.
11. Liu, Z.; Pan, Q.; Dezert, J.; Han, J.W.; He, Y. Classifier fusion with contextual reliability evaluation. IEEE Trans. Cybern. 2018, 48, 1605–1618.
12. Smets, P. Data fusion in the transferable belief model. In Proceedings of the Third International Conference on Information Fusion, Paris, France, 10–13 July 2000; Volume 1, pp. PS21–PS33.
13. Balakrishnan, N.; Buono, F.; Longobardi, M. On weighted extropies. Comm. Stat. Theory Methods (under review).
14. Calì, C.; Longobardi, M.; Ahmadi, J. Some properties of cumulative Tsallis entropy. Physica A 2017, 486, 1012–1021.
15. Calì, C.; Longobardi, M.; Navarro, J. Properties for generalized cumulative past measures of information. Probab. Eng. Inform. Sci. 2020, 34, 92–111.
16. Calì, C.; Longobardi, M.; Psarrakos, G. A family of weighted distributions based on the mean inactivity time and cumulative past entropies. Ricerche Mat. 2019, 1–15.
17. Di Crescenzo, A.; Longobardi, M. On cumulative entropies. J. Stat. Plann. Inference 2009, 139, 4072–4087.
18. Kamari, O.; Buono, F. On extropy of past lifetime distribution. Ricerche Mat. 2020, in press.
19. Longobardi, M. Cumulative measures of information and stochastic orders. Ricerche Mat. 2014, 63, 209–223.
20. Abellan, J. Analyzing properties of Deng entropy in the theory of evidence. Chaos Solitons Fractals 2017, 95, 195–199.
21. Tang, Y.; Fang, X.; Zhou, D.; Lv, X. Weighted Deng entropy and its application in uncertainty measure. In Proceedings of the 20th International Conference on Information Fusion (Fusion), Xi'an, China, 10–13 July 2017; pp. 1–5.
22. Wang, D.; Gao, J.; Wei, D. A new belief entropy based on Deng entropy. Entropy 2019, 21, 987.
23. Hohle, U. Entropy with respect to plausibility measures. In Proceedings of the 12th IEEE International Symposium on Multiple-Valued Logic, Paris, France, 25–27 May 1982; pp. 167–169.
24. Yager, R.R. Entropy and specificity in a mathematical theory of evidence. Int. J. Gen. Syst. 1983, 9, 249–260.
25. Klir, G.J.; Ramer, A. Uncertainty in the Dempster–Shafer theory: A critical re-examination. Int. J. Gen. Syst. 1990, 18, 155–166.
26. Kang, B.; Deng, Y. The maximum Deng entropy. IEEE Access 2019, 7, 120758–120765.
27. Dheeru, D.; Karra Taniskidou, E. UCI Machine Learning Repository. 2017. Available online: http://archive.ics.uci.edu/ml (accessed on 20 May 2020).
28. Cui, H.; Liu, Q.; Zhang, J.; Kang, B. An improved Deng entropy and its application in pattern recognition. IEEE Access 2019, 7, 18284–18292.
29. Kang, B.Y.; Li, Y.; Deng, Y.; Zhang, Y.J.; Deng, X.Y. Determination of basic probability assignment based on interval numbers and its application. Acta Electron. Sin. 2012, 40, 1092–1096.
30. Tran, L.; Duckstein, L. Comparison of fuzzy numbers using a fuzzy distance measure. Fuzzy Sets Syst. 2002, 130, 331–341.
Figure 1. $EX_d(m)$ as a function of $n$ for the basic probability assignment (BPA) defined in Example 4.
Table 1. The values of the Deng extropy and the Deng entropy when A changes.

A            Deng Extropy   Deng Entropy
{1}          28.104         2.6623
{1,2}        27.904         3.9303
{1,2,3}      27.704         4.9082
{1,…,4}      27.504         5.7878
{1,…,5}      27.304         6.6256
{1,…,6}      27.104         7.4441
{1,…,7}      26.903         8.2532
{1,…,8}      26.702         9.0578
{1,…,9}      26.500         9.8600
{1,…,10}     26.295         10.661
{1,…,11}     26.086         11.462
{1,…,12}     25.866         12.262
{1,…,13}     25.621         13.062
{1,…,14}     25.304         13.862
Table 2. The interval numbers of the statistical model.

Item        SL          SW          PL          PW
Se          [4.4,5.8]   [2.3,4.4]   [1.0,1.9]   [0.1,0.6]
Ve          [4.9,7.0]   [2.0,3.4]   [3.0,5.1]   [1.0,1.7]
Vi          [4.9,7.9]   [2.2,3.8]   [4.5,6.9]   [1.4,2.5]
Se,Ve       [4.9,5.8]   [2.3,3.4]
Se,Vi       [4.9,5.8]   [2.3,3.8]
Ve,Vi       [4.9,7.0]   [2.2,3.4]   [4.5,5.1]   [1.4,1.7]
Se,Ve,Vi    [4.9,5.8]   [2.3,3.4]
Table 3. BPAs based on Kang's method, Deng extropy and final fusion result.

Item            SL       SW       PL       PW       Combined BPA
m(Se)           0.1098   0.1018   0.0625   0.1004   0.0059
m(Ve)           0.1703   0.1303   0.1839   0.2399   0.4664
m(Vi)           0.1257   0.1385   0.1819   0.3017   0.4656
m(Se,Ve)        0.1413   0.1663   0.0000   0.0000   0.0000
m(Se,Vi)        0.1413   0.1441   0.0000   0.0000   0.0000
m(Ve,Vi)        0.1703   0.1527   0.5719   0.3580   0.0620
m(Se,Ve,Vi)     0.1413   0.1663   0.0000   0.0000   0.0000
Deng extropy    5.2548   5.2806   5.1636   4.9477
Table 4. The modified BPAs based on the Deng extropy and final fusion result.

Item            SL       SW       PL       PW       Combined BPA
m(Se)           0.0808   0.0730   0.0504   0.1004   0.0224
m(Ve)           0.1252   0.0934   0.1482   0.2399   0.4406
m(Vi)           0.0925   0.0993   0.1465   0.3017   0.4451
m(Se,Ve)        0.1039   0.1192   0.0000   0.0000   0.0000
m(Se,Vi)        0.1039   0.1033   0.0000   0.0000   0.0000
m(Ve,Vi)        0.1252   0.1095   0.4608   0.3580   0.0919
m(Se,Ve,Vi)     0.3684   0.4023   0.1942   0.0000   0.0000
Table 5. The recognition rate.

Item                            Setosa   Versicolor   Virginica   Global
Kang's method                   100%     96%          84%         93.33%
Method based on Deng extropy    100%     96%          86%         94%
