Article

Bayesian Update with Information Quality under the Framework of Evidence Theory

School of Computer and Information Science, Southwest University, Chongqing 400715, China
* Author to whom correspondence should be addressed.
Entropy 2019, 21(1), 5; https://doi.org/10.3390/e21010005
Submission received: 4 November 2018 / Revised: 28 November 2018 / Accepted: 18 December 2018 / Published: 21 December 2018

Abstract

Bayesian update is widely used in data fusion. However, the classical Bayesian update method does not take information quality into consideration. In this paper, a new Bayesian update with information quality under the framework of evidence theory is proposed. First, the discounting coefficient is determined by the information quality. Second, the prior probability distribution is discounted into a basic probability assignment. Third, the basic probability assignments from different sources are combined with Dempster's combination rule to obtain the fusion result. Finally, with the aid of the pignistic probability transformation, the combination result is converted into a posterior probability distribution. A numerical example and a real application in target recognition show the efficiency of the proposed method. The proposed method can be seen as a generalized Bayesian update: if information quality is not considered, it degenerates to the classical Bayesian update.

1. Introduction

Probability is one of the most often used tools for dealing with uncertainty [1]. Within probability theory, the Bayesian method occupies an important position. It has been widely used in various fields, such as artificial intelligence [2,3], pattern recognition [4], spam filtering [5], the construction and estimation of biological models [6], Chinese word segmentation and semantics [7,8], exoplanet exploration [9,10,11], multi-criteria decision making [12,13] and others [14,15,16,17].
Bayesian update is a popular topic [18,19]; for example, Vinogradova proposed using Bayesian methods to recalculate criteria weights [20]. However, there are situations in which the classical Bayesian method cannot fuse information intuitively: if the probability distributions are in high conflict, classical Bayesian methods are no longer applicable. Recently, Yager proposed a new measure of information quality (IQ) [21]. Quality is related to the lack of uncertainty in the fused value and the use of credible sources, so information quality measures the information, or certainty, carried by a probability distribution and provides a way to evaluate it. Inspired by Yager and Vinogradova, we take information quality into consideration to address this issue. The main contribution of this paper is a new Bayesian update method with information quality based on the framework of evidence theory.
Dempster-Shafer evidence theory plays an important role in intelligent systems [22,23]. It has two main advantages. One is its ability to model uncertain information [24,25,26,27,28]. The other is information fusion, which enables us to extract more useful information from a large amount of data [29,30]: Dempster's combination rule can combine information from a variety of sources in a straightforward way [31,32,33]. Some research argues that the Bayesian method is a special case of Dempster-Shafer evidence theory [34,35]: when the basic probability assignment (BPA) is only assigned to singleton subsets, Dempster's combination rule degenerates to Bayes' rule.
The motivation of this paper is to improve the ability of the classical Bayesian rule to deal with highly conflicting information. As illustrated by examples in the following sections, the classical Bayesian method can produce counterintuitive results. To address this issue, a new Bayesian update with information quality is proposed under the framework of Dempster-Shafer evidence theory. The information quality of each prior probability distribution is taken as its weight, and the discounting coefficient is determined by this weight. Then, a basic probability assignment is obtained from each prior probability distribution using the discounting coefficient. Next, the basic probability assignments from the different sensor reports are combined using Dempster's combination rule. Finally, the pignistic probability transformation is used to obtain the posterior probability.
The rest of this paper is organized as follows. Section 2 introduces the preliminary knowledge. Section 3 presents the method to deal with Bayesian update based on information quality. Section 4 illustrates the use of the proposed method in target recognition. Section 5 is a brief summary of this article.

2. Preliminaries

This section will introduce some preliminary knowledge regarding evidence theory [36,37], the pignistic probability transformation [38] and information quality [21].

2.1. Evidence Theory

There are many methods to cope with uncertain information, such as Z-numbers [39,40], fuzzy sets [41,42,43,44,45], grey theory [46,47,48] and D-numbers [49,50,51,52,53]. Dempster-Shafer evidence theory has been widely used, for example in risk assessment [54], environment management [55], fault diagnosis [56,57], and decision making [58,59]. Assume $\Omega$ is the frame of discernment: a finite nonempty set of mutually exclusive and exhaustive elements, $\Omega = \{A_1, A_2, \dots, A_n\}$. The power set of $\Omega$, denoted $2^\Omega$, has $2^n$ elements, each of which is a subset of $\Omega$.
Definition 1.
The basic probability assignment (BPA) is a mapping $m: 2^\Omega \to [0,1]$ that satisfies [36,37]:
$$m(\emptyset) = 0 \quad \text{and} \quad \sum_{A \subseteq \Omega} m(A) = 1,$$
where $A$ is a subset of $\Omega$.
Definition 2.
Given two basic probability assignments $m_1$ and $m_2$, Dempster's combination rule is defined as follows [36,37],
$$m(C) = \begin{cases} 0 & \text{if } C = \emptyset, \\[4pt] \dfrac{\sum_{X \cap Y = C,\ X, Y \subseteq \Omega} m_1(X)\, m_2(Y)}{1 - k} & \text{if } C \neq \emptyset, \end{cases}$$
where $k$ measures the conflict among the collected evidence, defined as follows,
$$k = \sum_{X \cap Y = \emptyset,\ X, Y \subseteq \Omega} m_1(X)\, m_2(Y).$$
However, when the value of $k$ is large, Dempster's combination rule can produce counterintuitive results [60,61]. Yager [62], Dubois [63], and Smets [64], and more recently Murphy [65], Deng et al. [66] and other researchers [67], have proposed alternative combination rules. Recently, with the belief entropy [68,69], some new methods have been presented to address the conflict [70,71,72].
The next example illustrates how the Bayesian update cannot cope with highly conflicting probability distributions.
Example 1.
Using the classical Bayesian method to deal with a highly conflicting example, let the probability distributions be $p_1$: (0.9900, 0.0100, 0) and $p_2$: (0, 0.0100, 0.9900).
$$k = 0.9900 \times 0.0100 + 0.9900 \times 0.9900 + 0.9900 \times 0.0100 = 0.9999$$
$$m(A) = m(C) = \frac{0}{1 - 0.9999} = 0$$
$$m(B) = \frac{0.0001}{1 - 0.9999} = 1$$
So the final combination result is P: (0, 1, 0).
It is obvious that the combination result is counterintuitive. This example shows that the classical Bayesian method cannot update the probability distributions when high conflict exists, whereas the proposed method can; the detailed combination steps are given in Example 6.
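A minimal Python sketch (illustrative, not the paper's code; `bayesian_fuse` is a hypothetical name for the normalized product rule) reproduces this failure:

```python
# Example 1: classical Bayesian (normalized product) fusion of two highly
# conflicting probability distributions over {A, B, C}.

def bayesian_fuse(p1, p2):
    """Element-wise product of two distributions, renormalized (Bayes' rule)."""
    joint = [a * b for a, b in zip(p1, p2)]
    total = sum(joint)  # equals 1 - k, the non-conflicting mass
    return [x / total for x in joint]

p1 = [0.99, 0.01, 0.00]
p2 = [0.00, 0.01, 0.99]

k = 1 - sum(a * b for a, b in zip(p1, p2))
print(f"k = {k:.4f}")           # k = 0.9999
print(bayesian_fuse(p1, p2))    # [0.0, 1.0, 0.0] -- counterintuitive
```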

2.2. Pignistic Probability Transformation

The pignistic probability transformation (PPT) is used to transform a basic probability assignment into a probability distribution; it is defined as follows [38].
Definition 3.
Let $m$ be a basic probability assignment on $\Omega$; its associated pignistic probability transformation $BetP_m: \Omega \to [0,1]$ is defined as [38]
$$BetP_m(\omega) = \sum_{A \subseteq \Omega,\ \omega \in A} \frac{1}{|A|} \cdot \frac{m(A)}{1 - m(\emptyset)}, \qquad m(\emptyset) \neq 1,$$
with $|A|$ being the cardinality of subset $A$.
Example 2.
Let a BPA on the frame $\Omega = \{\omega_1, \omega_2, \omega_3, \omega_4\}$ be
  • $m_1(\{\omega_1, \omega_2\}) = 0.8000$, $m_1(\{\omega_3\}) = 0.1000$, $m_1(\{\omega_4\}) = 0.1000$
  • Let $B = \{\omega_1\}$; then $BetP_1(B) = 0.4000$
  • Let $A = \{\omega_4\}$; then $BetP_1(A) = 0.1000$
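A minimal Python sketch of the PPT (illustrative; it assumes a BPA represented as a dict mapping frozensets to masses, with $m(\emptyset) = 0$ as in Definition 1):

```python
# Pignistic probability transformation: spread each focal element's mass
# evenly over its members (assumes m(emptyset) = 0).

def pignistic(bpa):
    betp = {}
    for focal, mass in bpa.items():
        for omega in focal:
            betp[omega] = betp.get(omega, 0.0) + mass / len(focal)
    return betp

# Example 2:
m1 = {frozenset({"w1", "w2"}): 0.8, frozenset({"w3"}): 0.1, frozenset({"w4"}): 0.1}
betp = pignistic(m1)
print(betp["w1"], betp["w4"])   # 0.4 0.1
```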

2.3. Information Quality

Entropy is a measure of the uncertainty associated with information [73]: more uncertainty means more entropy, and the smaller the entropy, the more information is contained in a probability distribution. There are several methods to calculate entropy [74,75]; some famous entropies are Shannon entropy [76], Gini entropy [77], Deng entropy [78,79] and others [80].
Definition 4.
Gini entropy is defined as follows [77],
$$G(p) = 1 - \sum_{i=1}^{n} p_i^2,$$
where $p = (p_1, \dots, p_n)$ is the probability distribution in vector form.
From the definition of Gini entropy, it is clear that decreasing $G(p)$ is equivalent to increasing $\sum_{i=1}^{n} p_i^2$. For this reason, Yager proposed using $\sum_{i=1}^{n} p_i^2$, named NegEnt, as a measure of information or certainty [21]. Information quality has been applied in decision making [81,82], evaluating information [26], maximal fusion [83,84], modeling [85] and elsewhere [24,86].
The bigger the NegEnt $\sum_{i=1}^{n} p_i^2$, the smaller the entropy and the more certainty provided by the probability distribution; the information increases as NegEnt increases.
Definition 5.
Given a probability distribution $p_i$, the information quality is defined as follows [21],
$$IQ_{p_i} = \|p_i\|^2 = \sum_{j=1}^{n} (p_{ij})^2,$$
where the norm $\|p_i\|$ is defined as follows [21],
$$\|p_i\| = \sqrt{p_i \cdot p_i} = \Big( \sum_{j=1}^{n} (p_{ij})^2 \Big)^{1/2}.$$
Example 3.
Given a probability distribution $p$: (0.3000, 0.6000, 0.1000), the corresponding information quality can be calculated as
$$IQ_{p} = 0.3000 \times 0.3000 + 0.6000 \times 0.6000 + 0.1000 \times 0.1000 = 0.4600.$$
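A one-line Python sketch of Definition 5 (illustrative, not from the paper):

```python
# Information quality: the squared Euclidean norm of a probability distribution.
def information_quality(p):
    return sum(x * x for x in p)

print(information_quality([0.3, 0.6, 0.1]))   # 0.46 (up to floating-point rounding)
```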

3. Proposed Method

This section first describes the method to determine the weight, based on the information quality of the probability distribution. Then a new method is presented to generate a basic probability assignment based on the weight. Next, Dempster's combination rule is used to fuse the basic probability assignments. Finally, the fusion result is transformed back into a probability distribution with the aid of the PPT.

3.1. Determine Weight

Information quality is an important index for measuring the quality of information. The weights of the probability distributions quantitatively express their significance and influence the evaluation result [87], so it is reasonable to take the information quality as the weight of a probability distribution. The information qualities need to be normalized so that the weights sum to one. The weight can then be seen as a discounting coefficient, which we use to generate a basic probability assignment.
Definition 6.
Given a probability distribution $p_i$, the corresponding weight is defined as follows,
$$\omega_i = \frac{IQ_{p_i}}{\sum_{i=1}^{n} IQ_{p_i}},$$
where $IQ_{p_i}$ is the information quality of the probability distribution.
Example 4.
If the information qualities are given as $IQ_{p_1} = 0.5400$, $IQ_{p_2} = 0.8200$ and $IQ_{p_3} = 0.6600$,
then the corresponding weights can be calculated as follows,
$$\omega_{p_1} = \frac{0.5400}{0.5400 + 0.8200 + 0.6600} = 0.2700$$
$$\omega_{p_2} = \frac{0.8200}{0.5400 + 0.8200 + 0.6600} = 0.4000$$
$$\omega_{p_3} = \frac{0.6600}{0.5400 + 0.8200 + 0.6600} = 0.3300$$
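A minimal Python sketch of Definition 6 (illustrative, not from the paper):

```python
# Normalize information qualities into weights that sum to one (Example 4).
def iq_weights(iqs):
    total = sum(iqs)
    return [iq / total for iq in iqs]

print(iq_weights([0.54, 0.82, 0.66]))
# ~[0.267, 0.406, 0.327], rounded in Example 4 to (0.27, 0.40, 0.33)
```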

3.2. Generate Basic Probability Assignment

This section proposes a method to convert a probability distribution into a basic probability assignment based on the weight of the probability distribution. Note that there is a one-to-one correspondence between a probability distribution and the basic probability assignment generated from it. Algorithm 1 illustrates the method.
Algorithm 1: The algorithm to generate a basic probability assignment
// To get all BPAs, execute this algorithm once per probability distribution (n runs in total), since one run converts one probability distribution into one BPA.
Input: a probability distribution $p$ over $\{A, B, \dots, N\}$ and its weight $\omega_1$
$m(A) = \omega_1 \, p(A)$
$m(B) = \omega_1 \, p(B)$
…
$m(N) = \omega_1 \, p(N)$
$m(A \cup B \cup \dots \cup N) = 1 - \sum_{I \in \{A, B, \dots, N\}} \omega_1 \, p(I)$
Output: $m_1 = (\{m(A)\}, \{m(B)\}, \dots, \{m(A \cup B \cup \dots \cup N)\})$
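A minimal Python sketch of Algorithm 1 under the same dict-of-frozensets representation as the earlier sketches (illustrative, not the authors' code):

```python
# Discount a probability distribution into a BPA: each singleton keeps
# weight * p(.) of its mass, and the remainder goes to the whole frame.

def probability_to_bpa(p, weight, labels):
    bpa = {frozenset({lab}): weight * prob for lab, prob in zip(labels, p)}
    bpa[frozenset(labels)] = 1.0 - weight * sum(p)   # = 1 - weight when p sums to 1
    return bpa

m2 = probability_to_bpa([0.7, 0.2, 0.1], 0.40, ["A", "B", "C"])
# masses ~ A: 0.28, B: 0.08, C: 0.04, {A,B,C}: 0.60, matching m2 in Example 5
```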

3.3. Fusion Method

This section shows how to combine basic probability assignments multiple times and how to transform the fusion result into a probability distribution.
Only two basic probability assignments are fused at a time: the first and second BPAs are combined first, then the result is combined with the next BPA, and so on, until all the BPAs have been fused. Algorithm 2 illustrates the fusion process; a minimal Python sketch of the combination step is given after it.
Algorithm 2: The algorithm of fusion process
(In the published version, Algorithm 2 is rendered as an image; it applies Dempster's combination rule pairwise, folding the running result into the next BPA.)
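Since the published algorithm is only available as a figure, the following Python sketch reconstructs the described fusion loop under the same dict representation (an interpretation, not the authors' code):

```python
# Dempster's combination rule for two BPAs, then a left fold over a list of BPAs.
from functools import reduce

def dempster_combine(m1, m2):
    combined, k = {}, 0.0
    for x, mx in m1.items():
        for y, my in m2.items():
            inter = x & y
            if inter:                        # non-empty intersection keeps its mass
                combined[inter] = combined.get(inter, 0.0) + mx * my
            else:
                k += mx * my                 # conflicting mass
    if k >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {focal: mass / (1.0 - k) for focal, mass in combined.items()}

def fuse_all(bpas):
    """Fuse BPAs pairwise, in order, as Algorithm 2 describes."""
    return reduce(dempster_combine, bpas)
```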
Next, a flow chart (Figure 1) illustrates the whole process of the proposed method.
As can be seen in Figure 1, the additions relative to previous works are mainly in two aspects: first, information quality is taken into account in the process of the Bayesian update; second, the Bayesian update proposed in this paper is based on the framework of evidence theory.

4. Application

In this section, a numerical example will first illustrate the use of the proposed method. Example 1 is then revisited with the use of the new approach. Finally, two real applications in target recognition demonstrate how the proposed method can be applied.

4.1. Numerical Example

This numerical example shows the process of the proposed method.
Example 5.
The probability distributions are $p_1$: (1/3, 1/3, 1/3), $p_2$: (0.7000, 0.2000, 0.1000), $p_3$: (0.6000, 0.3000, 0.1000).
The corresponding information qualities are:
$IQ_{p_1} = 0.3300$
$IQ_{p_2} = 0.5400$
$IQ_{p_3} = 0.4600$
The corresponding weightings are:
$$\omega_{p_1} = \frac{0.3300}{0.3300 + 0.5400 + 0.4600} = 0.2500$$
$$\omega_{p_2} = \frac{0.5400}{0.3300 + 0.5400 + 0.4600} = 0.4000$$
$$\omega_{p_3} = \frac{0.4600}{0.3300 + 0.5400 + 0.4600} = 0.3500$$
The generated basic probability assignments are:
$m_1 = (\{1/12\}, \{1/12\}, \{1/12\}, \{3/4\})$
$m_2 = (\{0.2800\}, \{0.0800\}, \{0.0400\}, \{0.6000\})$
$m_3 = (\{0.2000\}, \{0.1100\}, \{0.0400\}, \{0.6500\})$
Next, applying Dempster's combination rule twice gives the following.
  • Fusing $m_1$ and $m_2$ provides $m' = (\{0.3000\}, \{0.1300\}, \{0.0900\}, \{0.4800\})$
  • Fusing $m'$ and $m_3$ provides $m = (\{0.4000\}, \{0.1700\}, \{0.0900\}, \{0.3400\})$
Finally, the PPT provides the fused probability distribution.
$$p(A) = 0.4000 + \tfrac{1}{3} \times 0.3400 = 0.5200$$
$$p(B) = 0.1700 + \tfrac{1}{3} \times 0.3400 = 0.2800$$
$$p(C) = 0.0900 + \tfrac{1}{3} \times 0.3400 = 0.2000$$
The final fusion result is p: (0.5200, 0.2800, 0.2000).
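For reference, the whole pipeline of Example 5 can be run with the sketches defined above; the outputs match the example up to rounding:

```python
# End-to-end: IQ -> weights -> BPAs -> Dempster fusion -> pignistic posterior.
dists = [[1/3, 1/3, 1/3], [0.7, 0.2, 0.1], [0.6, 0.3, 0.1]]
labels = ["A", "B", "C"]

weights = iq_weights([information_quality(p) for p in dists])
bpas = [probability_to_bpa(p, w, labels) for p, w in zip(dists, weights)]
posterior = pignistic(fuse_all(bpas))
print(posterior)   # ~{'A': 0.52, 'B': 0.28, 'C': 0.20}
```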
Example 6.
Using the data given in Example 1, the final result is now obtained with the proposed Bayesian update.
The corresponding information qualities are:
$IQ_{p_1} = 0.9800$
$IQ_{p_2} = 0.9800$
The corresponding weightings are:
$\omega_{p_1} = \omega_{p_2} = 0.5000$
The generated basic probability assignments are:
$m_1 = (\{0.4950\}, \{0.0050\}, \{0\}, \{0.5000\})$
$m_2 = (\{0\}, \{0.0050\}, \{0.4950\}, \{0.5000\})$
Next, applying Dempster's combination rule once to fuse $m_1$ and $m_2$ obtains $m = (\{0.3200\}, \{0.0300\}, \{0.3200\}, \{0.3300\})$.
Finally, the PPT provides the fused probability distribution.
$$p(A) = 0.3200 + \tfrac{1}{3} \times 0.3300 = 0.4300$$
$$p(B) = 0.0300 + \tfrac{1}{3} \times 0.3300 = 0.1400$$
$$p(C) = 0.3200 + \tfrac{1}{3} \times 0.3300 = 0.4300$$
The final combination result is p: (0.4300, 0.1400, 0.4300).
Comparing the two final combination results, (0, 1, 0) and (0.4300, 0.1400, 0.4300), the latter is more reasonable. This highly conflicting example illustrates that, in such a situation, the classical Bayesian method cannot update the sensor reports, whereas the presented Bayesian method, using evidence theory, provides intuitive updates.

4.2. Target Recognition

In this section, an application in target recognition illustrates the efficiency of the proposed method.
Assume three bombs were planted in an area during a military exercise, and three sensors are used to detect them. The data collected from the sensors are as follows: $s_1$: (0.7000, 0.2000, 0.1000), $s_2$: (0.8000, 0.1000, 0.1000), $s_3$: (0.6000, 0.2000, 0.2000).
The corresponding information qualities are:
$IQ_{s_1} = 0.5400$
$IQ_{s_2} = 0.6600$
$IQ_{s_3} = 0.4400$
The corresponding weightings are:
$$\omega_{s_1} = \frac{0.5400}{0.5400 + 0.6600 + 0.4400} = 0.3300$$
$$\omega_{s_2} = \frac{0.6600}{0.5400 + 0.6600 + 0.4400} = 0.4000$$
$$\omega_{s_3} = \frac{0.4400}{0.5400 + 0.6600 + 0.4400} = 0.2700$$
The generated basic probability assignments are:
$m_1 = (\{0.2300\}, \{0.0700\}, \{0.0300\}, \{0.6700\})$
$m_2 = (\{0.3200\}, \{0.0400\}, \{0.0400\}, \{0.6000\})$
$m_3 = (\{0.1600\}, \{0.0500\}, \{0.0500\}, \{0.7300\})$
Next, use of Dempster’s combination rule gives the following.
  • Fusing $m_1$ and $m_2$ gives $m'$,
    $m' = (\{0.4700\}, \{0.0600\}, \{0.0300\}, \{0.4400\})$
  • Fusing $m'$ and $m_3$ gives $m$,
    $m = (\{0.5300\}, \{0.0700\}, \{0.0500\}, \{0.3500\})$
Finally, the posterior probability distribution is obtained by the PPT.
$$p(A) = 0.5300 + \tfrac{1}{3} \times 0.3500 = 0.6500$$
$$p(B) = 0.0700 + \tfrac{1}{3} \times 0.3500 = 0.1800$$
$$p(C) = 0.0500 + \tfrac{1}{3} \times 0.3500 = 0.1700$$
The final fusion result is s: (0.6500, 0.1800, 0.1700).
From the collected sensor reports, it is clear that target A should be identified, and the fusion result indeed identifies A, as can be seen in Figure 2.

4.3. Multi-Sensor Target Recognition

A real application in multi-sensor target recognition illustrates the advantage of the proposed method compared with the simple average. In a multi-sensor-based automatic target recognition system, the detected targets are A, B, C; suppose the real target is A. From five different sensors, the system has collected five bodies of data: $s_1$ (0.5000, 0.2000, 0.3000), $s_2$ (0.7000, 0.1000, 0.2000), $s_3$ (0.5500, 0.1000, 0.3500), $s_4$ (0.5500, 0.1000, 0.3500), $s_5$ (0.6000, 0.1000, 0.3000). The results obtained by the proposed method and the simple average are shown in Table 1.
As can be seen from Table 1, when only two reports are collected, the simple average performs slightly better. However, as the number of collected reports increases, the proposed method achieves better results than the simple average.

5. Conclusions

Bayesian update plays an important role in data fusion. It is reasonable to take information quality into consideration in the Bayesian update process. A new Bayesian update method considering information quality is presented in this paper.
The new method discounts prior probability distributions into basic probability assignments and fuses them with Dempster's combination rule. A numerical example and a real application in target recognition illustrate the use of the proposed method. The proposed Bayesian update can deal with conflicting prior probability distributions, while the classical Bayesian update cannot.
The main contributions of this paper are in three aspects.
First, it creatively combines information quality with the Bayesian update based on the framework of evidence theory.
Second, it proposes a new method to obtain the discounting coefficient.
Third, it has the ability to deal with highly conflicting data.
The advantages of the proposed method are a low computational load, strong robustness, and fault tolerance. The presented Bayesian update is a generalization of the classical Bayesian update, with information quality and conflict taken into account under the framework of evidence theory.
Two open issues and our ongoing work are listed as follows:
First, the input data in this paper are probability distributions. However, in real applications of target recognition, the radar reports may be modeled by basic probability assignments. As a result, one open issue is to define an information quality for basic probability assignments.
Second, the proposed method for dealing with conflict depends on the quality of the sensor data reports. Determining how to construct the evaluation model, using not only the information quality in this paper but also other parameters, needs to be considered in future research.

Author Contributions

Methodology, F.X.; Writing original draft, Y.L.

Funding

The work is partially supported by the National Natural Science Foundation of China (Grant No. 61672435).

Acknowledgments

The authors greatly appreciate the reviewers' suggestions and the editors' encouragement.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Knuth, K.H. Measuring on lattices. AIP Conf. Proc. 2009, 1193, 132–144. [Google Scholar]
  2. Knuth, K.H.; Erner, P.M.; Frasso, S. Designing Intelligent Instruments. AIP Conf. Proc. 2007, 954, 203–211. [Google Scholar] [Green Version]
  3. Malakar, N.K.; Gladkov, D.; Knuth, K.H. Modeling a Sensor to Improve Its Efficacy. J. Sens. 2013, 2013, 481054. [Google Scholar] [CrossRef]
  4. Jain, A.K.; Duin, R.P.; Mao, J. Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 4–37. [Google Scholar] [CrossRef]
  5. Guzella, T.S.; Caminhas, W.M. A review of machine learning approaches to spam filtering. Expert Syst. Appl. 2009, 36, 10206–10222. [Google Scholar] [CrossRef]
  6. Tsilifis, P.; Browning, W.J.; Wood, T.E.; Newton, P.K.; Ghanem, R.G. The Stochastic Quasi-chemical Model for Bacterial Growth: Variational Bayesian Parameter Update. J. Nonlinear Sci. 2018, 28, 371–393. [Google Scholar] [CrossRef]
  7. Rudin, D. Uncertainty and Persistence: A Bayesian Update Semantics for Probabilistic Expressions. J. Philos. Logic 2018, 47, 365–405. [Google Scholar] [CrossRef]
  8. Dionelis, N.; Brookes, M. Modulation-domain speech enhancement using a Kalman filter with a Bayesian update of speech and noise in the log-spectral domain. In Proceedings of the Joint Workshop on Hands-free Speech Communications and Microphone Arrays (HSCMA 2017), San Francisco, CA, USA, 1–3 March 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 111–115. [Google Scholar]
  9. Knuth, K.H.; Placek, B.; Angerhausen, D.; Carter, J.L.; D’Angelo, B.; Gai, A.D.; Carado, B. EXONEST: The Bayesian Exoplanetary Explorer. Entropy 2017, 19, 559. [Google Scholar] [CrossRef]
  10. Placek, B. Bayesian Detection and Characterization of Extra-Solar Planets via Photometric Variations; State University of New York at Albany: Albany, NY, USA, 2014. [Google Scholar]
  11. Placek, B.; Knuth, K.H.; Angerhausen, D. EXONEST: Bayesian model selection applied to the detection and characterization of exoplanets via photometric variations. Astrophys. J. 2014, 795, 112. [Google Scholar] [CrossRef]
  12. Krylovas, A.; Dadelo, S.; Kosareva, N.; Zavadskas, E.K. Entropy–KEMIRA Approach for MCDM Problem Solution in Human Resources Selection Task. Int. J. Inf. Technol. Decis. Mak. 2017, 16, 1183–1209. [Google Scholar] [CrossRef]
  13. Zavadskas, E.K.; Podvezko, V. Integrated determination of objective criteria weights in MCDM. Int. J. Inf. Technol. Decis. Mak. 2016, 15, 267–283. [Google Scholar] [CrossRef]
  14. Knuth, K.H.; Habeck, M.; Malakar, N.K.; Mubeen, M.A.; Placek, B. Bayesian evidence and model selection. Dig. Signal Process. 2015, 47, 50–67. [Google Scholar] [CrossRef] [Green Version]
  15. Rowe, M.C.; Brewer, B.J. AMORPH: A statistical program for characterizing amorphous materials by X-ray diffraction. Comput. Geosci. 2018, 120, 21–31. [Google Scholar] [CrossRef]
  16. Knuth, K.H. Informed Source Separation: A Bayesian Tutorial. In Proceedings of the 2005 13th European Signal Processing Conference, Antalya, Turkey, 4–8 September 2005; IEEE: Piscataway, NJ, USA, 2006. [Google Scholar]
  17. Zan, J.; Hasenbein, J.J.; Morton, D.P.; Mehrotra, V. Staffing call centers under arrival-rate uncertainty with Bayesian updates. Oper. Res. Lett. 2018, 46, 379–384. [Google Scholar] [CrossRef]
  18. Gençaga, D.; Knuth, K.H.; Rossow, W.B. A Recipe for the Estimation of Information Flow in a Dynamical System. Entropy 2015, 17, 438–470. [Google Scholar] [CrossRef] [Green Version]
  19. Dehghannasiri, R.; Esfahani, M.S.; Qian, X.; Dougherty, E.R. Optimal Bayesian Kalman Filtering with Prior Update. IEEE Trans. Signal Process. 2018, 66, 1982–1996. [Google Scholar] [CrossRef]
  20. Vinogradova, I.; Podvezko, V.; Zavadskas, E. The recalculation of the weights of criteria in MCDM methods using the bayes approach. Symmetry 2018, 10, 205. [Google Scholar] [CrossRef]
  21. Yager, R.R.; Petry, F. An intelligent quality-based approach to fusing multi-source probabilistic information. Inf. Fusion 2016, 31, 127–136. [Google Scholar] [CrossRef] [Green Version]
  22. Deng, X.; Jiang, W. Dependence assessment in human reliability analysis using an evidential network approach extended by belief rules and uncertainty measures. Ann. Nucl. Energy 2018, 117, 183–193. [Google Scholar] [CrossRef]
  23. Han, Y.; Deng, Y. A novel matrix game with payoffs of Maxitive Belief Structure. Int. J. Intell. Syst. 2018. [Google Scholar] [CrossRef]
  24. Fei, L.; Deng, Y. A new divergence measure for basic probability assignment and its applications in extremely uncertain environments. Int. J. Intell. Syst. 2018. [Google Scholar] [CrossRef]
  25. Jiang, W. A correlation coefficient for belief functions. Int. J. Approx. Reason. 2018. [Google Scholar] [CrossRef]
  26. Yin, L.; Deng, X.; Deng, Y. The negation of a basic probability assignment. IEEE Trans. Fuzzy Syst. 2018. [Google Scholar] [CrossRef]
  27. Wang, X.; Song, Y. Uncertainty measure in evidence theory with its applications. Appl. Intell. 2017, 48, 1–17. [Google Scholar] [CrossRef]
  28. Sui, L.; Feissel, P.; Denoeux, T. Identification of elastic properties in the belief function framework. Int. J. Approx. Reason. 2018, 101, 69–87. [Google Scholar] [CrossRef]
  29. Muhammad, K.; Sajjad, M.; Lee, M.Y.; Baik, S.W. Efficient visual attention driven framework for key frames extraction from hysteroscopy videos. Biomed. Signal Process. Control 2017, 33, 161–168. [Google Scholar] [CrossRef]
  30. Muhammad, K.; Ahmad, J.; Sajjad, M.; Baik, S.W. Visual saliency models for summarization of diagnostic hysteroscopy videos in healthcare systems. SpringerPlus 2016, 5, 1495. [Google Scholar] [CrossRef]
  31. Jiang, W.; Hu, W. An improved soft likelihood function for Dempster-Shafer belief structures. Int. J. Intell. Syst. 2018, 33, 1264–1282. [Google Scholar] [CrossRef]
  32. Xiao, F.; Qin, B. A Weighted Combination Method for Conflicting Evidence in Multi-Sensor Data Fusion. Sensors 2018, 18, 1487. [Google Scholar] [CrossRef]
  33. Ahmad, J.; Muhammad, K.; Kwon, S.I.; Baik, S.W.; Rho, S. Dempster-Shafer fusion based gender recognition for speech analysis applications. In Proceedings of the 2016 International Conference on Platform Technology and Service (PlatCon), Jeju, Korea, 15–17 February 2016; IEEE: Piscataway, NJ, USA; pp. 1–4. [Google Scholar]
  34. Leonard, T.; Hsu, J.S. Bayesian Methods: An Analysis for Statisticians and Interdisciplinary Researchers; Cambridge University Press: Cambridge, UK, 2001; Volume 5. [Google Scholar]
  35. Yager, R.R. Generalized probabilities of fuzzy events from fuzzy belief structures. Inf. Sci. 1982, 28, 45–62. [Google Scholar] [CrossRef]
  36. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976; Volume 42. [Google Scholar]
  37. Dempster, A.P. Upper and lower probabilities induced by a multivalued mapping. In Classic Works of the Dempster-Shafer Theory of Belief Functions; Springer: Berlin, Germany, 2008; pp. 57–72. [Google Scholar]
  38. Liu, W. Analyzing the degree of conflict among belief functions. Artif. Intell. 2006, 170, 909–924. [Google Scholar] [CrossRef] [Green Version]
  39. Kang, B.; Chhipi-Shrestha, G.; Deng, Y.; Hewage, K.; Sadiq, R. Stable strategies analysis based on the utility of Z-number in the evolutionary games. Appl. Math. Comput. 2018, 324, 202–217. [Google Scholar] [CrossRef]
  40. Kang, B.; Deng, Y.; Hewage, K.; Sadiq, R. A method of measuring uncertainty for Z-number. IEEE Trans. Fuzzy Syst. 2018. [Google Scholar] [CrossRef]
  41. Xiao, F. A Hybrid Fuzzy Soft Sets Decision Making Method in Medical Diagnosis. IEEE Access 2018, 6, 25300–25312. [Google Scholar] [CrossRef]
  42. Mardani, A.; Jusoh, A.; Zavadskas, E.K. Fuzzy multiple criteria decision-making techniques and applications—Two decades review from 1994 to 2014. Expert Syst. Appl. 2015, 42, 4126–4148. [Google Scholar] [CrossRef]
  43. Han, Y.; Deng, Y. An enhanced fuzzy evidential DEMATEL method with its application to identify critical success factors. Soft Comput. 2018, 22, 5073–5090. [Google Scholar] [CrossRef]
  44. Zavadskas, E.K.; Antuchevicience, J.; Hajiagha, S.H.R. The interval-valued intuitionistic fuzzy MULTIMOORA method for group decision making in engineering. Math. Probl. Eng. 2015, 2015, 560690. [Google Scholar] [CrossRef]
  45. Zavadskas, E.K.; Antucheviciene, J.; Turskis, Z.; Adeli, H. Hybrid multiple-criteria decision-making methods: A review of applications in engineering. Scientia Iranica 2016, 23, 1–20. [Google Scholar] [CrossRef]
  46. Tsai, S.B. Using grey models for forecasting China’s growth trends in renewable energy consumption. Clean Technol. Environ. Policy 2016, 18, 563–571. [Google Scholar] [CrossRef]
  47. Tsai, S.B.; Lee, Y.C.; Guo, J.J. Using modified grey forecasting models to forecast the growth trends of green materials. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 2014, 228, 931–940. [Google Scholar] [CrossRef]
  48. Li, Z.; Chen, L. A novel evidential FMEA method by integrating fuzzy belief structure and grey relational projection method. Eng. Appl. Artif. Intell. 2019, 77, 136–147. [Google Scholar] [CrossRef]
  49. Xiao, F. A novel multi-criteria decision making method for assessing health-care waste treatment technologies based on D numbers. Eng. Appl. Artif. Intell. 2018, 71, 216–225. [Google Scholar] [CrossRef]
  50. Mo, H.; Deng, Y. A new MADA methodology based on D numbers. Int. J. Fuzzy Syst. 2018, 20, 2458–2469. [Google Scholar] [CrossRef]
  51. Xiao, F. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment. Math. Probl. Eng. 2016, 2016, 3713518. [Google Scholar] [CrossRef]
  52. Daijun, W. A modified D numbers methodology for environmental impact assessment. Technol. Econ. Dev. Econ. 2018, 24, 653–669. [Google Scholar]
  53. Deng, X.; Deng, Y. D-AHP method with different credibility of information. Soft Comput. 2018. [Google Scholar] [CrossRef]
  54. Chen, L.; Deng, Y. A new failure mode and effects analysis model using Dempster-Shafer evidence theory and grey relational projection method. Eng. Appl. Artif. Intell. 2018, 76, 13–20. [Google Scholar] [CrossRef]
  55. Chen, L.; Deng, X. A Modified Method for Evaluating Sustainable Transport Solutions Based on AHP and Dempster–Shafer Evidence Theory. Appl. Sci. 2018, 8, 563. [Google Scholar] [CrossRef]
  56. Xiao, F. A Novel Evidence Theory and Fuzzy Preference Approach-Based Multi-Sensor Data Fusion Technique for Fault Diagnosis. Sensors 2017, 17, 2504. [Google Scholar] [CrossRef]
  57. Zhang, H.; Deng, Y. Engine fault diagnosis based on sensor data fusion considering information quality and evidence theory. Adv. Mech. Eng. 2018, 10. [Google Scholar] [CrossRef]
  58. Fei, L.; Deng, Y.; Hu, Y. DS-VIKOR: A New Multi-criteria Decision-Making Method for Supplier Selection. Int. J. Fuzzy Syst. 2018. [Google Scholar] [CrossRef]
  59. He, Z.; Jiang, W. An evidential dynamical model to predict the interference effect of categorization on decision making. Knowl.-Based Syst. 2018, 150, 139–149. [Google Scholar] [CrossRef]
  60. Zhang, W.; Deng, Y. Combining conflicting evidence using the DEMATEL method. Soft Comput. 2018. [Google Scholar] [CrossRef]
  61. Wang, Y.; Deng, Y. Base belief function: An efficient method of conflict management. J. Ambient Intell. Hum. Comput. 2018. [Google Scholar] [CrossRef]
  62. Yager, R.R. On the Dempster-Shafer framework and new combination rules. Inf. Sci. 1987, 41, 93–137. [Google Scholar] [CrossRef]
  63. Dubois, D.; Prade, H. Representation and combination of uncertainty with belief functions and possibility measures. Comput. Intell. 1988, 4, 244–264. [Google Scholar] [CrossRef]
  64. Smets, P. The combination of evidence in the transferable belief model. IEEE Trans. Pattern Anal. Mach. Intell. 1990, 12, 447–458. [Google Scholar] [CrossRef] [Green Version]
  65. Murphy, C.K. Combining belief functions when evidence conflicts. Decis. Support Syst. 2000, 29, 1–9. [Google Scholar] [CrossRef]
  66. Deng, Y.; Shi, W.; Liu, Q. Combining belief function based on distance function. Decis. Support Syst. 2004, 38, 489–493. [Google Scholar]
  67. Deng, X.; Jiang, W.; Wang, Z. Zero-sum polymatrix games with link uncertainty: A Dempster-Shafer theory solution. Appl. Math. Comput. 2019, 340, 101–112. [Google Scholar] [CrossRef]
  68. Li, Y.; Deng, Y. Generalized Ordered Propositions Fusion Based on Belief Entropy. Int. J. Comput. Commun. Control 2018, 13, 792–807. [Google Scholar] [CrossRef]
  69. Deng, X. Analyzing the monotonicity of belief interval based uncertainty measures in belief function theory. Int. J. Intell. Syst. 2018, 33, 1869–1879. [Google Scholar] [CrossRef]
  70. Xiao, F. Multi-sensor data fusion based on the belief divergence measure of evidences and the belief entropy. Inf. Fusion 2019, 46, 23–32. [Google Scholar] [CrossRef]
  71. Pan, L.; Deng, Y. A New Belief Entropy to Measure Uncertainty of Basic Probability Assignments Based on Belief Function and Plausibility Function. Entropy 2018, 20, 842. [Google Scholar] [CrossRef]
  72. Zhou, X.; Hu, Y.; Deng, Y.; Chan, F.T.S.; Ishizaka, A. A DEMATEL-Based Completion Method for Incomplete Pairwise Comparison Matrix in AHP. Ann. Oper. Res. 2018, 271, 1045–1066. [Google Scholar] [CrossRef]
  73. Knuth, K.H.; Skilling, J. Foundations of inference. Axioms 2012, 1, 38–73. [Google Scholar] [CrossRef]
  74. Brewer, B.J. Computing Entropies with Nested Sampling. Entropy 2017, 19, 422. [Google Scholar] [CrossRef]
  75. Han, Y.; Deng, Y. A hybrid intelligent model for Assessment of critical success factors in high risk emergency system. J. Ambient Intell. Hum. Comput. 2018, 9, 1933–1953. [Google Scholar] [CrossRef]
  76. Shannon, C.E. A mathematical theory of communication. Bell Syst. Techn. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  77. Gini, C. Variabilità e mutabilità. Reprinted in Memorie di Metodologica Statistica; Pizetti, E., Salvemini, T., Eds.; Libreria Eredi Virgilio Veschi: Rome, Italy, 1912. [Google Scholar]
  78. Xiao, F. An Improved Method for Combining Conflicting Evidences Based on the Similarity Measure and Belief Function Entropy. Int. J. Fuzzy Syst. 2018, 20, 1256–1266. [Google Scholar] [CrossRef]
  79. Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553. [Google Scholar] [CrossRef]
  80. Kang, B.; Deng, Y.; Hewage, K.; Sadiq, R. Generating Z-number based on OWA weights using maximum entropy. Int. J. Intell. Syst. 2018, 33, 1745–1755. [Google Scholar] [CrossRef]
  81. Song, M.; Jiang, W.; Xie, C.; Zhou, D. A new interval numbers power average operator in multiple attribute decision making. Int. J. Intell. Syst. 2017, 32, 631–644. [Google Scholar] [CrossRef]
  82. Yager, R.R. On ordered weighted averaging aggregation operators in multicriteria decisionmaking. IEEE Trans. Syst. Man Cybern. 1988, 18, 183–190. [Google Scholar] [CrossRef]
  83. Nguyen, T.T.; Phan, T.C.; Nguyen, Q.V.H.; Aberer, K.; Stantic, B. Maximal fusion of facts on the web with credibility guarantee. Inf. Fusion 2019, 48, 55–66. [Google Scholar] [CrossRef]
  84. Buck, B.; Macaulay, V.A. Maximum Entropy in Action: A Collection of Expository Essays; Oxford University Press: New York, NY, USA, 1991. [Google Scholar]
  85. Yager, R.R. Modeling multi-criteria objective functions using fuzzy measures. Inf. Fusion 2016, 29, 105–111. [Google Scholar] [CrossRef]
  86. Kullback, S. Information Theory and Statistics; Courier Corp.: North Chelmsford, MA, USA, 1997. [Google Scholar]
  87. Krylovas, A.; Kosareva, N.; Zavadskas, E.K. WEBIRA-comparative analysis of weight balancing method. Int. J. Comput. Commun. Control 2018, 12, 238–253. [Google Scholar] [CrossRef]
Figure 1. Flow chart of the presented method.
Figure 2. The result of fusion.
Table 1. The results of different combination methods used in multi-sensor target recognition.

                      s1, s2    s1, s2, s3    s1, s2, s3, s4    s1, s2, s3, s4, s5
simple average   p(A) 0.6000    0.5800        0.5750            0.5800
                 p(B) 0.1500    0.1400        0.1250            0.1200
                 p(C) 0.2500    0.2800        0.3000            0.3000
proposed method  p(A) 0.5532    0.5924        0.6267            0.6428
                 p(B) 0.1899    0.1490        0.1185            0.1100
                 p(C) 0.2569    0.2586        0.2548            0.2472
