Article

The Consistency between Cross-Entropy and Distance Measures in Fuzzy Sets

Yameng Wang, Han Yang and Keyun Qin
College of Mathematics, Southwest Jiaotong University, Chengdu 610031, China
* Author to whom correspondence should be addressed.
Symmetry 2019, 11(3), 386; https://doi.org/10.3390/sym11030386
Submission received: 14 February 2019 / Revised: 10 March 2019 / Accepted: 12 March 2019 / Published: 16 March 2019

Abstract

The processing of uncertain information is an increasingly hot topic in the artificial intelligence field, and the information measures used in processing uncertain information are growing in importance. In the process of decision-making, decision-makers mostly rely on information measures such as similarity, distance, entropy, and cross-entropy to choose the best alternative. We found that many researchers apply cross-entropy to multi-attribute decision-making according to the minimum principle, which accords with the principle of distance measures: among all the choices, the one with the smallest cross-entropy (distance) from the ideal one is finally chosen. However, the relation between cross-entropy and distance measures in fuzzy sets or neutrosophic sets has not yet been verified. In this paper, we mainly consider the relation between the discrimination measure of fuzzy sets and distance measures. We found that the fuzzy discrimination satisfies all the conditions of a distance measure; that is to say, the fuzzy discrimination is consistent with distance measures. We also found that the cross-entropy, improved on the basis of the fuzzy discrimination, satisfies all the conditions of a distance measure, and we finally prove that cross-entropy, including fuzzy cross-entropy and neutrosophic cross-entropy, is also a distance measure.

1. Introduction

In the real world, there exists much uncertain, imprecise, and incomplete information, and many tools have been developed to handle it. Zadeh [1] first proposed the concept of a fuzzy set, which is defined by a membership function depicting the membership value of an object to a set. Atanassov [2] proposed the intuitionistic fuzzy set (IFS), described by two functions: a membership function depicting the membership value, and a non-membership function depicting the non-membership value of an object to the intuitionistic fuzzy set. The intuitionistic fuzzy set extends the fuzzy set by adding a non-membership function, and it provides a more flexible mathematical framework for processing uncertain, imprecise, and incomplete information. Smarandache [3] first proposed the notions of neutrosophy and the neutrosophic set in 1998. The neutrosophic set is defined by a truth-membership function, an indeterminacy-membership function, and a falsity-membership function, and is thus comprised of a membership value, a non-membership value, and an indeterminacy-membership value. The neutrosophic set theory has been successfully applied in the image-processing field by Vlachos and Sergiadis [4]. Wang et al. [5] then put forward the definition of the single-value neutrosophic set (SVNS) and some operations on it for better application in real scientific and engineering fields. The single-value neutrosophic set is an extension of the fuzzy set, and SVNS theory provides a more convenient tool for uncertain information processing. Recently, some researchers have devoted themselves to the study of single-value neutrosophic set theory and its applications, and have achieved successful results in several fields. Zhang et al. [6,7,8,9,10] did a great deal of research on neutrosophic sets, proposing a new kind of inclusion relation and new operations in SVNSs; furthermore, they discussed the algebraic structure and some applications in algebraic systems.
Information measures are essential to decision-making in information processing, and include the similarity function, the distance or divergence function, entropy, and cross-entropy. These information measures are widely applied in image processing, clustering, and pattern recognition. Liu et al. [11] applied single-value neutrosophic numbers to the Decision-Making Trial and Evaluation Laboratory method, and consequently presented the SVNN-DEMATEL (single-value neutrosophic number Decision-Making Trial and Evaluation Laboratory) model. Mukhametzyanov et al. [12] provided an analysis of some multi-criteria decision-making (MDM) methods and the final selection, and presented a result consistency evaluation model. Tu et al. [13] introduced some simplified neutrosophic symmetry measures and applied them to decision-making. The similarity function is mainly used to measure the level of similarity between two objects. Entropy is usually used to depict the degree of uncertainty of an object, and is very important for measuring uncertain information. Cross-entropy can depict the degree of discrimination between two objects, and we can judge their relation from it; therefore, cross-entropy has many applications in information measures, decision-making, pattern recognition, and so on. Zadeh [14] first proposed the entropy of fuzzy events based on Shannon entropy. Kullback [15] was concerned with an information measure known as "distance" or "divergence", depicting the relationship between two probability distributions; it can therefore serve as an information measure indicating the degree of discrimination. Furthermore, Kullback introduced a new kind of information measure, the "cross-entropy distance" between two probability distributions. De Luca and Termini [16] introduced the notion of fuzzy entropy, together with some axioms to express the fuzziness degree of a fuzzy set, according to Shannon's function. Fuzzy entropy was then generalized to interval-valued fuzzy sets and intuitionistic fuzzy sets by Burillo and Bustince [17]. Szmidt and Kacprzyk [18] defined intuitionistic fuzzy entropy in a new way. Wei et al. [19] proposed interval-valued intuitionistic fuzzy entropy. Since cross-entropy can describe the degree of discrimination between two objects, many researchers have modified cross-entropy measures. For example, Lin [20] proposed a divergence based on Shannon entropy, which is a type of modified fuzzy cross-entropy. Bhandari and Pal [21] introduced the fuzzy divergence between two fuzzy sets. Shang and Jiang [22] put forward a fuzzy cross-entropy and a symmetric discrimination measure, improving on the fuzzy divergence, which can be used to describe the degree of discrimination between two fuzzy sets. Vlachos and Sergiadis [4] presented intuitionistic fuzzy cross-entropy, and also found a connection between fuzzy entropy and intuitionistic fuzzy entropy in terms of fuzziness and intuitionism. Verma [23,24] introduced a divergence measure, an information measure that can depict the degree of discrimination. Cross-entropy measures were generalized to single-valued neutrosophic sets and applied to multi-criteria decision-making by Ye [25]. Şahin [26] then continued to generalize the cross-entropy measure to interval neutrosophic sets, and introduced its application in multi-criteria decision-making.
In our study, we found that the fuzzy discrimination proposed by Bhandari and Pal [21], the improved fuzzy cross-entropy of Shang and Jiang [22], and the neutrosophic cross-entropy introduced by Ye [25] all have properties similar to those of distance measures, such as non-negativity, symmetry, and the property that the cross-entropy (distance) between two fuzzy sets is 0 if and only if the two fuzzy sets are equal. Furthermore, the decision principles of cross-entropy and distance applied in decision-making are the same: among all the choices, we finally choose the one with the smallest cross-entropy (distance) from the ideal one. Based on the above analysis, we set out to study the relationship between cross-entropy and distance measures; to our knowledge, there has been no previous research on this relationship. We mainly prove that the fuzzy discrimination, the improved fuzzy cross-entropy, and the neutrosophic cross-entropy based on discrimination are in fact distance measures, and we present full proofs that they satisfy all the conditions of a distance measure. In Section 2, we introduce some relevant knowledge and prove that the fuzzy discrimination measure satisfies all the conditions of a distance measure, i.e., that it is actually a kind of distance measure. In Section 3, we prove that the fuzzy cross-entropy satisfies all the conditions of a distance measure, and in Section 4 we show that the cross-entropy on single-value neutrosophic sets is also a kind of distance; that is to say, the cross-entropy measure is consistent with distance measures.

2. Fuzzy Discrimination Is Consistent with Distance Measure

Let $X$ be a universe of discourse. A fuzzy set $A$ is characterized by a membership function $\mu_A(x)$, which expresses the degree of belongingness of a point $x \in X$ to the set $A$, with $\mu_A(x) \in [0,1]$ for all $x \in X$. When $\mu_A(x) = 0$ or $\mu_A(x) = 1$ for every $x$, $A$ reduces to a crisp set.
Definition 1
([1]). Let X be a universe of discourse, and let the fuzzy set $\tilde{A}$ defined on $X$ be given as:
$$\tilde{A} = \{ \langle x, \mu_{\tilde{A}}(x) \rangle \mid x \in X \}$$
where $\mu_{\tilde{A}} : X \to [0,1]$, and every point has a membership value expressing its degree of belongingness to the set.
Let $FS(X)$ be the set of all fuzzy sets on $X$. The following are some properties of fuzzy sets: for all $M, N, T \in FS(X)$,
(1) if $M \subseteq N$, then $\mu_M(x) \le \mu_N(x)$ for all $x \in X$;
(2) if $M \subseteq N$ and $N \subseteq M$, then $M = N$;
(3) if $M \subseteq N$ and $N \subseteq T$, then $M \subseteq T$.
Definition 2.
Bhandari [21] proposed the fuzzy discrimination measure, which expresses the degree of discrimination in favor of $M$ against $N$, for $M, N \in FS(X)$, defined as follows:
$$I(M,N) = \sum_{i=1}^{n} \left( \mu_M(x_i) \ln \frac{\mu_M(x_i)}{\mu_N(x_i)} + (1 - \mu_M(x_i)) \ln \frac{1 - \mu_M(x_i)}{1 - \mu_N(x_i)} \right)$$
It is obvious that $I(M,N) \ge 0$, and $I(M,N) = 0$ if and only if $M = N$. Bhandari [21] also defined $E(M,N) = I(M,N) + I(N,M)$ to obtain symmetry.
From the above, the fuzzy discrimination has properties similar to those of a distance measure (except for one axiom of the distance measure), and the two share a common decision principle, the principle of minimum cross-entropy. In other words, there exists an ideal solution $A$, which is generally unattainable in a real situation, so we look for solutions that do exist in the real world, denoted by $B, C, \ldots$, and compute their cross-entropy to $A$. We end up choosing the smallest cross-entropy, and the solution corresponding to it is the optimal solution. The distance measure follows the same principle.
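To make the minimum principle concrete, the following minimal Python sketch (an illustration, not from the original paper) implements Bhandari's discrimination $I$ and its symmetric version $E$, representing a fuzzy set over a finite universe as a NumPy array of membership values, and then selects the alternative closest to a hypothetical ideal solution:

```python
import numpy as np

def discrimination(m, n, eps=1e-12):
    """Bhandari's fuzzy discrimination I(M, N) for membership arrays m, n.
    The eps guard avoids log(0): the measure diverges as memberships of N
    approach 0 or 1, the defect revisited in Section 3."""
    m, n = np.asarray(m, float), np.asarray(n, float)
    return float(np.sum(m * np.log((m + eps) / (n + eps))
                        + (1 - m) * np.log((1 - m + eps) / (1 - n + eps))))

def symmetric_discrimination(m, n):
    """E(M, N) = I(M, N) + I(N, M)."""
    return discrimination(m, n) + discrimination(n, m)

# Minimum principle: choose the alternative with the smallest E to the ideal.
ideal = np.array([0.9, 0.9, 0.9])                 # hypothetical ideal solution A
alternatives = {"B": np.array([0.7, 0.6, 0.8]),   # hypothetical feasible solutions
                "C": np.array([0.9, 0.8, 0.9])}
best = min(alternatives, key=lambda k: symmetric_discrimination(alternatives[k], ideal))
print(best)  # "C" -- the smallest cross-entropy (distance) to the ideal
```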
Definition 3.
A function $D : FS(X) \times FS(X) \to [0,1]$ is called a distance measure on $FS(X)$ if it satisfies the following conditions [10]: for all $A, B, C \in FS(X)$,
(1) $0 \le D(A,B) \le 1$;
(2) $D(A,B) = 0$ if and only if $A = B$;
(3) $D(A,B) = D(B,A)$;
(4) if $A \subseteq B \subseteq C$, then $D(A,C) \ge D(A,B)$ and $D(A,C) \ge D(B,C)$.
Considering that the discrimination measure can be infinite, we redefine the fuzzy distance measure as follows:
Definition 4.
A function $FD : FS(X) \times FS(X) \to [0, \infty)$ is called a fuzzy distance measure if it satisfies the following conditions: for all $A, B, C \in FS(X)$,
(1) $FD(A,B) \ge 0$;
(2) $FD(A,B) = 0$ if and only if $A = B$;
(3) $FD(A,B) = FD(B,A)$;
(4) if $A \subseteq B \subseteq C$, then $FD(A,C) \ge FD(A,B)$ and $FD(A,C) \ge FD(B,C)$.
It is obvious that the symmetric fuzzy discrimination satisfies the first three conditions of the fuzzy distance measure:
(1) $E(M,N) \ge 0$;
(2) $E(M,N) = 0$ if and only if $M = N$;
(3) $E(M,N) = E(N,M)$.
Thus, we just need to verify that the fuzzy discrimination satisfies condition (4) of the fuzzy distance measure.
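Before the formal proof, condition (4) can be sanity-checked numerically. The sketch below (an illustration under the same assumptions as the earlier snippet, whose `symmetric_discrimination` it reuses) samples random triples $m \le n \le t$ and confirms the claimed inequalities:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(10_000):
    # draw m <= n <= t away from the endpoints, where I is finite
    m, n, t = np.sort(rng.uniform(0.01, 0.99, size=3))
    e_mt = symmetric_discrimination([m], [t])
    assert e_mt >= symmetric_discrimination([m], [n]) - 1e-9  # E(m,t) >= E(m,n)
    assert e_mt >= symmetric_discrimination([n], [t]) - 1e-9  # E(m,t) >= E(n,t)
print("condition (4) held on all sampled triples")
```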
Theorem 1.
Let $m, n, t$ be three numbers in $[0,1]$. If $m \le n \le t$, then $E(m,t) \ge E(m,n)$ and $E(m,t) \ge E(n,t)$.
Proof. 
We adopt the conventions $0 \ln 0 = 0$ and $0 \ln \frac{0}{0} = 0$.
$$E(m,n) = I(m,n) + I(n,m) = (m-n)\ln\frac{m}{n} + (n-m)\ln\frac{1-m}{1-n} \tag{1}$$
$$E(n,t) = I(n,t) + I(t,n) = (n-t)\ln\frac{n}{t} + (t-n)\ln\frac{1-n}{1-t} \tag{2}$$
$$E(m,t) = I(m,t) + I(t,m) = (m-t)\ln\frac{m}{t} + (t-m)\ln\frac{1-m}{1-t} \tag{3}$$
Firstly, we need to prove that $(3) \ge (1)$.
$$(3) = \ln\left(\frac{m}{t}\right)^{m-t} + \ln\left(\frac{1-m}{1-t}\right)^{t-m} = \ln\left[\left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\right]$$
$$(1) = \ln\left[\left(\frac{m}{n}\right)^{m-n}\left(\frac{1-m}{1-n}\right)^{n-m}\right]$$
$$(3) - (1) = \ln\left[\left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{n}{m}\right)^{m-n}\left(\frac{1-n}{1-m}\right)^{n-m}\right]$$
Let $f(t) = \left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{n}{m}\right)^{m-n}\left(\frac{1-n}{1-m}\right)^{n-m}$ and write
$$A = \left(\frac{n}{m}\right)^{m-n}\left(\frac{1-n}{1-m}\right)^{n-m}, \quad B = \left(\frac{m}{t}\right)^{m-t}, \quad C = \left(\frac{1-m}{1-t}\right)^{t-m}.$$
Then $f(t) = ABC$ and $f'(t) = A(B'C + BC')$, where
$$B'_t = \left(e^{(m-t)\ln\frac{m}{t}}\right)' = B\left(-\ln\frac{m}{t} - \frac{m-t}{t}\right)$$
$$C'_t = \left(e^{(t-m)\ln\frac{1-m}{1-t}}\right)' = C\left(\ln\frac{1-m}{1-t} + \frac{t-m}{1-t}\right)$$
$$f'(t) = ABC\left(-\ln\frac{m}{t} - \frac{m-t}{t} + \ln\frac{1-m}{1-t} + \frac{t-m}{1-t}\right)$$
It is obvious that $A, B, C \ge 0$.
Since $0 \le m \le n \le t$, we have $\frac{m}{t} \le 1$, so $\ln\frac{m}{t} \le 0$, and $\frac{1-m}{1-t} \ge 1$, so $\ln\frac{1-m}{1-t} \ge 0$; moreover, $-\frac{m-t}{t} \ge 0$ and $\frac{t-m}{1-t} \ge 0$. That is, $f'(t) \ge 0$.
When $t = n$, $f(t) = f(n) = 1$; when $t > n$, $f(t) \ge f(n) = 1$, so $\ln f(t) \ge 0$.
That is to say, $(3) - (1) \ge 0$, i.e., $E(m,t) \ge E(m,n)$ has been obtained.
From here, we continue to prove that $(3) \ge (2)$.
$$(2) = \ln\left[\left(\frac{n}{t}\right)^{n-t}\left(\frac{1-n}{1-t}\right)^{t-n}\right]$$
$$(3) - (2) = \ln\left[\left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{t}{n}\right)^{n-t}\left(\frac{1-t}{1-n}\right)^{t-n}\right]$$
Let $g(m) = \left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{t}{n}\right)^{n-t}\left(\frac{1-t}{1-n}\right)^{t-n}$ and write
$$D = \left(\frac{t}{n}\right)^{n-t}\left(\frac{1-t}{1-n}\right)^{t-n}, \quad B = \left(\frac{m}{t}\right)^{m-t}, \quad C = \left(\frac{1-m}{1-t}\right)^{t-m}.$$
Then $g(m) = DBC$ and $g'(m) = D(B'C + BC')$, where
$$B'_m = \left(e^{(m-t)\ln\frac{m}{t}}\right)' = B\left(\ln\frac{m}{t} + \frac{m-t}{m}\right)$$
$$C'_m = \left(e^{(t-m)\ln\frac{1-m}{1-t}}\right)' = C\left(-\ln\frac{1-m}{1-t} + \frac{m-t}{1-m}\right)$$
$$g'(m) = DBC\left(\ln\frac{m}{t} + \frac{m-t}{m} - \ln\frac{1-m}{1-t} + \frac{m-t}{1-m}\right)$$
It is obvious that $D \ge 0$.
Since $0 \le m \le n \le t$, we have $\frac{m}{t} \le 1$, so $\ln\frac{m}{t} \le 0$, and $\frac{1-m}{1-t} \ge 1$, so $-\ln\frac{1-m}{1-t} \le 0$; moreover, $\frac{m-t}{m} \le 0$ and $\frac{m-t}{1-m} \le 0$. That is, $g'(m) \le 0$.
When $m = n$, $g(m) = g(n) = 1$; when $m < n$, $g(m) \ge g(n) = 1$, so $\ln g(m) \ge 0$.
That is to say, $(3) - (2) \ge 0$, i.e., $E(m,t) \ge E(n,t)$ has been obtained. This completes the proof of the theorem. □
Theorem 2.
Let $X$ be a universe of discourse, $M, N, T \in FS(X)$. If $M \subseteq N \subseteq T$, then $E(M,T) \ge E(M,N)$ and $E(M,T) \ge E(N,T)$.
$$E(M,N) = I(M,N) + I(N,M) = \sum_{i=1}^{n}\left[(\mu_M(x_i)-\mu_N(x_i))\ln\frac{\mu_M(x_i)}{\mu_N(x_i)} + (\mu_N(x_i)-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\mu_N(x_i)}\right] \tag{4}$$
$$E(N,T) = I(N,T) + I(T,N) = \sum_{i=1}^{n}\left[(\mu_N(x_i)-\mu_T(x_i))\ln\frac{\mu_N(x_i)}{\mu_T(x_i)} + (\mu_T(x_i)-\mu_N(x_i))\ln\frac{1-\mu_N(x_i)}{1-\mu_T(x_i)}\right] \tag{5}$$
$$E(M,T) = I(M,T) + I(T,M) = \sum_{i=1}^{n}\left[(\mu_M(x_i)-\mu_T(x_i))\ln\frac{\mu_M(x_i)}{\mu_T(x_i)} + (\mu_T(x_i)-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\mu_T(x_i)}\right] \tag{6}$$
From here, we need to prove that $(6) \ge (4)$ and $(6) \ge (5)$.
Since $M \subseteq N \subseteq T$, we have $\mu_M(x_i) \le \mu_N(x_i) \le \mu_T(x_i)$, with $\mu(x_i) \in [0,1]$ for every $x_i \in X$.
By Theorem 1, the inequalities hold for each individual membership value, and summing over $i$ preserves them, so the proof follows directly from Theorem 1.
Example 1.
Let $X$ be a universe of discourse and $M, N, T \in FS(X)$, where $M = \{\langle x, 0.5\rangle \mid x \in X\}$, $N = \{\langle x, 0.7\rangle \mid x \in X\}$, $T = \{\langle x, 0.9\rangle \mid x \in X\}$. Clearly, $M \subseteq N \subseteq T$, and we can get $E(M,N) = 0.1695$, $E(N,T) = 0.2700$, $E(M,T) = 0.8789$; that is, $E(M,T) \ge E(M,N)$ and $E(M,T) \ge E(N,T)$.
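These values can be reproduced with the `symmetric_discrimination` sketch above, treating each set as a single membership value since the memberships are constant over $X$:

```python
m, n, t = [0.5], [0.7], [0.9]
print(round(symmetric_discrimination(m, n), 4),  # 0.1695
      round(symmetric_discrimination(n, t), 4),  # 0.27
      round(symmetric_discrimination(m, t), 4))  # 0.8789
```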
Theorem 3.
The above-defined symmetric fuzzy discrimination is a fuzzy distance measure.
That is, by the above theorems, the symmetric fuzzy discrimination defined in Definition 2 is consistent with the distance measure.

3. Fuzzy Cross-Entropy Is Consistent with Distance Measure

Bhandari and Pal [21] pointed out that the fuzzy discrimination has a defect: when $\mu_N(x_i)$ approaches 0 or 1, its value tends to infinity. It was therefore modified on the basis of the directed divergence proposed by Lin [20], and further modified by Shang and Jiang [22] as follows:
Definition 5
([22]). Let $M, N \in FS(X)$. The fuzzy cross-entropy is defined as:
$$I_2(M,N) = \sum_{i=1}^{n}\left(\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\frac12(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\frac12(\mu_M(x_i)+\mu_N(x_i))}\right) \tag{7}$$
This measure is well-defined for every value of $\mu(x_i)$, and it expresses the degree of discrimination of $M$ from $N$.
It has the same properties as the discrimination measure above: $I_2(M,N) \ge 0$, and $I_2(M,N) = 0$ if and only if $M = N$.
Let $E_2(M,N) = I_2(M,N) + I_2(N,M)$; then symmetry is satisfied. Thus, we mainly consider whether the fuzzy cross-entropy $E_2(M,N)$ satisfies condition (4) of the distance measure.
$$E_2(M,N) = \sum_{i=1}^{n}\left(\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\frac12(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\frac12(\mu_M(x_i)+\mu_N(x_i))} + \mu_N(x_i)\ln\frac{\mu_N(x_i)}{\frac12(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_N(x_i))\ln\frac{1-\mu_N(x_i)}{1-\frac12(\mu_M(x_i)+\mu_N(x_i))}\right)$$
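A direct transcription of $I_2$ and $E_2$ in the style of the earlier sketches might look as follows; the helper enforces the convention $0\ln 0 = 0$, and no further guard is needed because the averaged denominators vanish only when the corresponding numerators do:

```python
import numpy as np

def xlog_ratio(p, q):
    """p * ln(p/q) elementwise, with the convention 0 * ln(0/q) = 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    out = np.zeros_like(p)
    pos = p > 0              # wherever p > 0, q >= p/2 > 0 holds below
    out[pos] = p[pos] * np.log(p[pos] / q[pos])
    return out

def i2(m, n):
    """Shang-Jiang fuzzy cross-entropy I2(M, N) of Definition 5."""
    m, n = np.asarray(m, float), np.asarray(n, float)
    avg = 0.5 * (m + n)
    return float(np.sum(xlog_ratio(m, avg) + xlog_ratio(1 - m, 1 - avg)))

def e2(m, n):
    """Symmetric fuzzy cross-entropy E2(M, N) = I2(M, N) + I2(N, M)."""
    return i2(m, n) + i2(n, m)
```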
Theorem 4.
Let $m, n, t$ be three numbers in $[0,1]$. If $m \le n \le t$, then $E_2(m,t) \ge E_2(m,n)$ and $E_2(m,t) \ge E_2(n,t)$.
Proof. 
$$E_2(m,n) = I_2(m,n) + I_2(n,m) = m\ln\frac{m}{\frac12(m+n)} + (1-m)\ln\frac{1-m}{1-\frac12(m+n)} + n\ln\frac{n}{\frac12(m+n)} + (1-n)\ln\frac{1-n}{1-\frac12(m+n)} \tag{8}$$
$$E_2(n,t) = I_2(n,t) + I_2(t,n) = n\ln\frac{n}{\frac12(n+t)} + (1-n)\ln\frac{1-n}{1-\frac12(n+t)} + t\ln\frac{t}{\frac12(n+t)} + (1-t)\ln\frac{1-t}{1-\frac12(n+t)} \tag{9}$$
$$E_2(m,t) = I_2(m,t) + I_2(t,m) = m\ln\frac{m}{\frac12(m+t)} + (1-m)\ln\frac{1-m}{1-\frac12(m+t)} + t\ln\frac{t}{\frac12(m+t)} + (1-t)\ln\frac{1-t}{1-\frac12(m+t)} \tag{10}$$
We first prove that $(10) \ge (8)$. We adopt the conventions $0\ln 0 = 0$ and $0\ln\frac{0}{0} = 0$.
$$(10) = \ln\left(\frac{m}{\frac12(m+t)}\right)^m + \ln\left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m} + \ln\left(\frac{t}{\frac12(m+t)}\right)^t + \ln\left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t}$$
$$= \ln\left[\left(\frac{m}{\frac12(m+t)}\right)^m \left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m} \left(\frac{t}{\frac12(m+t)}\right)^t \left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t}\right]$$
$$(8) = \ln\left[\left(\frac{m}{\frac12(m+n)}\right)^m \left(\frac{1-m}{1-\frac12(m+n)}\right)^{1-m} \left(\frac{n}{\frac12(m+n)}\right)^n \left(\frac{1-n}{1-\frac12(m+n)}\right)^{1-n}\right]$$
$$(10) - (8) = \ln\left[\left(\frac{m+n}{m+t}\right)^m \left(\frac{2t}{m+t}\right)^t \left(\frac{m+n}{2n}\right)^n \left(\frac{1-\frac12(m+n)}{1-\frac12(m+t)}\right)^{1-m} \left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t} \left(\frac{1-\frac12(m+n)}{1-n}\right)^{1-n}\right]$$
Let $f(t) = \left(\frac{m+n}{m+t}\right)^m \left(\frac{2t}{m+t}\right)^t \left(\frac{m+n}{2n}\right)^n \left(\frac{1-\frac12(m+n)}{1-\frac12(m+t)}\right)^{1-m} \left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t} \left(\frac{1-\frac12(m+n)}{1-n}\right)^{1-n}$ and write
$$A = \left(\frac{m+n}{2n}\right)^n \left(\frac{1-\frac12(m+n)}{1-n}\right)^{1-n}, \; B = \left(\frac{m+n}{m+t}\right)^m, \; C = \left(\frac{2t}{m+t}\right)^t, \; D = \left(\frac{1-\frac12(m+n)}{1-\frac12(m+t)}\right)^{1-m}, \; E = \left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t}.$$
Then $f(t) = ABCDE$ and $f'(t) = A(B'CDE + BC'DE + BCD'E + BCDE')$, where
$$B'_t = \left(e^{m\ln\frac{m+n}{m+t}}\right)' = B\left(-\frac{m}{m+t}\right)$$
$$C'_t = \left(e^{t\ln\frac{2t}{m+t}}\right)' = C\left(\ln\frac{2t}{m+t} + \frac{m}{m+t}\right)$$
$$D'_t = \left(e^{(1-m)\ln\frac{1-\frac12(m+n)}{1-\frac12(m+t)}}\right)' = D\left(\frac{1-m}{2-(m+t)}\right)$$
$$E'_t = \left(e^{(1-t)\ln\frac{1-t}{1-\frac12(m+t)}}\right)' = E\left(-\ln\frac{1-t}{1-\frac12(m+t)} + \frac{m-1}{2-(m+t)}\right)$$
$$f'(t) = ABCDE\left(\ln\frac{2t}{m+t} - \ln\frac{1-t}{1-\frac12(m+t)}\right)$$
It is clear that $A, B, C, D, E \ge 0$. Since $0 \le m \le n \le t$, we have $2t \ge m+t$, so $\ln\frac{2t}{m+t} \ge 0$, and $1-t \le 1-\frac12(m+t)$, so $\ln\frac{1-t}{1-\frac12(m+t)} \le 0$. That is, $f'(t) \ge 0$. When $t = n$, $f(t) = f(n) = 1$; when $t > n$, $f(t) \ge f(n) = 1$, so $\ln f(t) \ge 0$.
That is to say, $(10) - (8) \ge 0$, meaning that $E_2(m,t) \ge E_2(m,n)$ has been obtained.
From here, we continue to prove that $(10) \ge (9)$.
$$(9) = \ln\left[\left(\frac{n}{\frac12(n+t)}\right)^n \left(\frac{1-n}{1-\frac12(n+t)}\right)^{1-n} \left(\frac{t}{\frac12(n+t)}\right)^t \left(\frac{1-t}{1-\frac12(n+t)}\right)^{1-t}\right]$$
$$(10) - (9) = \ln\left[\left(\frac{n+t}{m+t}\right)^t \left(\frac{2m}{m+t}\right)^m \left(\frac{n+t}{2n}\right)^n \left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m} \left(\frac{1-\frac12(n+t)}{1-\frac12(m+t)}\right)^{1-t} \left(\frac{1-\frac12(n+t)}{1-n}\right)^{1-n}\right]$$
Let $g(m) = \left(\frac{n+t}{m+t}\right)^t \left(\frac{2m}{m+t}\right)^m \left(\frac{n+t}{2n}\right)^n \left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m} \left(\frac{1-\frac12(n+t)}{1-\frac12(m+t)}\right)^{1-t} \left(\frac{1-\frac12(n+t)}{1-n}\right)^{1-n}$ and write
$$M = \left(\frac{1-\frac12(n+t)}{1-n}\right)^{1-n} \left(\frac{n+t}{2n}\right)^n, \; N = \left(\frac{2m}{m+t}\right)^m, \; P = \left(\frac{n+t}{m+t}\right)^t, \; Q = \left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m}, \; S = \left(\frac{1-\frac12(n+t)}{1-\frac12(m+t)}\right)^{1-t}.$$
Then $g(m) = MNPQS$ and $g'(m) = M(N'PQS + NP'QS + NPQ'S + NPQS')$, where
$$N'_m = N\left(\ln\frac{2m}{m+t} + \frac{t}{m+t}\right)$$
$$P'_m = P\left(-\frac{t}{m+t}\right)$$
$$Q'_m = Q\left(-\ln\frac{1-m}{1-\frac12(m+t)} + \frac{t-1}{2-(m+t)}\right)$$
$$S'_m = S\left(\frac{1-t}{2-(m+t)}\right)$$
$$g'(m) = MNPQS\left(\ln\frac{2m}{m+t} - \ln\frac{1-m}{1-\frac12(m+t)}\right)$$
It is obvious that $M, N, P, Q, S \ge 0$. Since $0 \le m \le n \le t$, we have $2m \le m+t$, so $\ln\frac{2m}{m+t} \le 0$, and $1-m \ge 1-\frac12(m+t)$, so $\ln\frac{1-m}{1-\frac12(m+t)} \ge 0$; that is, $g'(m) \le 0$. When $m = n$, $g(m) = g(n) = 1$; when $m < n$, $g(m) \ge g(n) = 1$, so $\ln g(m) \ge 0$. That is to say, $(10) - (9) \ge 0$, meaning that $E_2(m,t) \ge E_2(n,t)$ has been obtained. This completes the proof of the theorem. □
Theorem 5.
Let $X$ be a universe of discourse, $M, N, T \in FS(X)$. If $M \subseteq N \subseteq T$, then $E_2(M,T) \ge E_2(M,N)$ and $E_2(M,T) \ge E_2(N,T)$.
$$E_2(M,N) = I_2(M,N) + I_2(N,M) = \sum_{i=1}^{n}\left(\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\frac12(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\frac12(\mu_M(x_i)+\mu_N(x_i))} + \mu_N(x_i)\ln\frac{\mu_N(x_i)}{\frac12(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_N(x_i))\ln\frac{1-\mu_N(x_i)}{1-\frac12(\mu_M(x_i)+\mu_N(x_i))}\right)$$
with $E_2(N,T) = I_2(N,T) + I_2(T,N)$ and $E_2(M,T) = I_2(M,T) + I_2(T,M)$ given analogously by replacing $(\mu_M, \mu_N)$ with $(\mu_N, \mu_T)$ and $(\mu_M, \mu_T)$, respectively.
Since $M \subseteq N \subseteq T$ gives $\mu_M(x_i) \le \mu_N(x_i) \le \mu_T(x_i)$ for every $x_i$, the proof follows easily from Theorem 4 applied to each membership value.
Example 2.
Let $X$ be a universe of discourse and $M, N, T \in FS(X)$, where $M = \{\langle x, 0.5\rangle \mid x \in X\}$, $N = \{\langle x, 0.7\rangle \mid x \in X\}$, $T = \{\langle x, 0.9\rangle \mid x \in X\}$. Clearly, $M \subseteq N \subseteq T$, and we can get $E_2(M,N) = 0.0420$, $E_2(N,T) = 0.0648$, $E_2(M,T) = 0.2035$; that is, $E_2(M,T) \ge E_2(M,N)$ and $E_2(M,T) \ge E_2(N,T)$.
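The `e2` sketch above reproduces these figures (up to rounding in the last digit):

```python
m, n, t = [0.5], [0.7], [0.9]
print(e2(m, n), e2(n, t), e2(m, t))
# ~0.0420, ~0.0649, ~0.2035 (the text's 0.0648 truncates 0.06486)
```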
Theorem 6.
The above-defined symmetric fuzzy cross-entropy is a kind of distance measure.

4. Neutrosophic Cross-Entropy Is a Distance Measure

Smarandache [3,27] first proposed the definition of a neutrosophic set, which is an extension of the intuitionistic fuzzy set (IFS) and the interval-valued intuitionistic fuzzy set, as follows:
Definition 6
([3]). Let $X$ be a universe of discourse. A neutrosophic set $A$ in $X$ is characterized by a truth-membership function $T_A(x)$, an indeterminacy-membership function $I_A(x)$, and a falsity-membership function $F_A(x)$, where $T_A(x), I_A(x), F_A(x) : X \to \,]^-0, 1^+[$.
There is no restriction on the sum of $T_A(x)$, $I_A(x)$, and $F_A(x)$, so $^-0 \le \sup T_A(x) + \sup I_A(x) + \sup F_A(x) \le 3^+$.
Wang et al. [5] introduced the definition of the single-value neutrosophic set (SVNS) for better application in engineering fields. The SVNS is an extension of the IFS, and it provides another way to express and process uncertain, incomplete, and inconsistent information in the real world.
Definition 7
([5]). Let $X$ be a space of points. A single-value neutrosophic set $A$ in $X$ is characterized by a truth-membership function $T_A(x)$, an indeterminacy-membership function $I_A(x)$, and a falsity-membership function $F_A(x)$, where $T_A(x), I_A(x), F_A(x) \in [0,1]$ for each point $x$ in $X$. Therefore, an SVNS $A$ can be denoted by:
$$A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle \mid x \in X\}$$
There is no restriction on the sum of $T_A(x)$, $I_A(x)$, and $F_A(x)$, so $0 \le \sup T_A(x) + \sup I_A(x) + \sup F_A(x) \le 3$.
The following are some properties of SVNSs. Let $X$ be a universe of discourse, let $SVNS(X)$ be the set of all single-value neutrosophic sets on $X$, and let $M, N, T \in SVNS(X)$:
(1) $M \subseteq N$ if and only if $T_M(x) \le T_N(x)$, $I_M(x) \ge I_N(x)$, and $F_M(x) \ge F_N(x)$ for every $x$ in $X$ [5];
(2) $M = N$ if and only if $M \subseteq N$ and $N \subseteq M$ [5];
(3) if $M \subseteq N$ and $N \subseteq T$, then $M \subseteq T$.
Ye [25] first generalized the fuzzy cross-entropy measure to SVNSs. The information measure of a neutrosophic set is composed of the information measures of the truth-membership, indeterminacy-membership, and falsity-membership functions. Let $M, N \in SVNS(X)$; on the basis of the definition of the fuzzy cross-entropy $I_2(M,N)$, Ye introduced the discrimination information of $T_M(x_i)$ from $T_N(x_i)$ ($i = 1, 2, \ldots, n$) as follows:
$$I_{2T}(M,N) = \sum_{i=1}^{n}\left(T_M(x_i)\ln\frac{T_M(x_i)}{\frac12(T_M(x_i)+T_N(x_i))} + (1-T_M(x_i))\ln\frac{1-T_M(x_i)}{1-\frac12(T_M(x_i)+T_N(x_i))}\right)$$
We can define the following information in terms of the indeterminacy-membership function and the falsity-membership function in the same way:
$$I_{2I}(M,N) = \sum_{i=1}^{n}\left(I_M(x_i)\ln\frac{I_M(x_i)}{\frac12(I_M(x_i)+I_N(x_i))} + (1-I_M(x_i))\ln\frac{1-I_M(x_i)}{1-\frac12(I_M(x_i)+I_N(x_i))}\right)$$
$$I_{2F}(M,N) = \sum_{i=1}^{n}\left(F_M(x_i)\ln\frac{F_M(x_i)}{\frac12(F_M(x_i)+F_N(x_i))} + (1-F_M(x_i))\ln\frac{1-F_M(x_i)}{1-\frac12(F_M(x_i)+F_N(x_i))}\right)$$
Definition 8
([25]). The single-value neutrosophic cross-entropy of $M$ and $N$, where $M, N \in SVNS(X)$, is defined as follows:
$$I_3(M,N) = I_{2T}(M,N) + I_{2I}(M,N) + I_{2F}(M,N)$$
It can be used to express the degree of difference of $M$ from $N$. According to Shannon's inequality, it is clear that $I_3(M,N) \ge 0$, and $I_3(M,N) = 0$ if and only if $M = N$; that is, $T_M(x_i) = T_N(x_i)$, $I_M(x_i) = I_N(x_i)$, and $F_M(x_i) = F_N(x_i)$ for any $x_i \in X$. The neutrosophic cross-entropy can then be made symmetric as $E_3(M,N) = I_3(M,N) + I_3(N,M)$.
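Continuing the earlier sketches, $E_3$ is just the componentwise sum of the symmetric fuzzy cross-entropy `e2` over the three membership functions; an SVNS over a finite universe is represented here as a $(T, I, F)$ triple of arrays:

```python
def e3(m, n):
    """Symmetric neutrosophic cross-entropy E3(M, N) = I3(M, N) + I3(N, M);
    m and n are (T, I, F) triples of membership arrays."""
    return sum(e2(mc, nc) for mc, nc in zip(m, n))
```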
Theorem 7.
Let $X$ be a universe of discourse, $M, N, T \in SVNS(X)$. If $M \subseteq N \subseteq T$, then $E_3(M,T) \ge E_3(M,N)$ and $E_3(M,T) \ge E_3(N,T)$.
According to the proof of Theorem 4, we can easily find that $E_{2T}(M,T) \ge E_{2T}(M,N)$ and $E_{2T}(M,T) \ge E_{2T}(N,T)$. In a similar way (the indeterminacy and falsity memberships decrease under inclusion, so Theorem 4 applies with the roles of the endpoints exchanged, using the symmetry of $E_2$), $E_{2I}(M,T) \ge E_{2I}(M,N)$, $E_{2I}(M,T) \ge E_{2I}(N,T)$, $E_{2F}(M,T) \ge E_{2F}(M,N)$, and $E_{2F}(M,T) \ge E_{2F}(N,T)$. Summing the three components gives $E_3(M,T) \ge E_3(M,N)$ and $E_3(M,T) \ge E_3(N,T)$, which completes the proof.
Example 3.
Let $X$ be a universe of discourse and $M, N, T \in SVNS(X)$, where $M = \{\langle x, 0.5, 0.3, 0.7\rangle \mid x \in X\}$, $N = \{\langle x, 0.7, 0.2, 0.5\rangle \mid x \in X\}$, $T = \{\langle x, 0.8, 0.1, 0.1\rangle \mid x \in X\}$.
It is clear that $M \subseteq N \subseteq T$, and we can obtain:
$E_{2T}(M,N) = 0.0420$, $E_{2T}(N,T) = 0.0134$, $E_{2T}(M,T) = 0.1013$;
that is, $E_{2T}(M,T) \ge E_{2T}(M,N)$ and $E_{2T}(M,T) \ge E_{2T}(N,T)$.
$E_{2I}(M,N) = 0.0134$, $E_{2I}(N,T) = 0.0199$, $E_{2I}(M,T) = 0.0648$;
that is, $E_{2I}(M,T) \ge E_{2I}(M,N)$ and $E_{2I}(M,T) \ge E_{2I}(N,T)$.
$E_{2F}(M,N) = 0.0420$, $E_{2F}(N,T) = 0.2035$, $E_{2F}(M,T) = 0.4101$;
that is, $E_{2F}(M,T) \ge E_{2F}(M,N)$ and $E_{2F}(M,T) \ge E_{2F}(N,T)$.
$E_3(M,T) = E_{2T}(M,T) + E_{2I}(M,T) + E_{2F}(M,T) = 0.5762$, $E_3(M,N) = 0.0974$, $E_3(N,T) = 0.2368$.
Thus, $E_3(M,T) \ge E_3(M,N)$ and $E_3(M,T) \ge E_3(N,T)$.
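The `e3` sketch reproduces Example 3:

```python
M = ([0.5], [0.3], [0.7])   # (T, I, F)
N = ([0.7], [0.2], [0.5])
T = ([0.8], [0.1], [0.1])
print(e3(M, N), e3(N, T), e3(M, T))
# ~0.0974, ~0.2368, ~0.5763 (the text's 0.5762 sums the rounded components)
```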
Theorem 8.
The above-defined symmetric neutrosophic cross-entropy is a distance measure.

5. Conclusions

Distance measures and cross-entropy share similar properties, such as non-negativity and symmetry, and the cross-entropy (distance) between two fuzzy sets is 0 if and only if the two sets coincide. We also found that the decision principle of cross-entropy is consistent with that of distance measures in decision-making: among all the choices, we finally choose the one with the smallest cross-entropy (distance) from the ideal solution. Based on this analysis, we studied their relationship. In this paper, we mainly proved that the fuzzy discrimination, the improved fuzzy cross-entropy, and the neutrosophic cross-entropy based on fuzzy discrimination are in fact distance measures. That is to say, the symmetric cross-entropy discussed in this paper is consistent with the distance measure. Because the cross-entropy formulas are composed of logarithmic functions, their calculation is complicated; in the future, we will try to simplify the formulas and propose improvements.

Author Contributions

All authors have contributed equally to this paper.

Funding

This work has been supported by the National Natural Science Foundation of China (Grant No. 61473239).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef]
  2. Atanassov, K. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  3. Smarandache, F. Neutrosophy: Neutrosophic Probability, Set, and Logic; American Research Press: Rehoboth, DE, USA, 1998. [Google Scholar]
  4. Vlachos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information-applications to pattern recognition. Pattern Recognit. Lett. 2007, 28, 197–206. [Google Scholar] [CrossRef]
  5. Wang, H.; Smarandache, F.; Zhang, Y.Q.; Sunderraman, R. Single Valued Neutrosophic Sets. Multispace Multistructure 2010, 4, 410–413. [Google Scholar]
  6. Zhang, X.H. Fuzzy anti-grouped filters and fuzzy normal filters in pseudo-BCI algebras. J. Intell. Fuzzy Syst. 2017, 33, 1767–1774. [Google Scholar] [CrossRef]
  7. Zhang, X.H.; Ma, Y.C.; Smarandache, F.; Dai, J.H. Neutrosophic regular filters and fuzzy regular filters in pseudo-BCI algebras. Neutrosophic Sets Syst. 2017, 17, 10–15. [Google Scholar]
  8. Zhang, X.H.; Bo, C.X.; Smarandache, F.; Dai, J.H. New inclusion relation of neutrosophic sets with applications and related lattice structure. Int. J. Mach. Learn. Cybern. 2018, 9, 1753–1763. [Google Scholar] [CrossRef]
  9. Zhang, X.H.; Bo, C.X.; Smarandache, F.; Park, C. New operations of totally dependent-neutrosophic sets and totally dependent-neutrosophic soft sets. Symmetry 2018, 10, 187. [Google Scholar] [CrossRef]
  10. Zhang, X.H.; Yu, P.; Smarandache, F.; Park, C. Redefined neutrosophic filters in BE-algebras. Ital. J. Pure Appl. Math. 2019, in press. [Google Scholar]
  11. Liu, F.; Guan, A.W.; Lukovac, V.; Vukić, M. A multicriteria model for the selection of the transport service provider: A single valued neutrosophic DEMATEL multicriteria model. Decis. Mak. Appl. Manag. Eng. 2018, 1, 121–130. [Google Scholar] [CrossRef]
  12. Mukhametzyanov, I.; Pamučar, D. A sensitivity analysis in MCDM problems: A statistical approach. Decis. Mak. Appl. Manag. Eng. 2018, 1, 51–80. [Google Scholar] [CrossRef]
  13. Tu, A.; Ye, J.; Wang, B. Symmetry Measures of Simplified Neutrosophic Sets for Multiple Attribute Decision-Making Problems. Symmetry 2018, 10, 144. [Google Scholar] [CrossRef]
  14. Zadeh, L.A. Probability measures of fuzzy events. J. Math. Anal. Appl. 1968, 23, 421–427. [Google Scholar] [CrossRef]
  15. Kullback, S. Information Theory and Statistics; Dover Publications: New York, NY, USA, 1997. [Google Scholar]
  16. De Luca, A.S.; Termini, S. A definition of nonprobabilistic entropy in the setting of fuzzy sets theory. Inf. Control 1972, 20, 301–312. [Google Scholar] [CrossRef]
  17. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 1996, 78, 305–316. [Google Scholar] [CrossRef]
  18. Szmidt, E.; Kacprzyk, J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477. [Google Scholar] [CrossRef]
  19. Wei, C.P.; Wang, P.; Zhang, Y.Z. Entropy, similarity measure of interval-valued intuitionistic fuzzy sets and their applications. Inf. Sci. 2011, 181, 4273–4286. [Google Scholar] [CrossRef]
  20. Lin, J. Divergence measures based on Shannon entropy. IEEE Trans. Inf. Theory 1991, 37, 145–151. [Google Scholar] [CrossRef]
  21. Bhandari, D.; Pal, N.R. Some new information measures for fuzzy sets. Inf. Sci. 1993, 67, 209–228. [Google Scholar] [CrossRef]
  22. Shang, X.G.; Jiang, W.S. A note on fuzzy information measures. Pattern Recognit. Lett. 1997, 18, 425–432. [Google Scholar] [CrossRef]
  23. Verma, R. On generalized fuzzy divergence measure and their application to multicriteria decision-making. J. Comb. Inf. Syst. Sci. 2014, 39, 191–213. [Google Scholar]
  24. Verma, R.; Sharma, B.D. On generalized intuitionistic fuzzy relative information and their properties. J. Uncertain Syst. 2012, 6, 308–320. [Google Scholar]
  25. Ye, J. Single valued neutrosophic cross-entropy for multicriteria decision making problems. Appl. Math. Model. 2014, 38, 1170–1175. [Google Scholar] [CrossRef]
  26. Şahin, R. Cross-entropy measure on interval neutrosophic sets and its applications in multicriteria decision making. Neural Comput. Appl. 2017, 28, 1177–1187. [Google Scholar] [CrossRef]
  27. Smarandache, F. A Unifying Field in Logics. Neutrosophy: Neutrosophic Probability, Set and Logic; American Research Press: Rehoboth, DE, USA, 1999. [Google Scholar]
