1. Introduction
In the real world, there is a great deal of uncertain, imprecise, and incomplete information, and many tools have accordingly been developed to handle it. Zadeh [1] first proposed the concept of a fuzzy set, which is defined by a membership function depicting the membership value of an object to a set. Atanassov [2] proposed the intuitionistic fuzzy set (IFS), described by two functions: a membership function depicting the membership value, and a non-membership function depicting the non-membership value of an object to the intuitionistic fuzzy set. The intuitionistic fuzzy set extends the fuzzy set by adding a non-membership function, and it provides a more flexible mathematical framework for processing uncertain, imprecise, and incomplete information. Smarandache [3] first proposed the notions of neutrosophy and the neutrosophic set in 1998. A neutrosophic set is defined by a truth-membership function, an indeterminacy-membership function, and a falsity-membership function, so each element carries a membership value, an indeterminacy-membership value, and a non-membership value. Neutrosophic set theory has been successfully applied in the image-processing field by Vlachos and Sergiadis [4]. Subsequently, Wang et al. [5] put forward the definition of the single-value neutrosophic set (SVNS) and some of its operations for better application in real scientific and engineering fields. The single-value neutrosophic set is an extension of the fuzzy set, and SVNS theory provides a more convenient tool for processing uncertain information. Recently, some researchers have devoted themselves to the study of single-value neutrosophic set theory and its applications, and have achieved successful results in several fields. Zhang et al. [6,7,8,9,10] did extensive research on neutrosophic sets, proposed a new kind of inclusion relation and new operations on SVNSs, and also discussed the algebraic structure and some applications in algebraic systems.
Information measures, including the similarity function, the distance or divergence function, entropy, and cross-entropy, are essential to decision-making in information processing, and are widely applicable in fields such as image processing, clustering, and pattern recognition. Liu et al. [11] applied single-value neutrosophic numbers to the Decision-Making Trial and Evaluation Laboratory method, and consequently presented the SVNN-DEMATEL (single-value neutrosophic number Decision-Making Trial and Evaluation Laboratory) model. Mukhametzyanov et al. [12] provided an analysis of some multi-criteria decision-making (MDM) methods and the final selection, and presented a result-consistency evaluation model. Tu et al. [13] introduced some simplified neutrosophic symmetry measures and applied them to decision-making. The similarity function is mainly used to measure the level of similarity between two objects. Entropy usually depicts the degree of uncertainty of one object, and is very important for measuring uncertain information. Cross-entropy depicts the degree of discrimination between two objects, from which we can judge their relation; therefore, cross-entropy has many applications in information measures, decision-making, pattern recognition, and so on. Zadeh [14] first proposed the entropy of fuzzy events based on Shannon entropy. Kullback [15] was concerned with an information measure known as "distance" or "divergence", depicting the relationship between two probability distributions; it can therefore serve as an information measure indicating the degree of discrimination. Furthermore, a new kind of information measure called the "cross-entropy distance" between two probability distributions was introduced by Kullback. De Luca and Termini [16] introduced the notion of fuzzy entropy, together with some axioms expressing the fuzziness degree of a fuzzy set, according to Shannon's function. Fuzzy entropy was then generalized to interval-valued fuzzy sets and intuitionistic fuzzy sets by Burillo and Bustince [17]. Szmidt [18] defined intuitionistic fuzzy entropy in a new way. Wei et al. [19] proposed interval-valued intuitionistic fuzzy entropy. Since cross-entropy can depict the degree of discrimination between two objects, many researchers have modified cross-entropy measures. For example, Lin [20] proposed a divergence based on Shannon entropy, which is a type of modified fuzzy cross-entropy. Bhandari [21] introduced the fuzzy divergence between two fuzzy sets. Shang et al. [22] put forward a fuzzy cross-entropy and a symmetric discrimination measure, which improved on the fuzzy divergence and can be used to describe the degree of discrimination between two fuzzy sets. Vlachos and Sergiadis [4] presented intuitionistic fuzzy cross-entropy, and also found a connection between fuzzy entropy and intuitionistic fuzzy entropy in terms of fuzziness and intuitionism. Verma [23,24] introduced divergence measures, which are information measures that can depict the degree of discrimination. Cross-entropy measures were generalized to single-value neutrosophic sets and applied to multi-criteria decision-making by Ye [25]. Since then, Şahin [26] has generalized the cross-entropy measure to interval neutrosophic sets and introduced its application in multi-criteria decision-making.
In our study, we found that the fuzzy discrimination proposed by Bhandari [21], the improved fuzzy cross-entropy of Shang [22], and the neutrosophic cross-entropy introduced by Ye [25] all have properties similar to those of distance measures, such as non-negativity, symmetry, and the property that the cross-entropy (distance) between two fuzzy sets is 0 if, and only if, the two fuzzy sets are completely equal. Furthermore, the decision principles of cross-entropy and of distance applied in decision-making are also the same: in the process of decision-making, among all the choices, we finally choose the one with the smallest cross-entropy (distance) from the ideal one. Based on the above analysis, we set out to study the relationship between cross-entropy and distance measures; to the best of our knowledge, this relationship has not been studied before. We mainly prove that the fuzzy discrimination, the improved fuzzy cross-entropy, and the neutrosophic cross-entropy based on discrimination are, in fact, distance measures, and we present full proofs that each of them satisfies all the conditions of a distance measure. In Section 2, we introduce some relevant knowledge and prove that the fuzzy discrimination measure satisfies all the conditions of a distance measure, i.e., that it is actually a kind of distance measure. In Section 3, we prove that the fuzzy cross-entropy satisfies all the conditions of a distance measure, and in Section 4, we show that cross-entropy on single-value neutrosophic sets is also a kind of distance; that is to say, the cross-entropy measure is consistent with distance measures.
2. Fuzzy Discrimination Is Consistent with Distance Measure
Let X be a universe of discourse. A fuzzy set A is denoted by a membership function $\mu_A: X \to [0,1]$, which expresses the degree of belongingness of a point to the set A, and for all $x \in X$, $0 \le \mu_A(x) \le 1$. When $\mu_A(x) = 0$ or $\mu_A(x) = 1$ for all $x \in X$, A becomes a crisp set.
Definition 1 ([1]). Let X be a universe of discourse. A fuzzy set A defined on X is given as:
$$A = \{\langle x, \mu_A(x)\rangle \mid x \in X\},$$
where $\mu_A: X \to [0,1]$, and every point $x \in X$ has a membership value $\mu_A(x)$ expressing its degree of belongingness to the set. Let $FS(X)$ be the set of all fuzzy sets on X. The following are some properties of fuzzy sets: for all $A, B, C \in FS(X)$,
- (1)
if $\mu_A(x) \le \mu_B(x)$ for all $x \in X$, then $A \subseteq B$;
- (2)
if $\mu_A(x) = \mu_B(x)$ for all $x \in X$, then $A = B$;
- (3)
if $A \subseteq B$ and $B \subseteq C$, then $A \subseteq C$.
Definition 2. Bhandari [21] proposed the fuzzy discrimination measure $I(A,B)$, which expresses the degree of discrimination in favor of A against B. For $A, B \in FS(X)$, it is defined as:
$$I(A,B) = \sum_{i=1}^{n}\left[\mu_A(x_i)\ln\frac{\mu_A(x_i)}{\mu_B(x_i)} + (1-\mu_A(x_i))\ln\frac{1-\mu_A(x_i)}{1-\mu_B(x_i)}\right].$$
It is obvious that $I(A,B) \ge 0$, and $I(A,B) = 0$ if, and only if $\mu_A(x_i) = \mu_B(x_i)$ for all $x_i \in X$. Bhandari [21] also defined $E(A,B) = I(A,B) + I(B,A)$ for symmetry.
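For concreteness, the following is a minimal Python sketch of the discrimination measure and its symmetric form (the function names are ours, and we assume all membership values lie strictly inside $(0,1)$ so that every logarithm is finite):

```python
import math

def discrimination(mu_a, mu_b):
    """Bhandari's fuzzy discrimination I(A, B), with fuzzy sets given as
    lists of membership values over the same finite universe.
    Assumes all values lie strictly inside (0, 1) so the logs are finite."""
    return sum(a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))
               for a, b in zip(mu_a, mu_b))

def symmetric_discrimination(mu_a, mu_b):
    """Symmetric form E(A, B) = I(A, B) + I(B, A)."""
    return discrimination(mu_a, mu_b) + discrimination(mu_b, mu_a)
```

For identical sets the measure vanishes; for example, `symmetric_discrimination([0.3, 0.6], [0.3, 0.6])` returns 0, matching $E(A,B) = 0$ if, and only if, $A = B$.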
From the above, the fuzzy discrimination has properties similar to those of a distance measure (except for one axiom of the distance measure), and the two share a corresponding decision principle, namely the principle of minimum cross-entropy. In other words, a perfect solution A generally does not exist in a real situation, so we look for a solution that does exist in the real world, denoted by $A_i$, and compute its cross-entropy to A. We finally choose the smallest cross-entropy, and the solution corresponding to the smallest cross-entropy is the optimal solution. The distance measure follows the same principle as the cross-entropy.
Definition 3. A function $d: FS(X) \times FS(X) \to [0,1]$ is called a distance measure on $FS(X)$ if the following conditions [10] are satisfied: for all $A, B, C \in FS(X)$,
- (1)
$0 \le d(A,B) \le 1$;
- (2)
$d(A,B) = 0$, if and only if $A = B$;
- (3)
$d(A,B) = d(B,A)$;
- (4)
if $A \subseteq B \subseteq C$, then $d(A,B) \le d(A,C)$ and $d(B,C) \le d(A,C)$.
Considering that the discrimination measure may be infinite, we redefine the fuzzy distance measure as follows:
Definition 4. A function $d: FS(X) \times FS(X) \to [0, +\infty]$ is called a fuzzy distance measure if the following conditions are satisfied: for all $A, B, C \in FS(X)$,
- (1)
$d(A,B) \ge 0$;
- (2)
$d(A,B) = 0$, if and only if $A = B$;
- (3)
$d(A,B) = d(B,A)$;
- (4)
if $A \subseteq B \subseteq C$, then $d(A,B) \le d(A,C)$ and $d(B,C) \le d(A,C)$.
It is obvious that the symmetric fuzzy discrimination satisfies the first three conditions of the fuzzy distance measure:
- (1)
$E(A,B) \ge 0$;
- (2)
$E(A,B) = 0$, if and only if $A = B$;
- (3)
$E(A,B) = E(B,A)$.
Thus, we only need to verify that the fuzzy discrimination satisfies condition (4) of the fuzzy distance measure.
Theorem 1. Let $x, y, z$ be three numbers in $[0,1]$. If $x \le y \le z$, then $E(x,y) \le E(x,z)$ and $E(y,z) \le E(x,z)$.
Proof. We assume $x, y, z \in (0,1)$; if any of them equals 0 or 1, the discriminations involved may equal $+\infty$, and the inequalities hold trivially in $[0, +\infty]$. For $u, v \in (0,1)$ we have
$$E(u,v) = u\ln\frac{u}{v} + (1-u)\ln\frac{1-u}{1-v} + v\ln\frac{v}{u} + (1-v)\ln\frac{1-v}{1-u} = (u-v)\left[\ln\frac{u}{v} - \ln\frac{1-u}{1-v}\right].$$
Firstly, we need to prove that $E(x,y) \le E(x,z)$.
Let
$$f(t) = E(x,t) = (x-t)\left[\ln\frac{x}{t} - \ln\frac{1-x}{1-t}\right], \qquad t \in [x,1).$$
Then,
$$f'(t) = \ln\frac{t}{x} + \ln\frac{1-x}{1-t} + (t-x)\left(\frac{1}{t} + \frac{1}{1-t}\right).$$
It is obvious that $f(x) = 0$. Since $x \le t$, we have $\ln\frac{t}{x} \ge 0$, $\ln\frac{1-x}{1-t} \ge 0$, and $(t-x)\left(\frac{1}{t}+\frac{1}{1-t}\right) \ge 0$. That is, $f'(t) \ge 0$, so $f$ is increasing on $[x,1)$.
When $t = y$, $f(y) = E(x,y)$; when $t = z$, $f(z) = E(x,z)$. Since $y \le z$, then $f(y) \le f(z)$.
That is to say, the desired inequality $E(x,y) \le E(x,z)$ has been obtained.
From here, we continue to prove that $E(y,z) \le E(x,z)$.
Let
$$g(t) = E(t,z) = (t-z)\left[\ln\frac{t}{z} - \ln\frac{1-t}{1-z}\right], \qquad t \in (0,z].$$
Then,
$$g'(t) = \ln\frac{t}{z} + \ln\frac{1-z}{1-t} + (t-z)\left(\frac{1}{t} + \frac{1}{1-t}\right).$$
It is obvious that $g(z) = 0$. Since $t \le z$, we have $\ln\frac{t}{z} \le 0$, $\ln\frac{1-z}{1-t} \le 0$, and $(t-z)\left(\frac{1}{t}+\frac{1}{1-t}\right) \le 0$. That is, $g'(t) \le 0$, so $g$ is decreasing on $(0,z]$.
When $t = x$, $g(x) = E(x,z)$; when $t = y$, $g(y) = E(y,z)$. Since $x \le y$, then $g(y) \le g(x)$.
That is to say, the desired inequality $E(y,z) \le E(x,z)$ has been obtained. □
This completes the proof of the theorem.
Theorem 2. Let X be a universe of discourse, $A, B, C \in FS(X)$. If $A \subseteq B \subseteq C$, then $E(A,B) \le E(A,C)$ and $E(B,C) \le E(A,C)$.
Proof. We need to prove that $E(A,B) \le E(A,C)$ and $E(B,C) \le E(A,C)$. Since $A \subseteq B \subseteq C$, then $\mu_A(x_i) \le \mu_B(x_i) \le \mu_C(x_i)$ for every $x_i \in X$, and the symmetric discrimination is a sum of single-point terms.
From Theorem 1, the required inequalities hold for every single membership value, so the proof follows by summing over all $x_i \in X$. □
Example 1. Let X be a universe of discourse and let $A, B, C \in FS(X)$ with $A \subseteq B \subseteq C$. Computing the symmetric fuzzy discrimination, we get $E(A,B) \le E(A,C)$ and $E(B,C) \le E(A,C)$; that is, the monotonicity required by condition (4) holds.
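As a further numerical check, the inequalities of Theorem 2 can be verified directly; the membership values below are our own assumed data:

```python
import math

def sym_disc(mu_a, mu_b):
    """Symmetric fuzzy discrimination E(A, B) = I(A, B) + I(B, A)."""
    i_ab = sum(a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))
               for a, b in zip(mu_a, mu_b))
    i_ba = sum(b * math.log(b / a) + (1 - b) * math.log((1 - b) / (1 - a))
               for a, b in zip(mu_a, mu_b))
    return i_ab + i_ba

# Assumed membership values with A ⊆ B ⊆ C (pointwise mu_A <= mu_B <= mu_C).
mu_A, mu_B, mu_C = [0.2, 0.3], [0.4, 0.5], [0.7, 0.8]

e_ab = sym_disc(mu_A, mu_B)   # ≈ 0.365
e_bc = sym_disc(mu_B, mu_C)   # ≈ 0.792
e_ac = sym_disc(mu_A, mu_C)   # ≈ 2.234
assert e_ab <= e_ac and e_bc <= e_ac  # condition (4) of Definition 4
```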
Theorem 3. The above-defined symmetric fuzzy discrimination is a distance measure.
From the above theorems, the symmetric fuzzy discrimination defined in Definition 2 satisfies all the conditions of Definition 4; that is, it is consistent with the distance measure.
3. Fuzzy Cross-Entropy Is Consistent with Distance Measure
Bhandari and Pal pointed out that the fuzzy discrimination has a defect: when $\mu_B(x_i)$ approaches 0 or 1, its value tends to infinity. Thus, it was modified on the basis of the directed divergence proposed by Lin [20], and further modified by Shang et al. [22] as follows:
Definition 5 ([22]). Let $A, B \in FS(X)$. The fuzzy cross-entropy of A from B is defined as:
$$I(A;B) = \sum_{i=1}^{n}\left[\mu_A(x_i)\ln\frac{\mu_A(x_i)}{\frac{1}{2}(\mu_A(x_i)+\mu_B(x_i))} + (1-\mu_A(x_i))\ln\frac{1-\mu_A(x_i)}{1-\frac{1}{2}(\mu_A(x_i)+\mu_B(x_i))}\right].$$
This measure is well-defined for every value of $\mu_A(x_i)$ and $\mu_B(x_i)$, and it expresses the degree of discrimination of A from B.
It also has the same properties as the above discrimination measure: $I(A;B) \ge 0$, and $I(A;B) = 0$ if, and only if $A = B$.
Let $E(A,B) = I(A;B) + I(B;A)$; then, symmetry is satisfied. Thus, we mainly consider whether the above-defined symmetric fuzzy cross-entropy $E(A,B)$ satisfies condition (4) of the distance measure.
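The following is a minimal Python sketch of this improved measure (the function names are ours); the guards implement the convention $0\ln 0 = 0$, which keeps the value finite even at the endpoint membership values 0 and 1:

```python
import math

def cross_entropy(mu_a, mu_b):
    """Shang-Jiang fuzzy cross-entropy I(A; B): discrimination of A from B,
    measured against the average membership (mu_a + mu_b) / 2."""
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        m = (a + b) / 2  # average membership value
        if a > 0:
            total += a * math.log(a / m)
        if a < 1:
            total += (1 - a) * math.log((1 - a) / (1 - m))
    return total

def symmetric_cross_entropy(mu_a, mu_b):
    """Symmetric form E(A, B) = I(A; B) + I(B; A) used in the text."""
    return cross_entropy(mu_a, mu_b) + cross_entropy(mu_b, mu_a)
```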
Theorem 4. Let $x, y, z$ be three numbers in $[0,1]$. If $x \le y \le z$, then $E(x,y) \le E(x,z)$ and $E(y,z) \le E(x,z)$.
Proof. We first prove that $E(x,y) \le E(x,z)$. For $u, v \in [0,1]$ we have
$$E(u,v) = u\ln\frac{2u}{u+v} + v\ln\frac{2v}{u+v} + (1-u)\ln\frac{2(1-u)}{2-u-v} + (1-v)\ln\frac{2(1-v)}{2-u-v},$$
with the convention $0\ln 0 = 0$, so that $E$ is finite for all values.
Let
$$f(t) = E(x,t), \qquad t \in [x,1].$$
Then,
$$f'(t) = \ln\frac{2t}{x+t} - \ln\frac{2(1-t)}{2-x-t}.$$
It is clear that $f(x) = 0$. Since $t \ge x$, then $2t \ge x+t$ and $2(1-t) \le 2-x-t$. That is, $\ln\frac{2t}{x+t} \ge 0$ and $\ln\frac{2(1-t)}{2-x-t} \le 0$, so $f'(t) \ge 0$ and $f$ is increasing on $[x,1]$. When $t = y$, $f(y) = E(x,y)$, and when $t = z$, $f(z) = E(x,z)$; since $y \le z$, $E(x,y) \le E(x,z)$ has been obtained.
From here, we continue to prove that $E(y,z) \le E(x,z)$.
Let
$$g(t) = E(t,z), \qquad t \in [0,z].$$
Then, by the symmetry of $E$,
$$g'(t) = \ln\frac{2t}{t+z} - \ln\frac{2(1-t)}{2-t-z}.$$
It is obvious that $g(z) = 0$. Since $t \le z$, then $2t \le t+z$ and $2(1-t) \ge 2-t-z$. That is, $\ln\frac{2t}{t+z} \le 0$ and $\ln\frac{2(1-t)}{2-t-z} \ge 0$, so $g'(t) \le 0$ and $g$ is decreasing on $[0,z]$. When $t = x$, $g(x) = E(x,z)$, and when $t = y$, $g(y) = E(y,z)$; since $x \le y$, $E(y,z) \le E(x,z)$ has been obtained. □
This completes the proof of the theorem.
Theorem 5. Let X be a universe of discourse, $A, B, C \in FS(X)$. If $A \subseteq B \subseteq C$, then $E(A,B) \le E(A,C)$ and $E(B,C) \le E(A,C)$.
Since $A \subseteq B \subseteq C$ gives $\mu_A(x_i) \le \mu_B(x_i) \le \mu_C(x_i)$ for every $x_i \in X$, we can easily obtain the proof from Theorem 4 by summing over all $x_i \in X$.
Example 2. Let X be a universe of discourse and let $A, B, C \in FS(X)$ with $A \subseteq B \subseteq C$. Computing the symmetric fuzzy cross-entropy, we get $E(A,B) \le E(A,C)$ and $E(B,C) \le E(A,C)$; that is, condition (4) holds.
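As before, the inequalities of Theorem 5 can be checked numerically; the membership values below are our own assumed data:

```python
import math

def fuzzy_ce(mu_a, mu_b):
    """One-directional fuzzy cross-entropy I(A; B) of Definition 5."""
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        m = (a + b) / 2  # average membership value
        if a > 0:
            total += a * math.log(a / m)
        if a < 1:
            total += (1 - a) * math.log((1 - a) / (1 - m))
    return total

def sym_ce(mu_a, mu_b):
    """Symmetric fuzzy cross-entropy E(A, B) = I(A; B) + I(B; A)."""
    return fuzzy_ce(mu_a, mu_b) + fuzzy_ce(mu_b, mu_a)

# Assumed membership values with A ⊆ B ⊆ C.
mu_A, mu_B, mu_C = [0.1, 0.4], [0.3, 0.6], [0.8, 0.9]

assert sym_ce(mu_A, mu_B) <= sym_ce(mu_A, mu_C)
assert sym_ce(mu_B, mu_C) <= sym_ce(mu_A, mu_C)  # condition (4) holds
```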
Theorem 6. The above-defined symmetric fuzzy cross-entropy is a kind of distance measure.
4. Neutrosophic Cross-Entropy Is a Distance Measure
Smarandache [3,27] first proposed the definition of a neutrosophic set, which is an extension of the intuitionistic fuzzy set (IFS) and the interval-valued intuitionistic fuzzy set, as follows:
Definition 6 ([3]). Let X be a universe of discourse. A neutrosophic set A in X is characterized by a truth-membership function $T_A(x)$, an indeterminacy-membership function $I_A(x)$, and a falsity-membership function $F_A(x)$, in which $T_A(x), I_A(x), F_A(x) \subseteq\, ]{}^-0, 1^+[$. There is no restriction on the sum of $T_A(x)$, $I_A(x)$, and $F_A(x)$, so ${}^-0 \le \sup T_A(x) + \sup I_A(x) + \sup F_A(x) \le 3^+$.
Wang et al. [5] introduced the definition of the single-value neutrosophic set (SVNS) for better application in the engineering field. The SVNS is an extension of the IFS, and it provides another way to express and process uncertain, incomplete, and inconsistent information in the real world.
Definition 7 ([5]). Let X be a space of points. A single-value neutrosophic set A in X is characterized by a truth-membership function $T_A(x)$, an indeterminacy-membership function $I_A(x)$, and a falsity-membership function $F_A(x)$. For each point x in X, $T_A(x), I_A(x), F_A(x) \in [0,1]$. Therefore, an SVNS A can be denoted by:
$$A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle \mid x \in X\}.$$
There is no restriction on the sum of $T_A(x)$, $I_A(x)$, and $F_A(x)$, so $0 \le T_A(x) + I_A(x) + F_A(x) \le 3$.
The following are some properties of SVNSs M and N.
Let X be a universe of discourse, $SVNS(X)$ be the set of all single-value neutrosophic sets on X, and $M, N, P \in SVNS(X)$:
- (1)
$M \subseteq N$ if, and only if $T_M(x) \le T_N(x)$, $I_M(x) \ge I_N(x)$, and $F_M(x) \ge F_N(x)$ for every x in X [5];
- (2)
$M = N$ if, and only if $M \subseteq N$ and $N \subseteq M$ [5];
- (3)
if $M \subseteq N$ and $N \subseteq P$, then $M \subseteq P$.
Then, Ye [25] first generalized the fuzzy cross-entropy measure to SVNSs. The information measures of neutrosophic sets are composed of the information measures of the truth-membership, indeterminacy-membership, and falsity-membership functions. Let $M, N \in SVNS(X)$. Ye introduced the discrimination information of $T_M(x_i)$ from $T_N(x_i)$ $(i = 1, 2, \ldots, n)$, on the basis of the definition of fuzzy cross-entropy, as follows:
$$E_T(M;N) = \sum_{i=1}^{n}\left[T_M(x_i)\ln\frac{T_M(x_i)}{\frac{1}{2}(T_M(x_i)+T_N(x_i))} + (1-T_M(x_i))\ln\frac{1-T_M(x_i)}{1-\frac{1}{2}(T_M(x_i)+T_N(x_i))}\right].$$
We can define the following discrimination information in terms of the indeterminacy-membership function and the falsity-membership function in the same way:
$$E_I(M;N) = \sum_{i=1}^{n}\left[I_M(x_i)\ln\frac{I_M(x_i)}{\frac{1}{2}(I_M(x_i)+I_N(x_i))} + (1-I_M(x_i))\ln\frac{1-I_M(x_i)}{1-\frac{1}{2}(I_M(x_i)+I_N(x_i))}\right],$$
$$E_F(M;N) = \sum_{i=1}^{n}\left[F_M(x_i)\ln\frac{F_M(x_i)}{\frac{1}{2}(F_M(x_i)+F_N(x_i))} + (1-F_M(x_i))\ln\frac{1-F_M(x_i)}{1-\frac{1}{2}(F_M(x_i)+F_N(x_i))}\right].$$
Definition 8 ([25]). The single-value neutrosophic cross-entropy of M and N, where $M, N \in SVNS(X)$, can be defined as follows:
$$E(M,N) = E_T(M;N) + E_I(M;N) + E_F(M;N).$$
It can also be used to express the degree of difference of M from N. According to Shannon's inequality, it is clear that $E(M,N) \ge 0$, and $E(M,N) = 0$ if, and only if $M = N$; that is, $T_M(x_i) = T_N(x_i)$, $I_M(x_i) = I_N(x_i)$, and $F_M(x_i) = F_N(x_i)$ for any $x_i \in X$. Then, the neutrosophic cross-entropy can be modified as $D(M,N) = E(M,N) + E(N,M)$ for symmetry.
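A minimal Python sketch of the neutrosophic cross-entropy and its symmetric form follows; the function names and the data layout (an SVNS as a tuple of T, I, and F value lists) are our own choices:

```python
import math

def fuzzy_ce(p, q):
    """One-directional fuzzy cross-entropy of the value list p from q."""
    total = 0.0
    for a, b in zip(p, q):
        m = (a + b) / 2
        if a > 0:
            total += a * math.log(a / m)
        if a < 1:
            total += (1 - a) * math.log((1 - a) / (1 - m))
    return total

def svns_ce(m, n):
    """E(M, N) = E_T(M; N) + E_I(M; N) + E_F(M; N) for SVNSs given as
    (T, I, F) tuples of value lists over the same universe."""
    return sum(fuzzy_ce(x, y) for x, y in zip(m, n))

def svns_distance(m, n):
    """Symmetric neutrosophic cross-entropy D(M, N) = E(M, N) + E(N, M)."""
    return svns_ce(m, n) + svns_ce(n, m)
```

Since each component is the fuzzy cross-entropy of Definition 5, non-negativity and the vanishing condition carry over componentwise.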
Theorem 7. Let X be a space of the universe of discourse, $M, N, P \in SVNS(X)$. If $M \subseteq N \subseteq P$, then $D(M,N) \le D(M,P)$ and $D(N,P) \le D(M,P)$.
Proof. Since $M \subseteq N \subseteq P$, we have $T_M(x_i) \le T_N(x_i) \le T_P(x_i)$, so according to the proof of Theorem 4, $E_T(M;N) + E_T(N;M) \le E_T(M;P) + E_T(P;M)$ and $E_T(N;P) + E_T(P;N) \le E_T(M;P) + E_T(P;M)$. In a similar way, $I_M(x_i) \ge I_N(x_i) \ge I_P(x_i)$ and $F_M(x_i) \ge F_N(x_i) \ge F_P(x_i)$, and since the symmetric measure is unchanged when the order of the chain is reversed, the corresponding inequalities hold for the indeterminacy and falsity parts as well. Adding the three parts gives $D(M,N) \le D(M,P)$ and $D(N,P) \le D(M,P)$. Thus, the proof is easily obtained. □
Example 3. Let X be a space of the universe of discourse and let $M, N, P \in SVNS(X)$ with $M \subseteq N \subseteq P$.
It is clear that, computing the symmetric neutrosophic cross-entropy componentwise, we obtain the truth-, indeterminacy-, and falsity-part inequalities in turn, and hence $D(M,N) \le D(M,P)$ and $D(N,P) \le D(M,P)$.
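The same kind of numerical check applies here; the single-point SVNSs below are our own assumed values satisfying $M \subseteq N \subseteq P$ (truth values rise while indeterminacy and falsity values fall):

```python
import math

def fuzzy_ce(p, q):
    """One-directional fuzzy cross-entropy of the value list p from q."""
    total = 0.0
    for a, b in zip(p, q):
        m = (a + b) / 2
        if a > 0:
            total += a * math.log(a / m)
        if a < 1:
            total += (1 - a) * math.log((1 - a) / (1 - m))
    return total

def D(m, n):
    """Symmetric neutrosophic cross-entropy over (T, I, F) value lists."""
    return sum(fuzzy_ce(x, y) + fuzzy_ce(y, x) for x, y in zip(m, n))

# Assumed single-point SVNSs with M ⊆ N ⊆ P.
M = ([0.2], [0.7], [0.6])
N = ([0.5], [0.4], [0.3])
P = ([0.9], [0.1], [0.1])

assert D(M, N) <= D(M, P) and D(N, P) <= D(M, P)  # axiom (4) holds
```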
Theorem 8. The above-defined symmetric neutrosophic cross-entropy is a distance measure.