Article

Two Hesitant Multiplicative Decision-Making Algorithms and Their Application to Fog-Haze Factor Assessment Problem

1
School of Mathematics and Statistics, Hefei Normal University, Hefei 230601, Anhui, China
2
School of Mathematical Sciences, Anhui University, Hefei 230601, Anhui, China
3
School of Management, Hefei University of Technology, Hefei 230009, Anhui, China
4
Department of Mechanical Engineering, Texas A&M University, College Station, TX 77843, USA
*
Author to whom correspondence should be addressed.
Algorithms 2018, 11(10), 154; https://doi.org/10.3390/a11100154
Submission received: 29 August 2018 / Revised: 28 September 2018 / Accepted: 2 October 2018 / Published: 10 October 2018
(This article belongs to the Special Issue Algorithms for Decision Making)

Abstract

The hesitant multiplicative preference relation (HMPR) is a useful tool for coping with problems in which experts use Saaty's 1–9 scale to express their preference information in pairwise comparisons of alternatives. It is well known that a lack of acceptable consistency easily leads to unreasonable conclusions; therefore, consistency improvement and the derivation of a reliable priority weight vector for alternatives are two significant and challenging issues for decision-making problems with hesitant multiplicative information. In this paper, some new concepts are first introduced, including the HMPR, the consistent HMPR and the consistency index of an HMPR. Then, based on a logarithmic least squares model and a linear optimization model, two novel automatic iterative algorithms are proposed to enhance the consistency of an HMPR and generate its priority weights, and both algorithms are proved to be convergent. Finally, the proposed algorithms are applied to the assessment of the factors that affect fog-haze weather. The comparative analysis shows that the decision-making process with our algorithms is more straightforward and efficient.

1. Introduction

In a group decision making (GDM) situation, the decision makers (DMs) are usually required to select the desirable alternative(s) from a collection of alternatives. To cope with this problem, DMs would compare alternatives with each other and provide the preference information, and a judgement matrix can be constructed [1,2,3].
In order to model DMs’ knowledge and preferences, preference relations have been introduced. To characterize fuzziness and uncertainty, several extended preference relations have been proposed, including the fuzzy preference relation (FPR) [4,5,6,7,8], the multiplicative preference relation (MPR) [9,10,11,12] and the linguistic preference relation (LPR) [13,14]. In an FPR, experts describe their preference information with crisp numbers on a 0–1 scale, while in an MPR they use a 1–9 scale [15]. Note that the elements of MPRs are crisp values. However, considering the fuzziness and hesitation involved in practical decision-making problems, it may be difficult for DMs to express their evaluations with crisp values. To describe this imprecision, the interval multiplicative preference relation [16] and the intuitionistic multiplicative preference relation (IMPR) [17] were introduced to express decision-making preference information. Xia et al. [18] first defined the IMPR and developed several aggregation techniques for IMPR information. Xia and Xu [18] introduced the concepts of the hesitant fuzzy preference relation (HFPR) and the hesitant multiplicative preference relation (HMPR), studied their properties, and constructed several group decision making (GDM) methods based on them.
For the various forms of preference relations, the two most important issues are consistency analysis and consistency improvement [19]. Ma et al. [20] developed an approach to check the inconsistency and weak transitivity of an FPR and to repair its inconsistency so as to reach weak transitivity. Herrera-Viedma et al. [21] designed a method to construct consistent FPRs from a set of original preference data. For a given MPR, Xu and Wei [16] proposed a convergent model to improve its consistency. With the help of the Abelian linearly ordered group, Xia and Chen [22] established general methods for improving consistency and reaching consensus for different types of preference relations. By using order consistency and multiplicative consistency, Jin et al. [23] proposed two new approaches for GDM with IFPRs to produce normalized intuitionistic fuzzy weights for alternatives. Wang [24] proposed some linear programming models for deriving intuitionistic fuzzy weights. For unbalanced LPRs, Dong et al. [25] investigated an optimization model to increase the consistency level. Pei et al. [26] developed an iterative algorithm to adjust the additive consistency of IFLPRs and derive the intuitionistic fuzzy weights for IFLPRs. Based on β-normalization, Zhu et al. [27] utilized an optimized parameter to develop a novel approach for inconsistent HFPRs. Under the hesitant fuzzy preference information environment, Zhang et al. [28] constructed a decision support model to derive the most desirable alternative.
As with MPRs, studying the HMPR is an important research topic. However, few techniques for it have been developed in the existing literature. Xia and Xu [18] directly used their proposed operators to aggregate HMPR information. However, it is generally known that preference relations of unacceptable consistency easily lead to unreasonable conclusions; therefore, the decision-making results obtained by the method of Xia and Xu [18] may be unreasonable. Based on the β-normalization principle, Zhang and Wu [28] investigated a new decision-making model to generate the interval weights of alternatives from HMPRs. However, with the algorithm of Zhang and Wu [28], one must convert the normalized HMPR into several MPRs, so the decision-making process is an indirect computation process. Therefore, efficiently deriving the priority weight vector of an HMPR and improving the consistency of an HMPR are two of the most important issues. This paper first introduces a new version of the HMPR, and then the consistency of an HMPR and the consistency index of an HMPR are presented. After that, two new algorithms are investigated to improve the consistency of HMPRs.
The remainder of this paper is organized as follows: Section 2 reviews some basic concepts. In Section 3, the definitions of the HMPR, the consistency of an HMPR and the consistency index of an HMPR are presented. Two algorithms to improve the consistency level of HMPRs are investigated in Section 4. Section 5 provides an illustrative example to show the effectiveness and rationality of the proposed methods. Concluding remarks are presented in Section 6.

2. Preliminaries

In this section, we review related work on the MPR and the hesitant multiplicative set (HMS). Saaty [15] first introduced the concept of the MPR, which is a useful tool for expressing evaluation information. For convenience, let $X = \{x_1, x_2, \ldots, x_n\}$ be a finite set of alternatives and $N = \{1, 2, \ldots, n\}$.
Definition 1.
[15] An MPR $A$ on $X$ is represented by a matrix $A = (a_{ij})_{n \times n} \subset X \times X$ with $a_{ii} = 1$, $a_{ij} a_{ji} = 1$, $i, j \in N$, where $a_{ij}$ denotes the degree to which alternative $x_i$ is preferred to $x_j$.
In particular, on Saaty's 1–9 scale [29], $a_{ij} \in \{1/9, 1/8, \ldots, 1/2, 1, 2, \ldots, 8, 9\}$: $a_{ij} = 1$ denotes that there is no difference between $x_i$ and $x_j$, $a_{ij} = 9$ denotes that $x_i$ is absolutely preferred to $x_j$, and $a_{ij} \in \{2, 3, \ldots, 8\}$ (or $a_{ij} \in \{1/2, 1/3, \ldots, 1/8\}$) indicates intermediate evaluations [30].
Definition 2.
Let $A = (a_{ij})_{n \times n}$ be an MPR; then $A$ is consistent if it satisfies the transitivity condition $a_{ij} = a_{ik} a_{kj}$, $i, j, k \in N$.
For an MPR $A = (a_{ij})_{n \times n}$, if there exists a crisp priority weight vector $w = (w_1, w_2, \ldots, w_n)^T$ such that $a_{ij} = w_i / w_j$, $i, j \in N$, then $A$ is consistent [15], where $w_i > 0$, $i \in N$, and $\sum_{i=1}^{n} w_i = 1$.
Because of the complexity and uncertainty involved in practical GDM problems, it may be difficult for DMs to express their preference information with a single crisp number; instead, their assessments may be represented by several possible crisp values.
Definition 3.
An HMS on $X$ is defined as $H = \{\langle x_i, b_H(x_i) \rangle \mid i \in N\}$, where $b_H(x_i) = \{\gamma_k \in [1/9, 9] \mid k = 1, 2, \ldots, |b_H(x_i)|\}$ is a hesitant multiplicative element (HME), which denotes all of the possible membership degrees of the element $x_i \in X$ to the set $H$, and $|b_H(x_i)|$ is the cardinality of $b_H(x_i)$.
Definition 4.
[28] Let $b = \{\gamma_k \mid k = 1, 2, \ldots, |b|\}$ be an HME and $|b|$ be the cardinality of $b$. The score function of $b$ is defined by $f(b) = \left( \prod_{k=1}^{|b|} \gamma_k \right)^{1/|b|}$. Suppose that $b_1$ and $b_2$ are two HMEs; if $f(b_1) \geq f(b_2)$, then $b_1 \geq b_2$.
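For concreteness, the score function of an HME can be sketched in a few lines of Python (the function name `score` is ours, not from the paper):

```python
from math import prod

def score(hme):
    """Score function of an HME b = {g_1, ..., g_|b|}: the geometric mean of its values."""
    return prod(hme) ** (1.0 / len(hme))

b1, b2 = [2.0, 3.0], [1.0, 2.0]
# f(b1) = sqrt(6) and f(b2) = sqrt(2), so b1 is ranked above b2
assert score(b1) > score(b2)
```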

3. Hesitant Multiplicative Preference Relations and Consistency Index

In what follows, inspired by the MPR and the score function of HMEs, we define a new version of the HMPR, and then the consistency of an HMPR and the consistency index of an HMPR are presented.
Definition 5.
An HMPR $P$ on $X$ can be defined as a reciprocal matrix $P = (p_{ij})_{n \times n} \subset X \times X$, where $p_{ij}$ is an HME indicating the possible preference degrees of alternative $x_i$ over $x_j$, and it satisfies
$$f(p_{ij}) f(p_{ji}) = 1, \quad p_{ii} = \{1\}, \quad |p_{ij}| = |p_{ji}|, \quad i, j \in N \qquad (1)$$
where $f(p_{ij})$ and $|p_{ij}|$ are the score function of $p_{ij}$ and the number of values in $p_{ij}$, respectively.
For an HMPR $P = (p_{ij})_{n \times n}$, since $f(p_{ij}) = \left( \prod_{k=1}^{|p_{ij}|} \gamma_{ij,k} \right)^{1/|p_{ij}|} \geq \left( (1/9)^{|p_{ij}|} \right)^{1/|p_{ij}|} = 1/9$ and $f(p_{ij}) = \left( \prod_{k=1}^{|p_{ij}|} \gamma_{ij,k} \right)^{1/|p_{ij}|} \leq \left( 9^{|p_{ij}|} \right)^{1/|p_{ij}|} = 9$, we have $f(p_{ij}) \in [1/9, 9]$, $i, j \in N$. On the other hand, from Definition 5, one obtains $f(p_{ij}) f(p_{ji}) = 1$, $i, j \in N$. Therefore, by using the score function, we can transform the HMPR $P$ into an MPR $F = (f_{ij})_{n \times n}$, where $f_{ij} = f(p_{ij})$, $i, j \in N$.
Therefore, the following notion of consistency for HMPRs is introduced.
Definition 6.
Assume that $P = (p_{ij})_{n \times n}$ is an HMPR, where $p_{ij}$ is an HME. Then $P = (p_{ij})_{n \times n}$ is called a consistent HMPR if there exists a normalized crisp weight vector $w = (w_1, w_2, \ldots, w_n)^T$ such that
$$f(p_{ij}) = w_i / w_j, \quad i, j \in N \qquad (2)$$
where $w_i > 0$, $i \in N$, and $\sum_{i=1}^{n} w_i = 1$.
From Equation (2), we have $\ln f(p_{ij}) = \ln w_i - \ln w_j$, $i, j \in N$. However, it is difficult for the HMPR provided by DMs to satisfy consistency, in which case Equation (2) cannot hold; that is, there exist $i, j \in N$ such that $\ln f(p_{ij}) \neq \ln w_i - \ln w_j$. We can then use $\left( \ln f(p_{ij}) - (\ln w_i - \ln w_j) \right)^2 = \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2$ to measure the deviation between $\ln f(p_{ij})$ and $\ln w_i - \ln w_j$. Therefore, the values of $\left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2$ $(i, j \in N)$ can be used to measure the consistency level of the HMPR $P = (p_{ij})_{n \times n}$.
Definition 7.
Assume that $P = (p_{ij})_{n \times n}$ is an HMPR and $w = (w_1, w_2, \ldots, w_n)^T$ is the priority weight vector derived from $P$ satisfying $w_i > 0$, $i \in N$, $\sum_{i=1}^{n} w_i = 1$. Then the consistency index of $P$ is defined as
$$CI(P) = \frac{2}{n(n-1)} \sum_{i < j} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2 \qquad (3)$$
The smaller the value of $CI(P)$, the better the consistency of the HMPR $P$. If $CI(P) = 0$, then $P$ is consistent. If we set a threshold $\delta_0$ and $CI(P) \leq \delta_0$, then $P$ is said to be of acceptable consistency.
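As a sketch of how Equation (3) is evaluated in practice (the helper names are ours, not from the paper), the consistency index of an HMPR can be computed as:

```python
from math import log, prod

def score(hme):
    # geometric mean of the values in an HME
    return prod(hme) ** (1.0 / len(hme))

def consistency_index(P, w):
    """CI(P) = 2/(n(n-1)) * sum_{i<j} (ln f(p_ij) - ln w_i + ln w_j)^2."""
    n = len(w)
    dev = sum((log(score(P[i][j])) - log(w[i]) + log(w[j])) ** 2
              for i in range(n) for j in range(i + 1, n))
    return 2.0 * dev / (n * (n - 1))

# A consistent HMPR built from w = (0.5, 0.3, 0.2): each p_ij = {w_i / w_j}, so CI(P) = 0
w = [0.5, 0.3, 0.2]
P = [[[w[i] / w[j]] for j in range(3)] for i in range(3)]
assert consistency_index(P, w) < 1e-12
```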

4. Consistency Repaired Methods for an HMPR

Motivated by the logarithmic least squares model [31], the priority weight vector can be derived by the following optimization model:
$$(\mathrm{M\text{-}1}) \quad \min J_1 = \sum_{i,j=1}^{n} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2 \quad \text{s.t.} \ \sum_{i=1}^{n} w_i = 1, \ w_i > 0, \ i \in N.$$
In fact, since $f(p_{ij}) f(p_{ji}) = 1$, $i, j \in N$, we have
$$J_1 = \sum_{i<j} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2 + \sum_{i>j} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2 = \sum_{i<j} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2 + \sum_{i>j} \left( -\left( \ln f(p_{ji}) - \ln w_j + \ln w_i \right) \right)^2 = \sum_{i<j} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2 + \sum_{j<i} \left( \ln f(p_{ji}) - \ln w_j + \ln w_i \right)^2 = 2 \sum_{i<j} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2,$$
thus the optimization model (M-1) can be converted into the following model:
$$(\mathrm{M\text{-}2}) \quad \min J_1 = \sum_{i<j} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right)^2 \quad \text{s.t.} \ \sum_{i=1}^{n} w_i = 1, \ w_i > 0, \ i \in N.$$
According to Definition 6, if an HMPR $P = (p_{ij})_{n \times n}$ is consistent, then there exists a normalized crisp weight vector $w = (w_1, w_2, \ldots, w_n)^T$ such that $f(p_{ij}) = w_i / w_j$, $i, j \in N$, i.e., $\ln f(p_{ij}) - \ln w_i + \ln w_j = 0$, $i, j \in N$. Then we have
$$\sum_{j=1}^{n} \left( \ln f(p_{ij}) - \ln w_i + \ln w_j \right) = 0, \qquad (4)$$
thus
$$\sum_{j=1}^{n} \ln f(p_{ij}) - n \ln w_i + \sum_{j=1}^{n} \ln w_j = 0.$$
That is,
$$\ln w_i = \frac{1}{n} \sum_{j=1}^{n} \ln w_j + \frac{1}{n} \sum_{j=1}^{n} \ln f(p_{ij}), \qquad (5)$$
i.e.,
$$w_i = k \, e^{\frac{1}{n} \sum_{j=1}^{n} \ln f(p_{ij})} = k \left( \prod_{j=1}^{n} f(p_{ij}) \right)^{1/n},$$
where $k = \left( \prod_{j=1}^{n} w_j \right)^{1/n}$. Since $\sum_{i=1}^{n} w_i = 1$, we have $k = 1 \Big/ \sum_{i=1}^{n} \left( \prod_{j=1}^{n} f(p_{ij}) \right)^{1/n}$, and one gets
$$w_i = \left( \prod_{j=1}^{n} f(p_{ij}) \right)^{1/n} \Bigg/ \sum_{i=1}^{n} \left( \prod_{j=1}^{n} f(p_{ij}) \right)^{1/n} \qquad (6)$$
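Equation (6) can be transcribed directly into code (helper names are ours, not from the paper):

```python
from math import prod

def score(hme):
    # geometric mean of the values in an HME
    return prod(hme) ** (1.0 / len(hme))

def priority_weights(P):
    """Equation (6): w_i = (prod_j f(p_ij))^(1/n) / sum_i (prod_j f(p_ij))^(1/n)."""
    n = len(P)
    g = [prod(score(P[i][j]) for j in range(n)) ** (1.0 / n) for i in range(n)]
    total = sum(g)
    return [gi / total for gi in g]

# For the 2x2 HMPR with p_12 = {2}: the row geometric means are sqrt(2) and 1/sqrt(2),
# so w = (2/3, 1/3)
P = [[[1.0], [2.0]], [[0.5], [1.0]]]
w = priority_weights(P)
assert abs(w[0] - 2 / 3) < 1e-12 and abs(w[1] - 1 / 3) < 1e-12
```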
Therefore, the following Algorithm 1 is designed to adjust the consistency of the HMPR $P$:
Algorithm 1: The consistency adjusting process of an HMPR based on the logarithmic least squares model
Step 1. Let $P^{(t)} = (p_{ij}^{(t)})_{n \times n} = P = (p_{ij})_{n \times n}$ and $t = 0$, and pre-set the threshold $\delta_0$, the controlling parameter $\theta$ and the maximum number of iterations $t_{\max}$;
Step 2. Derive the priority vector $\tilde{w}^{(t)} = (\tilde{w}_1^{(t)}, \tilde{w}_2^{(t)}, \ldots, \tilde{w}_n^{(t)})^T$ by Equation (6);
Step 3. Determine the consistency index $CI(P^{(t)})$ by using Equation (3);
Step 4. If $CI(P^{(t)}) \leq \delta_0$ or $t > t_{\max}$, then go to Step 7. Otherwise, go to Step 5;
Step 5. Let $P^{(t+1)} = (p_{ij}^{(t+1)})_{n \times n}$, where
$$p_{ij}^{(t+1)} = \{ \gamma_{ij,k}^{(t+1)} \mid k = 1, 2, \ldots, |p_{ij}^{(t+1)}| \}, \quad \gamma_{ij,k}^{(t+1)} = \left( \gamma_{ij,k}^{(t)} \right)^{1-\theta} \left( \tilde{w}_i^{(t)} / \tilde{w}_j^{(t)} \right)^{\theta}$$
Step 6. Let $t = t + 1$ and return to Step 2;
Step 7. Output $P^{(t)}$, $\tilde{w}^{(t)}$, $CI(P^{(t)})$ and $t$;
Step 8. End.
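Putting the pieces together, Algorithm 1 admits a compact sketch (a minimal implementation under our own naming; the Step 7 outputs are returned as a tuple):

```python
from math import log, prod

def score(hme):
    return prod(hme) ** (1.0 / len(hme))

def weights(P):  # Equation (6)
    n = len(P)
    g = [prod(score(P[i][j]) for j in range(n)) ** (1.0 / n) for i in range(n)]
    total = sum(g)
    return [x / total for x in g]

def ci(P, w):  # Equation (3)
    n = len(w)
    return 2.0 / (n * (n - 1)) * sum(
        (log(score(P[i][j])) - log(w[i]) + log(w[j])) ** 2
        for i in range(n) for j in range(i + 1, n))

def algorithm_1(P, delta0=0.3, theta=0.1, t_max=100):
    t, w = 0, weights(P)
    while ci(P, w) > delta0 and t <= t_max:
        # Step 5: gamma_{ij,k}^(t+1) = (gamma_{ij,k}^(t))^(1-theta) * (w_i/w_j)^theta
        n = len(P)
        P = [[[g ** (1 - theta) * (w[i] / w[j]) ** theta for g in P[i][j]]
              for j in range(n)] for i in range(n)]
        w = weights(P)
        t += 1
    return P, w, ci(P, w), t
```

By Theorem 1 below, each iteration shrinks the consistency index, so the loop terminates for any $\delta_0 > 0$.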
In the following, we prove that the developed Algorithm 1 is convergent.
Theorem 1.
Let $P = (p_{ij})_{n \times n}$ be an HMPR, $\theta$ $(0 < \theta < 1)$ be the adjustment parameter, and $\{P^{(t)}\}$ be the sequence of HMPRs generated by Algorithm 1. If $CI(P^{(t)})$ is the consistency index of $P^{(t)}$, then
$$CI(P^{(t+1)}) < CI(P^{(t)}) \ \text{for each } t, \quad \text{and} \quad \lim_{t \to \infty} CI(P^{(t)}) = 0 \qquad (7)$$
Proof. 
Suppose that $\tilde{w}^{(t)} = (\tilde{w}_1^{(t)}, \tilde{w}_2^{(t)}, \ldots, \tilde{w}_n^{(t)})^T$ is the priority weight vector of $P^{(t)}$ for each $t$. From the above analysis, we know that $\tilde{w}^{(t)}$ is also the optimal weight vector of model (M-2) for $P^{(t)}$. Thus, for any feasible weight vector $w^{(t+1)}$ we have
$$\sum_{i<j} \left( \ln f(p_{ij}^{(t+1)}) - \ln \tilde{w}_i^{(t+1)} + \ln \tilde{w}_j^{(t+1)} \right)^2 \leq \sum_{i<j} \left( \ln f(p_{ij}^{(t+1)}) - \ln w_i^{(t+1)} + \ln w_j^{(t+1)} \right)^2 \qquad (8)$$
Let $w_i^{(t+1)} = \tilde{w}_i^{(t)}$, $i \in N$; then we have
$$\sum_{i<j} \left( \ln f(p_{ij}^{(t+1)}) - \ln \tilde{w}_i^{(t+1)} + \ln \tilde{w}_j^{(t+1)} \right)^2 \leq \sum_{i<j} \left( \ln f(p_{ij}^{(t+1)}) - \ln \tilde{w}_i^{(t)} + \ln \tilde{w}_j^{(t)} \right)^2 \qquad (9)$$
In addition, according to Step 5 in Algorithm 1, we have
$$\ln f(p_{ij}^{(t+1)}) = \ln \left( \prod_{k=1}^{|p_{ij}^{(t+1)}|} \gamma_{ij,k}^{(t+1)} \right)^{1/|p_{ij}^{(t+1)}|} = \ln \left( \prod_{k=1}^{|p_{ij}^{(t+1)}|} \left( \left( \gamma_{ij,k}^{(t)} \right)^{1-\theta} \left( \tilde{w}_i^{(t)} / \tilde{w}_j^{(t)} \right)^{\theta} \right) \right)^{1/|p_{ij}^{(t+1)}|} = \ln \left( \left( \left( \prod_{k=1}^{|p_{ij}^{(t)}|} \gamma_{ij,k}^{(t)} \right)^{1/|p_{ij}^{(t)}|} \right)^{1-\theta} \left( \tilde{w}_i^{(t)} / \tilde{w}_j^{(t)} \right)^{\theta} \right) = (1-\theta) \ln f(p_{ij}^{(t)}) + \theta \left( \ln \tilde{w}_i^{(t)} - \ln \tilde{w}_j^{(t)} \right).$$
Therefore,
$$CI(P^{(t+1)}) = \frac{2}{n(n-1)} \sum_{i<j} \left( \ln f(p_{ij}^{(t+1)}) - \ln \tilde{w}_i^{(t+1)} + \ln \tilde{w}_j^{(t+1)} \right)^2 \leq \frac{2}{n(n-1)} \sum_{i<j} \left( \ln f(p_{ij}^{(t+1)}) - \ln \tilde{w}_i^{(t)} + \ln \tilde{w}_j^{(t)} \right)^2 = \frac{2}{n(n-1)} \sum_{i<j} \left( (1-\theta) \ln f(p_{ij}^{(t)}) + \theta \left( \ln \tilde{w}_i^{(t)} - \ln \tilde{w}_j^{(t)} \right) - \left( \ln \tilde{w}_i^{(t)} - \ln \tilde{w}_j^{(t)} \right) \right)^2 = (1-\theta)^2 \cdot \frac{2}{n(n-1)} \sum_{i<j} \left( \ln f(p_{ij}^{(t)}) - \ln \tilde{w}_i^{(t)} + \ln \tilde{w}_j^{(t)} \right)^2 = (1-\theta)^2 CI(P^{(t)}) < CI(P^{(t)}), \qquad (10)$$
i.e., $CI(P^{(t+1)}) \leq (1-\theta)^2 CI(P^{(t)}) < CI(P^{(t)})$ for each $t$.
Furthermore, on the one hand, according to Equation (10), we get
$$\lim_{t \to +\infty} CI(P^{(t)}) \leq \lim_{t \to +\infty} (1-\theta)^2 CI(P^{(t-1)}) \leq \lim_{t \to +\infty} (1-\theta)^4 CI(P^{(t-2)}) \leq \cdots \leq \lim_{t \to +\infty} (1-\theta)^{2t} CI(P^{(0)}) = 0.$$
On the other hand, it is obvious that $CI(P^{(t)}) \geq 0$; hence $\lim_{t \to \infty} CI(P^{(t)}) = 0$. □
From Definition 6, if the HMPR $P = (p_{ij})_{n \times n}$ is consistent, then Equation (2) holds; hence it can be rewritten as $\ln f(p_{ij}) = \ln w_i - \ln w_j$, $i, j \in N$. However, in many real situations, due to fuzziness and uncertainty, the HMPR provided by DMs is usually inconsistent, and thus Equation (2) cannot hold; i.e., there exist $(i, j) \in N \times N$ such that $\ln f(p_{ij}) \neq \ln w_i - \ln w_j$. In this case, non-negative deviation variables $d_{ij}^-$ and $d_{ij}^+$ with $d_{ij}^- \cdot d_{ij}^+ = 0$, $i, j \in N$, are introduced such that
$$\ln f(p_{ij}) + d_{ij}^- - d_{ij}^+ = \ln w_i - \ln w_j, \quad i, j \in N \qquad (11)$$
The smaller the values of the deviation variables $d_{ij}^-$ and $d_{ij}^+$, the better the consistency of the HMPR. Therefore, we develop a linear optimization model to derive the smallest deviation variables and the priority weight vector as follows:
$$(\mathrm{M\text{-}3}) \quad \min J_2 = \sum_{i,j=1}^{n} \left( d_{ij}^- + d_{ij}^+ \right) \quad \text{s.t.} \ \begin{cases} \ln f(p_{ij}) + d_{ij}^- - d_{ij}^+ = \ln w_i - \ln w_j, & i, j \in N, \\ \sum_{i=1}^{n} w_i = 1, \ w_i > 0, & i \in N. \end{cases}$$
From Definition 5 and Equation (11), one can obtain
$$d_{ij}^- - d_{ij}^+ = \ln w_i - \ln w_j - \ln f(p_{ij}) = -\left( \ln w_j - \ln w_i - \ln(1/f(p_{ij})) \right) = -\left( \ln w_j - \ln w_i - \ln f(p_{ji}) \right) = -\left( d_{ji}^- - d_{ji}^+ \right) = d_{ji}^+ - d_{ji}^-,$$
i.e., $d_{ij}^- + d_{ji}^- = d_{ij}^+ + d_{ji}^+$.
As $d_{ij}^-, d_{ij}^+ \geq 0$ and $d_{ij}^- \cdot d_{ij}^+ = 0$, $i, j \in N$, it follows that $d_{ij}^- = d_{ji}^+$ and $d_{ij}^+ = d_{ji}^-$, $i, j \in N$. Therefore, we obtain the following simplified optimization model:
$$(\mathrm{M\text{-}4}) \quad \min J_2 = \sum_{i<j} \left( d_{ij}^- + d_{ij}^+ \right) \quad \text{s.t.} \ \begin{cases} \ln f(p_{ij}) + d_{ij}^- - d_{ij}^+ = \ln w_i - \ln w_j, & i < j, \\ \sum_{i=1}^{n} w_i = 1, \ w_i > 0, & i \in N. \end{cases}$$
By using MATLAB or LINGO, we obtain the priority vector $w = (w_1, w_2, \ldots, w_n)^T$ and the optimal deviation values $\tilde{d}_{ij}^-$, $\tilde{d}_{ij}^+$; then we have $\ln f(p_{ij}) + \tilde{d}_{ij}^- - \tilde{d}_{ij}^+ = \ln w_i - \ln w_j$, i.e., $f(p_{ij}) \exp(\tilde{d}_{ij}^- - \tilde{d}_{ij}^+) = w_i / w_j$.
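Model (M-4) becomes an ordinary linear program after the substitution $u_i = \ln w_i$; since only differences $u_i - u_j$ appear in the constraints, one coordinate can be fixed to remove the translation invariance and the weights normalized afterwards. A sketch using `scipy.optimize.linprog` (the variable layout and names are ours; any LP solver would do equally well):

```python
import numpy as np
from math import log, prod
from scipy.optimize import linprog

def score(hme):
    return prod(hme) ** (1.0 / len(hme))

def solve_m4(P):
    """Model (M-4): min sum_{i<j} (d-_ij + d+_ij)
    s.t. ln f(p_ij) + d-_ij - d+_ij = u_i - u_j (i < j), with u_i = ln w_i."""
    n = len(P)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    m = len(pairs)
    nv = n + 2 * m                     # variables: u_1..u_n, then all d-, then all d+
    c = np.zeros(nv)
    c[n:] = 1.0                        # objective: sum of all deviation variables
    A_eq = np.zeros((m + 1, nv))
    b_eq = np.zeros(m + 1)
    for r, (i, j) in enumerate(pairs):
        # rearranged constraint: d-_ij - d+_ij - u_i + u_j = -ln f(p_ij)
        A_eq[r, i], A_eq[r, j] = -1.0, 1.0
        A_eq[r, n + r], A_eq[r, n + m + r] = 1.0, -1.0
        b_eq[r] = -log(score(P[i][j]))
    A_eq[m, 0] = 1.0                   # gauge fixing: u_1 = 0
    bounds = [(None, None)] * n + [(0, None)] * (2 * m)
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    w = np.exp(res.x[:n])
    w /= w.sum()                       # normalize so that sum_i w_i = 1
    return w, res.x[n:n + m], res.x[n + m:nv], res.fun
```

For a consistent HMPR the optimum is $J_2 = 0$ and the recovered $w$ reproduces the underlying weight vector.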
Therefore, the following Algorithm 2 is designed to improve the consistency of the HMPR P :
Algorithm 2: The consistency adjusting process of an HMPR based on the linear optimization model
Step 1′. See Algorithm 1;
Step 2′. According to model (M-4), obtain the optimal deviation values $\tilde{d}_{ij}^{-(t)}$ and $\tilde{d}_{ij}^{+(t)}$, $i, j \in N$, and the priority weight vector $\tilde{w}^{(t)} = (\tilde{w}_1^{(t)}, \tilde{w}_2^{(t)}, \ldots, \tilde{w}_n^{(t)})^T$;
Step 3′. See Algorithm 1;
Step 4′. If $CI(P^{(t)}) \leq \delta_0$ or $t > t_{\max}$, then go to Step 7′. Otherwise, go to Step 5′;
Step 5′. Let $P^{(t+1)} = (p_{ij}^{(t+1)})_{n \times n}$, where $p_{ij}^{(t+1)} = \{ \gamma_{ij,k}^{(t+1)} \mid k = 1, 2, \ldots, |p_{ij}^{(t+1)}| \}$ and
$$\gamma_{ij,k}^{(t+1)} = \left( w_i^{(t)} / w_j^{(t)} \right)^{\theta} \left( \gamma_{ij,k}^{(t)} \right)^{1-\theta} = \left( \gamma_{ij,k}^{(t)} \exp\left( \tilde{d}_{ij}^{-(t)} - \tilde{d}_{ij}^{+(t)} \right) \right)^{\theta} \left( \gamma_{ij,k}^{(t)} \right)^{1-\theta} = \gamma_{ij,k}^{(t)} \exp\left( \theta \left( \tilde{d}_{ij}^{-(t)} - \tilde{d}_{ij}^{+(t)} \right) \right)$$
Steps 6′–8′. See Algorithm 1.
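Given the optimal deviations from (M-4), the update in Step 5′ is a simple elementwise rescaling; the sketch below (names are ours) applies it to the upper triangle and uses the identity $\tilde{d}_{ij}^- = \tilde{d}_{ji}^+$, $\tilde{d}_{ij}^+ = \tilde{d}_{ji}^-$ to keep the lower triangle reciprocal:

```python
from math import exp, prod

def score(hme):
    return prod(hme) ** (1.0 / len(hme))

def step5_prime(P, d_minus, d_plus, theta=0.1):
    """Step 5': gamma_{ij,k}^(t+1) = gamma_{ij,k}^(t) * exp(theta*(d-_ij - d+_ij)) for i < j;
    the (j, i) entry gets the reciprocal factor, since d-_ij = d+_ji and d+_ij = d-_ji."""
    n = len(P)
    Q = [[list(hme) for hme in row] for row in P]
    for i in range(n):
        for j in range(i + 1, n):
            shift = exp(theta * (d_minus[(i, j)] - d_plus[(i, j)]))
            Q[i][j] = [g * shift for g in P[i][j]]
            Q[j][i] = [g / shift for g in P[j][i]]
    return Q

# One pair with d+_12 = 0.5: the scores of p_12 and p_21 stay mutually reciprocal
P = [[[1.0], [2.0, 8.0]], [[1 / 8, 1 / 2], [1.0]]]
Q = step5_prime(P, {(0, 1): 0.0}, {(0, 1): 0.5}, theta=0.1)
assert abs(score(Q[0][1]) * score(Q[1][0]) - 1.0) < 1e-12
```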
Next, we will prove that the developed Algorithm 2 is convergent.
Theorem 2.
Let $P = (p_{ij})_{n \times n}$ be an HMPR, $\theta$ $(0 < \theta < 1)$ be the adjustment parameter, $\{P^{(t)}\}$ be the sequence of HMPRs generated by Algorithm 2, and $CI(P^{(t)})$ be the consistency index of $P^{(t)}$; then
$$CI(P^{(t+1)}) < CI(P^{(t)}) \ \text{for each } t, \quad \text{and} \quad \lim_{t \to \infty} CI(P^{(t)}) = 0$$
Proof. 
The proof of Theorem 2 is similar to Theorem 1. □

5. Illustrative Example Results and Discussion

5.1. Numerical Example

A city has been affected by fog-haze for a long time, and scientists found that there are four main influence factors $x_1, x_2, x_3, x_4$ for this city's fog-haze. In order to determine the most important influence factor and rank these factors, a group of scientists compared the four factors pairwise and provided the following preference information in the form of an HMPR $P = (p_{ij})_{4 \times 4}$ [23]:
$$P = \begin{pmatrix} \{1\} & \{1/7, 1/6, 1/5\} & \{1, 3\} & \{1/8, 2/7\} \\ \{5, 6, 7\} & \{1\} & \{1, 2\} & \{7, 9\} \\ \{1/3, 1\} & \{1/2, 1\} & \{1\} & \{1/4\} \\ \{7/2, 8\} & \{1/9, 1/7\} & \{4\} & \{1\} \end{pmatrix}.$$
Now, we apply Algorithms 1 and 2 to select the most important factor for fog-haze; the corresponding computations are summarized as Algorithms 3 and 4, respectively.
Algorithm 3: The consistency adjusting process of the HMPR based on the logarithmic least squares model
Step 1. Let $t = 0$, $P^{(t)} = P$, $\delta_0 = 0.3$ and $\theta = 0.1$;
Step 2. By Equation (6), we obtain the priority vector $\tilde{w}^{(0)} = (0.0934, 0.5512, 0.1090, 0.2464)^T$;
Step 3. By Equation (3), we determine the consistency index $CI(P^{(0)}) = 0.7573$;
Step 4. As $CI(P^{(0)}) = 0.7573 > \delta_0$, we utilize Step 5 in Algorithm 1 to repair the consistency of the HMPR $P^{(0)}$ and derive the new HMPR $P^{(1)}$:
$$P^{(1)} = \begin{pmatrix} \{1\} & \{0.1453, 0.1669, 0.1967\} & \{0.9847, 2.6467\} & \{0.1397, 0.2939\} \\ \{5.0839, 5.9916, 6.8823\} & \{1\} & \{1.1759, 2.1944\} & \{6.2452, 7.8302\} \\ \{0.3778, 1.0155\} & \{0.4557, 0.8504\} & \{1\} & \{0.2647\} \\ \{3.4025, 7.1581\} & \{0.1277, 0.1601\} & \{3.7779\} & \{1\} \end{pmatrix}$$
Step 5. By Equation (3), we determine the consistency index $CI(P^{(1)}) = 0.6133$. Because $CI(P^{(1)}) = 0.6133 > \delta_0$, by using Step 5 in Algorithm 1 we have
$$P^{(2)} = \begin{pmatrix} \{1\} & \{0.1476, 0.1672, 0.1938\} & \{0.9711, 2.3645\} & \{0.1544, 0.3015\} \\ \{5.1601, 5.9824, 6.7772\} & \{1\} & \{1.3606, 2.3855\} & \{5.6358, 6.9081\} \\ \{0.4229, 1.0297\} & \{0.4192, 0.7350\} & \{1\} & \{0.2786\} \\ \{3.3170, 6.4781\} & \{0.1448, 0.1774\} & \{3.5888\} & \{1\} \end{pmatrix}$$
Step 6. By Equation (3), we obtain the consistency index $CI(P^{(2)}) = 0.4968$. As $CI(P^{(2)}) = 0.4968 > \delta_0$, according to Step 5 in Algorithm 1 we have
$$P^{(3)} = \begin{pmatrix} \{1\} & \{0.1497, 0.1674, 0.1912\} & \{0.9591, 2.1363\} & \{0.1689, 0.3089\} \\ \{5.2296, 5.9727, 6.6820\} & \{1\} & \{1.5515, 2.5716\} & \{5.1383, 6.1714\} \\ \{0.4681, 1.0427\} & \{0.3889, 0.6445\} & \{1\} & \{0.2918\} \\ \{3.2416, 5.9202\} & \{0.1620, 0.1946\} & \{3.4272\} & \{1\} \end{pmatrix}$$
Applying Equation (6), the priority vector of the HMPR $P^{(3)}$ can be determined: $\tilde{w}^{(3)} = (0.0883, 0.4437, 0.1908, 0.2772)^T$;
Step 7. We calculate the consistency index $CI(P^{(3)}) = 0.2923$;
Step 8. Since $CI(P^{(3)}) < \delta_0$, the iteration stops, and $P^{(3)}$ is an acceptably consistent HMPR;
Step 9. Output $\tilde{w}^{(3)} = (0.0883, 0.4437, 0.1908, 0.2772)^T$. As $\tilde{w}_2^{(3)} > \tilde{w}_4^{(3)} > \tilde{w}_3^{(3)} > \tilde{w}_1^{(3)}$, we have $x_2 \succ x_4 \succ x_3 \succ x_1$. Therefore, the most important factor for fog-haze is $x_2$.
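The first step of this worked example can be checked numerically; the sketch below recomputes $\tilde{w}^{(0)}$ from Equation (6) for the HMPR $P$ above (the helper names are ours; the consistency index we obtain, about 0.755, differs slightly from the reported 0.7573, presumably because of intermediate rounding):

```python
from math import log, prod

def score(hme):
    return prod(hme) ** (1.0 / len(hme))

def weights(P):  # Equation (6)
    n = len(P)
    g = [prod(score(P[i][j]) for j in range(n)) ** (1.0 / n) for i in range(n)]
    total = sum(g)
    return [x / total for x in g]

def ci(P, w):  # Equation (3)
    n = len(w)
    return 2.0 / (n * (n - 1)) * sum(
        (log(score(P[i][j])) - log(w[i]) + log(w[j])) ** 2
        for i in range(n) for j in range(i + 1, n))

P = [
    [[1],       [1/7, 1/6, 1/5], [1, 3], [1/8, 2/7]],
    [[5, 6, 7], [1],             [1, 2], [7, 9]],
    [[1/3, 1],  [1/2, 1],        [1],    [1/4]],
    [[7/2, 8],  [1/9, 1/7],      [4],    [1]],
]
w0 = weights(P)
# w0 agrees with Step 2 up to rounding: (0.0934, 0.5512, 0.1090, 0.2464)
```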
Algorithm 4: The consistency adjusting process of the HMPR based on the linear optimization model
Step 1′. Let $t = 0$, $\tilde{P}^{(t)} = P$, $\delta_0 = 0.3$ and $\theta = 0.1$;
Step 2′. Using model (M-4), we get the optimal deviation values $\tilde{d}_{13}^{+(0)} = 0.8124$, $\tilde{d}_{23}^{-(0)} = 1.0245$, $\tilde{d}_{24}^{+(0)} = 1.4958$, $\tilde{d}_{12}^{-(0)} = \tilde{d}_{12}^{+(0)} = \tilde{d}_{13}^{-(0)} = \tilde{d}_{14}^{-(0)} = \tilde{d}_{14}^{+(0)} = \tilde{d}_{23}^{+(0)} = \tilde{d}_{24}^{-(0)} = \tilde{d}_{34}^{-(0)} = \tilde{d}_{34}^{+(0)} = 0$, and the priority weight vector $\tilde{w}^{(0)} = (0.0745, 0.4695, 0.0994, 0.3566)^T$;
Step 3′. Utilizing Equation (3), the consistency index is $CI(\tilde{P}^{(0)}) = 0.7033$;
Step 4′. As $CI(\tilde{P}^{(0)}) > \delta_0$, we apply Step 5′ in Algorithm 2 to adjust the consistency of the HMPR $\tilde{P}^{(0)}$ and obtain a new HMPR $\tilde{P}^{(1)}$:
$$\tilde{P}^{(1)} = \begin{pmatrix} \{1\} & \{0.1478, 0.1672, 0.1935\} & \{0.9696, 2.3351\} & \{0.1561, 0.3023\} \\ \{5.1658, 5.9809, 6.7659\} & \{1\} & \{1.3823, 2.4077\} & \{5.5718, 6.8125\} \\ \{0.4282, 1.0314\} & \{0.4153, 0.7234\} & \{1\} & \{0.2802\} \\ \{3.3080, 6.4061\} & \{0.1468, 0.1795\} & \{3.5689\} & \{1\} \end{pmatrix}$$
Step 5′. Using model (M-4), we get the optimal deviation values $\tilde{d}_{13}^{+(1)} = 0.3288$, $\tilde{d}_{12}^{-(1)} = 0.1329$, $\tilde{d}_{14}^{+(1)} = 0.6202$, $\tilde{d}_{23}^{+(1)} = 0.8904$, $\tilde{d}_{24}^{-(1)} = 1.3359$, $\tilde{d}_{34}^{+(1)} = 0.3337$, and $\tilde{d}_{23}^{-(1)} = \tilde{d}_{24}^{+(1)} = \tilde{d}_{12}^{+(1)} = \tilde{d}_{13}^{-(1)} = \tilde{d}_{14}^{-(1)} = \tilde{d}_{34}^{-(1)} = 0$. By using Equation (3), we have $CI(\tilde{P}^{(1)}) = 0.5016 > \delta_0$. Thus, by Step 5′ in Algorithm 2, one obtains
$$\tilde{P}^{(2)} = \begin{pmatrix} \{1\} & \{0.1503, 0.1702, 0.1966\} & \{0.9349, 2.1205\} & \{0.1701, 0.3141\} \\ \{5.0865, 5.8754, 6.6534\} & \{1\} & \{1.6102, 2.6441\} & \{5.1098, 6.1204\} \\ \{0.4716, 1.0696\} & \{0.3768, 0.6210\} & \{1\} & \{0.3112\} \\ \{3.1837, 5.8789\} & \{0.1634, 0.1957\} & \{3.2134\} & \{1\} \end{pmatrix}$$
Step 6′. Using model (M-4), we determine the priority weight vector $\tilde{w}^{(2)} = (0.1107, 0.3910, 0.2056, 0.3027)^T$;
Step 7′. By using Equation (3), we have $CI(\tilde{P}^{(2)}) = 0.2855$. As $CI(\tilde{P}^{(2)}) < \delta_0$, the iteration stops, and $\tilde{P}^{(2)}$ is an acceptably consistent HMPR;
Step 8′. Output $\tilde{w}^{(2)} = (0.1107, 0.3910, 0.2056, 0.3027)^T$;
Step 9′. As $\tilde{w}_2^{(2)} > \tilde{w}_4^{(2)} > \tilde{w}_3^{(2)} > \tilde{w}_1^{(2)}$, we have $x_2 \succ x_4 \succ x_3 \succ x_1$, and the most important factor for fog-haze is $x_2$.

5.2. Discussions

In what follows, we utilize Algorithm I proposed by Zhang and Wu [28] to cope with the aforementioned problem; the following steps are involved:
Step 1″. Let the optimized parameter $\xi = 1$; then we obtain the normalized HMPR $\bar{P} = (\bar{p}_{ij})_{4 \times 4}$ as follows:
$$\bar{P} = \begin{pmatrix} \{1\} & \{1/7, 1/6, 1/5\} & \{1, 3, 3\} & \{1/8, 2/7, 2/7\} \\ \{5, 6, 7\} & \{1\} & \{1, 2, 2\} & \{7, 9, 9\} \\ \{1/3, 1/3, 1\} & \{1/2, 1/2, 1\} & \{1\} & \{1/4, 1/4, 1/4\} \\ \{7/2, 7/2, 8\} & \{1/9, 1/9, 1/7\} & \{4, 4, 4\} & \{1\} \end{pmatrix}.$$
Step 2″. Utilize Equation (21) in Zhang and Wu [28] to construct the MPRs $A_s$ $(s = 1, 2, 3)$ from $\bar{P} = (\bar{p}_{ij})_{4 \times 4}$:
$$A_1 = \begin{pmatrix} 1 & 1/7 & 1 & 1/8 \\ 7 & 1 & 1 & 7 \\ 1 & 1 & 1 & 1/4 \\ 8 & 1/7 & 4 & 1 \end{pmatrix}, \quad A_2 = \begin{pmatrix} 1 & 1/6 & 3 & 2/7 \\ 6 & 1 & 2 & 9 \\ 1/3 & 1/2 & 1 & 1/4 \\ 7/2 & 1/9 & 4 & 1 \end{pmatrix}, \quad A_3 = \begin{pmatrix} 1 & 1/5 & 3 & 2/7 \\ 5 & 1 & 2 & 9 \\ 1/3 & 1/2 & 1 & 1/4 \\ 7/2 & 1/9 & 4 & 1 \end{pmatrix}.$$
Step 3″. The acceptable consistency of $A_s$ $(s = 1, 2, 3)$ is checked by the algorithm in Xu and Wei [16]. Due to the vast amount of computation, we do not list the iterative process of adjusting the consistency of $A_s$ $(s = 1, 2, 3)$. After six iterations of the algorithm in Xu and Wei [16], each $A_s^{(6)}$ $(s = 1, 2, 3)$ is acceptably consistent.
Step 4″. Employ Equation (28) in Zhang and Wu [28] to obtain the weight vectors of $A_s^{(6)}$ $(s = 1, 2, 3)$:
$$W(A_1^{(6)}) = (0.4557, 2.7600, 0.4677, 1.4346)^T, \quad W(A_2^{(6)}) = (0.3404, 3.2789, 0.6613, 1.0911)^T, \quad W(A_3^{(6)}) = (0.6185, 2.9184, 0.7418, 1.3343)^T.$$
Step 5″. By Equation (31) in Zhang and Wu [28], we obtain the interval weight vector of $P = (p_{ij})_{4 \times 4}$ as follows:
$$w_1 = [0.3404, 0.6185], \quad w_2 = [2.7600, 3.2789], \quad w_3 = [0.4677, 0.7418], \quad w_4 = [1.0911, 1.4346].$$
Step 6″. Computing the degree of possibility of $w_i \geq w_j$ $(i, j = 1, 2, 3, 4)$ by Equation (32) in Zhang and Wu [28], we have $w_2 \succ w_4 \succ w_3 \succ w_1$, and then the ranking of the four main influence factors is $x_2 \succ x_4 \succ x_3 \succ x_1$. Therefore, the most important factor for fog-haze is $x_2$.
From the above numerical example and comparison with Algorithm I in Zhang and Wu [28], the proposed decision-making algorithms have the following characteristics:
(1) According to the above decision-making process, it is observed that our algorithms and Zhang and Wu’s [28] approach produce the same ranking of the four influence factors for fog-haze, which means that our algorithms are reasonable.
(2) The decision-making process in our approaches is more straightforward and efficient than Algorithm I proposed by Zhang and Wu [28]. In fact, in the consistency-improving process, our approaches work with the original HMPR information provided by the DMs, and all calculations use the HMEs directly to produce results, which preserves the DMs' original information. With Zhang and Wu's [28] method, however, one must transform the original HMPR given by the DMs into its corresponding MPRs, so the computation process is indirect. Meanwhile, in the process of obtaining the interval weight vector, the same interval weights may be derived for different alternatives when the number of alternatives is large, which leads to a loss of original information.
(3) Our approaches provide effective methods to improve the consistency of HMPRs, so that the improved HMPRs are acceptably consistent. Furthermore, to obtain acceptably consistent HMPRs (or MPRs), the number of iterations required by our algorithms is smaller than that of Zhang and Wu's [28] approach.

6. Conclusions

In this paper, we have introduced the concepts of the HMPR, the consistency of an HMPR and the consistency index of an HMPR. Then, we have constructed a logarithmic least squares model and a linear optimization model to obtain the priority weight vector of alternatives. Furthermore, in order to improve the consistency of an HMPR, we have developed two algorithms to transform unacceptably consistent HMPRs into acceptably consistent ones, followed by a discussion of the convergence of the developed algorithms. Finally, a numerical example of ranking the influence factors for fog-haze is provided, and a comparison with an existing approach is performed to validate the effectiveness of the proposed automatic iterative decision-making algorithms.
However, this paper does not discuss the situation in which some DMs decide not to provide part of their evaluation information, that is, how to construct a decision-making method with incomplete HMPRs in GDM problems. Therefore, in future work we will focus on investigating novel algorithms to improve the consistency of incomplete HMPRs, designing consensus-reaching models for incomplete HMPRs, and applying incomplete HMPRs to practical problems in other areas such as pattern recognition, information fusion systems, and image processing.

Author Contributions

L.P. conceived and designed the experiments and wrote the paper; F.J. performed the experiments and analyzed the data.

Funding

This research was funded by the scholarship from China Scholarship Council, grant number 201706690001 and the National Natural Science Foundation of China, grant numbers 71771001, 71871001, 71701001, 71501002.

Acknowledgments

The authors are very thankful to the editors and referees for their valuable comments and suggestions for improving the paper.

Conflicts of Interest

The authors declare no conflict of interest.
