
A New Approach for Normal Parameter Reduction Using σ-Algebraic Soft Sets and Its Application in Multi-Attribute Decision Making

1 School of Management Science and Engineering, Nanjing University of Information Science & Technology, Nanjing 210044, China
2 Department of Applied Mathematics, Chung Yuan Christian University, Chung-Li, Taoyuan 32023, Taiwan
3 Department of Mathematics, Abdul Wali Khan University, Mardan 23200, Pakistan
4 Research Center for Environment and Society, Hohai University, Nanjing 210098, China
5 School of Public Administration, Hohai University, Nanjing 211100, China
6 Department of Computer Science, University of Management and Technology, Lahore 54770, Pakistan
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(8), 1297; https://doi.org/10.3390/math10081297
Submission received: 18 March 2022 / Revised: 8 April 2022 / Accepted: 12 April 2022 / Published: 13 April 2022
(This article belongs to the Section Fuzzy Sets, Systems and Decision Making)

Abstract:
The soft set is one of the key mathematical tools for describing uncertainty and has many applications in real-world decision-making problems. However, these decision-making problems often involve unimportant and redundant parameters, which make the decision-making process more complex and challenging. Parameter reduction is a useful approach for eliminating such irrelevant and redundant parameters from soft set-based decision-making problems without changing their decision abilities. Among the various reduction methods for soft sets, normal parameter reduction (NPR) can reduce decision-making problems without changing the decision order of the alternatives. This paper develops a new algorithm for NPR using the concept of σ-algebraic soft sets. Previously, the same concept was used to introduce the idea of intersectional reduced soft sets (IRSSs). However, this study clarifies that the IRSSs method does not maintain the decision order of the alternatives. We therefore need a new approach that not only keeps the decision order invariant but also makes the reduction process simpler and more convenient. For this reason, we propose a new algorithm for NPR using σ-algebraic soft sets that not only overcomes the existing problems of the IRSSs method but also reduces the computational complexity of the NPR process. We also compare our proposed algorithm with one of the existing NPR algorithms in terms of computational complexity. It is evident from the experimental results that the proposed algorithm greatly reduces the computational complexity and workload in comparison with the existing algorithm. At the end of the paper, an application of the proposed algorithm is explored through a real-world decision-making problem.

1. Introduction

As one of the basic activities of human society, decision making exists in virtually all aspects of today's life. It is usually defined as a mental process of judging multiple options or alternatives in order to select one, so as to best fulfill the aims or goals of the decision makers. As the real world is complex and changeable, the relationships between things are mostly random, imprecise and fuzzy, and these are the main sources of uncertainty in our daily decision making. A number of mathematical theories have been introduced to address the problem of uncertainty, such as Kolmogorov's theory of probability [1], Zadeh's theory of fuzzy sets (FSs) [2], Atanassov's theory of intuitionistic FSs [3], Gorzalzany's theory of interval FSs [4], Pawlak's theory of rough sets [5] and so on. From these theories, different mathematical models have been developed to deal with different kinds of uncertainty. However, due to the lack of parameterization tools, these theories fail to successfully describe some uncertainty problems, which greatly limits their applications in decision making [6]. To address the blind spots of the above-mentioned theories, Molodtsov [6] described things from the perspective of parameterization and introduced the concept of soft set theory. Thanks to its adequate parameterization tools, a real-world object can easily be described by soft sets in a more realistic and simple way. The applications of soft set theory have been explored in many directions, especially in decision making. Maji et al. [7] applied soft set theory to solve a decision-making problem using the rough set approach. Cagman and Enginoglu [8] introduced a matrix representation for soft sets and defined some product operations in soft set theory. Using the product operations, they proposed the uni-int decision-making method and applied it to a decision-making problem based on soft sets.
Cagman and Karatas [9] introduced a novel algorithm for intuitionistic fuzzy soft set (IFSS)-based decision making. Xu and Xiao [10] proposed a financial ratio selection model to predict business failure by using soft set theory. Khameneh et al. [11] introduced an adjustable approach to multi-criteria group decision making based on a preference relationship under FS information. Ali et al. [12] presented a novel approach to multi-attribute decision making using complex intuitionistic fuzzy soft sets (CIFSSs) based on prioritized aggregation operators. Ali et al. [13] introduced the idea of the interval-valued fuzzy soft preordered (IVFSpreordered) set and discussed its applications in decision making. Ali et al. [14] developed a new multi-attribute decision-making framework under q-rung orthopair fuzzy bipolar soft sets (q-ROFBSSs). Saqlain et al. [15] discussed distance and similarity measures for the neutrosophic hypersoft set (NHSS) and presented an NHSS-TOPSIS method for decision making. Some useful decision-making approaches were also presented using the concepts of soft matrices [16,17] and N-soft sets [18,19]. Recently, Mahmood et al. [20] introduced the concept of the bipolar complex fuzzy soft set (BCFSS) and discussed its applications in decision making. For further study of the applications of soft set theory and its extended models, we refer to [21,22,23].
It is evident from the last paragraph that the soft set and its extended models have many applications, especially in decision making. However, with the increasing amount of data in these decision-making problems, there are increasingly many useless or redundant data that need to be excluded; otherwise, the decision-making goals become increasingly complex. Parameter reduction is therefore a useful process for eliminating such unnecessary or redundant information from soft set-based decision-making problems without changing their decision abilities. Generally, it yields a minimal subset of parameters that provides the same descriptive or decision ability as the entire set of parameters. Several successful implementations of parameter reduction for soft sets have been made by different researchers. The first attempt was made by Maji et al. [7], who displayed soft sets in the form of an information table and gave the idea of a reduction of soft sets using the rough set approach. Chen et al. [24] showed that Maji's reduction method can be used for attribute reduction in rough set theory, but not for parameter reduction in soft set theory. Therefore, Chen et al. [24] introduced a new method for parameter reduction of soft sets. However, Chen et al.'s method fails to maintain the entire decision order of the decision alternatives after the reduction process. Later, Kong et al. [25] addressed the problem of suboptimal choices and of adding new parameters in soft set theory. According to Kong et al. [25], most of the methods related to soft set reduction, such as those of Maji et al. [7] and Chen et al. [24], consider only the optimal choice and ignore suboptimal choices at the time of decision making. However, in many real-world decision-making problems, such as the evaluation of supply chain partners, scholarship evaluation, etc., we consider the entire ranking order of the alternatives rather than only the optimal one.
Thus, when the data of the optimal choice are deleted, much extra time is needed to compute a new reduction for most datasets. Similarly, in some cases, we do not have sufficient parameters to fully characterize the alternatives in decision-making problems, and we need to add further parameters to the existing parameter sets. However, in this case, we may also need a new reduction, as the new parameters may change the decision order of the decision-making problems. To overcome these two drawbacks, Kong et al. [25] introduced the concept of normal parameter reduction (NPR) of (fuzzy) soft sets and presented a heuristic algorithm for it. NPR can reduce the number of parameters without changing the entire ranking (or decision) order of the decision alternatives. However, the NPR algorithm proposed in [25] was based on the parameter importance degree, which is hard to compute and involves a great amount of computation. Therefore, Ma et al. [26] proposed the new efficient normal parameter reduction (NENPR) algorithm for soft sets to reduce the computational complexity of the NPR process. Renukadevi and Sangeetha [27] discussed some interesting characterizations of NPR of soft sets. Kong et al. [28] used the particle swarm optimization algorithm to provide a proper mathematical representation of the problem of NPR of soft sets. Danjuma et al. [29] considered the case of repeated columns in soft set reduction and proposed the alternative normal parameter reduction (ANPR) algorithm for soft sets. Ma and Qin [30] introduced soft set-based parameter value reduction, which keeps the entire decision ability of the decision alternatives with a high success rate of finding reductions and a low amount of computation. Khan and Zhu [31] developed another improved algorithm for NPR of soft sets. Akram et al. [32] proposed four different algorithms for parameter reduction of N-soft sets and discussed their application in decision making.
For more study about soft set reduction and its applications, we refer to [33,34,35].
Kandemir [36] introduced the concept of σ-algebraic soft sets by taking the cardinality of sets as a measure on all subsets of the universal set. Furthermore, he defined two different relations (i.e., the preferability and indiscernibility relations) on the parameter set, which further led to the idea of the intersectional reduced soft sets (IRSSs). However, in this study, we show that the IRSSs method is unable to maintain the entire decision order of the alternatives. The main contributions of the study are summarized below:
  • We present some useful examples to show that the IRSSs method does not keep the decision order invariant.
  • We propose a new algorithm for NPR using σ-algebraic soft sets that not only overcomes the existing problems of the IRSSs method, but also makes the reduction process simpler and more convenient.
  • We provide a comparative study to show that the proposed algorithm has less computational complexity and workload as compared to the previous algorithm of Kong et al. [25].
  • We present an application of the proposed algorithm in a real-life decision-making problem.
The rest of the paper is organized as follows. Section 2 recalls some basic definitions and results related to soft set theory. In Section 3, we discuss the basic idea of NPR of soft sets and give the initial algorithm proposed by Kong et al. [25]. Section 4 highlights some setbacks of Kandemir's approach to soft set reduction. In Section 5, we first derive some useful results and then develop a new algorithm for NPR of soft sets. In Section 6, we compare our new algorithm with Kong et al.'s algorithm in terms of computational complexity. Section 7 provides an application of the proposed algorithm to a real-world decision-making problem. Finally, Section 8 presents the conclusions of the paper.

2. Preliminaries

This section briefly reviews some basic definitions and results related to soft set theory. Let U denote a finite universe of objects, E represent the set of parameters describing the properties of the objects in U, and $P(U)$ denote the power set of U.
Definition 1
([6]). A pair $(F, E)$ is called a soft set over U, where F is a mapping given by
$$F : E \to P(U).$$
The following example will clarify the concept of soft sets.
Example 1.
Suppose that $U = \{u_1, u_2, u_3, u_4\}$ is the set of four houses under consideration, and $E = \{e_1, e_2, e_3, e_4, e_5\}$ is the set of parameters, where each $e_i$ for $1 \le i \le 5$ stands for beautiful, new, cheap, reliable and well-furnished, respectively. A soft set $(F, E)$ can be defined to describe "the attractiveness of the houses" by:
$$(F, E) = \{\,\text{beautiful houses} = \{u_1\},\ \text{new houses} = \{u_3, u_4\},\ \text{cheap houses} = \emptyset,\ \text{reliable houses} = \{u_2, u_3, u_4\},\ \text{well-furnished houses} = \{u_1, u_4\}\,\}.$$
Maji et al. [7] represented a soft set by a binary table to store it in computer memory. The choice value of each $u_i \in U$ is defined by $f_E(u_i) = \sum_j u_{ij}$, where the $u_{ij}$ are the entries in the table of the soft set $(F, E)$ for $1 \le i \le n$ and $1 \le j \le m$. For example, the tabular representation of the soft set $(F, E)$ defined in Example 1 is given by Table 1, where the last column shows the choice values of all $u_i \in U$. From Table 1, it is clear that $u_4$ has the maximum choice value in the table. Therefore, by the choice value criterion, the optimal choice object is $u_4$, and it can be selected as the best house among the four.
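For concreteness, the choice-value computation can be sketched in Python. The data below reproduce Table 1 (Example 1); the helper name `choice_values` is our own:

```python
# Soft set (F, E) of Example 1: F maps each parameter to the set of
# houses satisfying it (the columns of Table 1).
U = ["u1", "u2", "u3", "u4"]
F = {
    "e1": {"u1"},              # beautiful
    "e2": {"u3", "u4"},        # new
    "e3": set(),               # cheap
    "e4": {"u2", "u3", "u4"},  # reliable
    "e5": {"u1", "u4"},        # well-furnished
}

def choice_values(F, U, params=None):
    """f_A(u_i): sum of the table entries u_ij over the parameters in A."""
    params = params if params is not None else F.keys()
    return {u: sum(1 for e in params if u in F[e]) for u in U}

f_E = choice_values(F, U)
print(f_E)                    # {'u1': 2, 'u2': 1, 'u3': 2, 'u4': 3}
print(max(f_E, key=f_E.get))  # u4, the optimal choice
```

Passing a subset of parameters for `params` gives the choice values $f_A(\cdot)$ used in the reduction definitions below.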
It is seen in the last example that soft sets can be applied to decision-making problems under uncertain environments. However, sometimes these problems involve parameters that play no part in the decision-making process. For example, if we consider $e_3$ in Table 1, we see that it has no role in the decision-making process; that is, $E - \{e_3\}$ provides the same decision ability (order) as the entire set of parameters. Therefore, it is necessary to remove such useless parameters from E to minimize the workload and processing time of the decision-making process. Some successful implementations have been made by different researchers towards soft set reduction. Normal parameter reduction is one of them, and it is described in the next section.

3. Normal Parameter Reduction of Soft Sets

Normal parameter reduction is a good approach to soft set reduction which was introduced by Kong et al. [25]. It eliminates unnecessary parameters from E without changing the decision order of decision alternatives.
Definition 2
([25]). For any nonempty subset $B \subseteq E$, an indiscernibility relation $IND(B)$ is defined by:
$$IND(B) = \{(u_i, u_j) \in U \times U \mid f_B(u_i) = f_B(u_j)\}.$$
Using the above indiscernibility relation, a decision partition:
$$C_E = \{\{u_1, u_2, \ldots, u_i\}_{f_1}, \{u_{i+1}, u_{i+2}, \ldots, u_j\}_{f_2}, \ldots, \{u_{k+1}, u_{k+2}, \ldots, u_n\}_{f_s}\}$$
is obtained for $(F, E)$, which ranks or classifies the objects in U according to their choice values $f_E(\cdot)$. If we delete a parameter $e_i$ from E, then we obtain a new decision partition, denoted by:
$$C_{E - \{e_i\}} = \{\{u'_1, u'_2, \ldots, u'_i\}_{f'_1}, \{u'_{i+1}, u'_{i+2}, \ldots, u'_j\}_{f'_2}, \ldots, \{u'_k, u'_{k+1}, \ldots, u'_n\}_{f'_{s'}}\}.$$
For simplicity, we write $C_E = \{E_{f_1}, E_{f_2}, \ldots, E_{f_s}\}$ and $C_{E - \{e_i\}} = \{\overline{E - e_i}_{f'_1}, \overline{E - e_i}_{f'_2}, \ldots, \overline{E - e_i}_{f'_{s'}}\}$.
Definition 3
([25]). For a soft set $(F, E)$ over U, if $A = \{\bar{e}_1, \bar{e}_2, \ldots, \bar{e}_p\} \subset E$ is such that $f_A(u_1) = f_A(u_2) = \cdots = f_A(u_n)$, then A is said to be dispensable; otherwise, it is called indispensable in E. $B \subset E$ is called the normal parameter reduction (NPR) of E if B is indispensable and $f_{E-B}(u_1) = f_{E-B}(u_2) = \cdots = f_{E-B}(u_n)$.
Definition 4
([25]). For a soft set $(F, E)$ over U, if $C_E = \{E_{f_1}, E_{f_2}, \ldots, E_{f_s}\}$ and $C_{E - \{e_i\}} = \{\overline{E - e_i}_{f'_1}, \overline{E - e_i}_{f'_2}, \ldots, \overline{E - e_i}_{f'_{s'}}\}$ are the decision partition and the decision partition with $e_i$ deleted, respectively, then for each parameter $e_i$, the parameter importance degree $r_{e_i}$ is defined by $r_{e_i} = \frac{1}{|U|}(\alpha_{1, e_i} + \alpha_{2, e_i} + \cdots + \alpha_{s, e_i})$, where:
$$\alpha_{k, e_i} = \begin{cases} \big|E_{f_k} \setminus \overline{E - e_i}_{f'_{z'}}\big|, & \text{if there exists } z' \text{ such that } f_k = f'_{z'},\ 1 \le z' \le s',\ 1 \le k \le s; \\ |E_{f_k}|, & \text{otherwise}, \end{cases}$$
and $|\cdot|$ denotes the cardinality of a set.
Theorem 1
([25]). For a soft set $(F, E)$ over U, if $A = \{\bar{e}_1, \bar{e}_2, \ldots, \bar{e}_p\} \subset E$ is such that $E - A$ is the NPR of E, then $r_A = r_{\bar{e}_1} + r_{\bar{e}_2} + \cdots + r_{\bar{e}_p} = f_A(\cdot)$, a nonnegative integer.
Based on Theorem 1, an algorithm for NPR of soft sets was proposed by Kong et al. [25] which is labeled by Algorithm 1. The following example will illustrate Algorithm 1.
Algorithm 1: NPR algorithm proposed by Kong et al. [25]
Step 1. Input $(F, E)$ and its parameter set E;
Step 2. Calculate $r_{e_j}$ for all $e_j$, where $1 \le j \le m$;
Step 3. Find $A \subseteq E$ such that $\sum_{e_j \in A} r_{e_j}$ is a nonnegative integer and put it into the feasible parameter reduction set (FPRS);
Step 4. If the condition $f_A(u_1) = f_A(u_2) = \cdots = f_A(u_n)$ is satisfied for a subset A in the FPRS, then A is saved; otherwise, it is deleted;
Step 5. Calculate $E - A$ as the optimal NPR of $(F, E)$, where A has the maximum cardinality in the FPRS.
Example 2.
If we consider the soft set ( F , E ) as given by Table 1, then according to Algorithm 1:
Step 1. Take ( F , E ) and its parameter set E as an input.
Step 2. Compute the choice values for all $u_i \in U$ and obtain the decision partition $C_E$ as given by:
$$C_E = \{\{u_4\}_3, \{u_1, u_3\}_2, \{u_2\}_1\}. \qquad (1)$$
Similarly, obtain the deleted-decision partitions $C_{E - \{e_j\}}$ for all $e_j \in E$:
$$C_{E - \{e_1\}} = \{\{u_4\}_3, \{u_3\}_2, \{u_1, u_2\}_1\}, \quad C_{E - \{e_2\}} = \{\{u_1, u_4\}_2, \{u_2, u_3\}_1\}, \quad C_{E - \{e_3\}} = C_E,$$
$$C_{E - \{e_4\}} = \{\{u_1, u_4\}_2, \{u_3\}_1, \{u_2\}_0\}, \quad C_{E - \{e_5\}} = \{\{u_3, u_4\}_2, \{u_1, u_2\}_1\}.$$
Compute the importance degrees for all $e_j \in E$ by Definition 4:
$$r_{e_1} = \tfrac{1}{4}\big(|\{u_4\} \setminus \{u_4\}| + |\{u_1, u_3\} \setminus \{u_3\}| + |\{u_2\} \setminus \{u_1, u_2\}|\big) = \tfrac{1}{4}(0 + 1 + 0) = \tfrac{1}{4}.$$
Similarly, we can compute $r_{e_2} = \tfrac{1}{2}$, $r_{e_3} = 0$, $r_{e_4} = \tfrac{3}{4}$, $r_{e_5} = \tfrac{1}{2}$.
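The importance degrees of Definition 4 can also be checked programmatically. The sketch below (our own helper names; data from Table 1) reproduces the values $\tfrac14, \tfrac12, 0, \tfrac34, \tfrac12$ computed in this step:

```python
from collections import defaultdict

U = ["u1", "u2", "u3", "u4"]
F = {"e1": {"u1"}, "e2": {"u3", "u4"}, "e3": set(),
     "e4": {"u2", "u3", "u4"}, "e5": {"u1", "u4"}}

def partition(params):
    """Decision partition: maps each choice value f_k to its class E_{f_k}."""
    classes = defaultdict(set)
    for u in U:
        classes[sum(1 for e in params if u in F[e])].add(u)
    return dict(classes)

def importance(e):
    """r_e per Definition 4: compare C_E with the partition after deleting e."""
    C_E = partition(F.keys())
    C_del = partition([p for p in F if p != e])
    total = 0
    for f_k, E_fk in C_E.items():
        if f_k in C_del:                    # a class with the same value exists
            total += len(E_fk - C_del[f_k])
        else:                               # no matching class: count the whole class
            total += len(E_fk)
    return total / len(U)

print({e: importance(e) for e in F})
# {'e1': 0.25, 'e2': 0.5, 'e3': 0.0, 'e4': 0.75, 'e5': 0.5}
```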
Step 3. Find $A \subseteq E$ such that $\sum_{e_j \in A} r_{e_j}$ is a nonnegative integer and put A into the FPRS. In this way, we obtain subsets such as $\{e_3\}$, $\{e_1, e_4\}$, $\{e_1, e_3, e_4\}$, $\{e_2, e_3, e_5\}$ and so on.
Step 4. Filter the FPRS. If $f_A(u_1) = f_A(u_2) = \cdots = f_A(u_n)$ is satisfied for a subset A, it is saved; otherwise, it is deleted from the FPRS. In this way, we obtain only three subsets that satisfy the given condition, namely $A_1 = \{e_3\}$, $A_2 = \{e_1, e_4\}$ and $A_3 = \{e_1, e_3, e_4\}$.
Step 5. Finally, select $A_3 = \{e_1, e_3, e_4\}$, which has the maximum cardinality in the FPRS. Thus, $B = E - A_3 = \{e_2, e_5\}$ is the optimal NPR of $(F, E)$, as given by Table 2.
We can verify that NPR solves the problems of suboptimal choices and of updating the parameter set. For this, we consider the decision partition of $(F, E)$ as given by (1). Similarly, the decision partition for the reduced soft set $(F, B)$ is given by:
$$C_B = \{\{u_4\}_2, \{u_1, u_3\}_1, \{u_2\}_0\}. \qquad (2)$$
By comparing (1) with (2), we observe that the optimal choice and all the levels of suboptimal choices are invariant after the NPR. This shows that NPR not only maintains the optimal choice but also keeps the entire ranking order of the decision alternatives invariant.
We next discuss the problem of updated parameter sets. We assume that the character of the objects (houses) in U cannot be completely embodied by the given parameter set E, so we add some new parameters $\hat{E} = \{\hat{e}_1, \hat{e}_2, \hat{e}_3\}$ to the existing parameter set E, where each $\hat{e}_i$ stands for good color, in a hilly area and near the road, respectively. The updated soft set $(F, E \cup \hat{E})$ is represented by Table 3. From Table 3, the decision partition for $(F, E \cup \hat{E})$ is given by:
$$C_{E \cup \hat{E}} = \{\{u_2, u_4\}_4, \{u_1, u_3\}_3\}. \qquad (3)$$
Similarly, if we add the new parameter set $\hat{E}$ to the reduced soft set $(F, B)$, then we obtain another soft set $(F, B \cup \hat{E})$, which is represented by Table 4. From Table 4, the decision partition for $(F, B \cup \hat{E})$ is given by:
$$C_{B \cup \hat{E}} = \{\{u_2, u_4\}_3, \{u_1, u_3\}_2\}. \qquad (4)$$
It is clear from (3) and (4) that, after adding the new parameters, $(F, E)$ and its NPR $(F, B)$ yield the same decision classes in the same order. This implies that one can use $(F, B)$ instead of $(F, E)$, as the new parameters have the same effect on both soft sets. This shows that NPR supports the case of updated parameter sets.
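This invariance under updated parameter sets can be checked numerically. Since Tables 3 and 4 are not reproduced in this excerpt, the membership sets below for $\hat{e}_1, \hat{e}_2, \hat{e}_3$ are hypothetical, chosen only to be consistent with the stated partitions (3) and (4); the point is that the same new parameters added to $(F, E)$ and to its NPR $(F, B)$ yield identical rankings:

```python
U = ["u1", "u2", "u3", "u4"]
E = {"e1": {"u1"}, "e2": {"u3", "u4"}, "e3": set(),
     "e4": {"u2", "u3", "u4"}, "e5": {"u1", "u4"}}
B = {e: E[e] for e in ("e2", "e5")}           # the NPR of (F, E)

# Hypothetical new parameters (illustrative; Tables 3/4 are not shown here):
E_hat = {"eh1": {"u1", "u2"}, "eh2": {"u2", "u3"}, "eh3": {"u2", "u4"}}

def ranking(F):
    """Objects ordered by descending choice value under parameter set F."""
    f = {u: sum(1 for e in F if u in F[e]) for u in U}
    return sorted(U, key=lambda u: -f[u])

# Adding E_hat to (F, E) and to its reduction (F, B) gives the same order:
print(ranking({**E, **E_hat}))
print(ranking({**B, **E_hat}))
```

Both calls print the same ordering, mirroring the agreement between (3) and (4).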

4. Intersectional Reduced Soft Set and Its Limitations

In this section, we analyze the method of the intersectional reduced soft set proposed by Kandemir [36]. We also provide an example to show that the intersectional reduced soft set method does not overcome the problems of suboptimal choices and updated parameter sets. We start the section with some basic definitions from measure theory.
Definition 5
([37]). A collection $\mathcal{L}$ of subsets of U is said to be a σ-algebra if:
(i) $U \in \mathcal{L}$;
(ii) if $A \in \mathcal{L}$, then $A^c \in \mathcal{L}$;
(iii) $\bigcup_{i \in I} A_i \in \mathcal{L}$ for any sequence $(A_i)_{i \in I}$ in $\mathcal{L}$.
Definition 6
([37]). A function $\xi : \mathcal{L} \to \mathbb{R}$ is called a measure if the following axioms are satisfied:
(i) $\xi(\emptyset) = 0$;
(ii) $\xi(A) \ge 0$ for all $A \in \mathcal{L}$;
(iii) for any pairwise disjoint sequence $(A_i)_{i \in I}$ taken from $\mathcal{L}$, $\xi\big(\bigcup_{i \in I} A_i\big) = \sum_{i \in I} \xi(A_i)$.
Definition 7
([37]). The triplet $(U, \mathcal{L}, \xi)$ is called a measure space, where $\mathcal{L}$ is a σ-algebra on U and ξ is a measure on $\mathcal{L}$. The elements of $\mathcal{L}$ are called measurable sets.
The concept of σ-algebraic soft sets is defined as follows.
Definition 8
([36]). Let $(F, E)$ be a soft set over U and $\mathcal{L}$ be a σ-algebra on U. Then $(F, E)$ is called a σ-algebraic soft set if $F(e) \in \mathcal{L}$ for all $e \in E$.
Definition 9
([36]). Let $(U, \mathcal{L}, \xi)$ be a measurable universe and $(F, E)$ be a σ-algebraic soft set over U. Two parameters $e_1, e_2 \in E$ are said to be indiscernible, denoted by $e_1 \sim e_2$, if $\xi(F(e_1)) = \xi(F(e_2))$.
The relation ∼ is an equivalence relation on E, and the indiscernibility class of $e \in E$ is denoted by $[e]$.
Definition 10
([36]). Let $(U, \mathcal{L}, \xi)$ be a measurable universe and $(F, E)$ be a σ-algebraic soft set over U. The intersectional reduced soft set (IRSS) of $(F, E)$ is denoted by $(F^*, [E])$, where $F^*$ is defined by $F^*([e]) = \bigcap_{\bar{e} \sim e} F(\bar{e})$.
According to Kandemir [36], if the cardinality of sets is taken as a measure on $P(U)$, then every soft set $(F, E)$ over U can be regarded as a σ-algebraic soft set over the measurable universe $(U, P(U), \xi)$. In this way, we can reduce a soft set $(F, E)$ to the IRSS $(F^*, [E])$ by using the relation ∼. The following example illustrates the idea of an IRSS.
Example 3
(Example 3.38 in [36]). We consider the same soft set $(F, E)$ over U as defined in Example 1. The tabular representation of $(F, E)$ is given by Table 1. If we take the cardinality of sets as a measure on $P(U)$, then clearly $(F, E)$ is a σ-algebraic soft set over the measurable universe $(U, \mathcal{L}, \xi)$. Furthermore:
$$\xi(F(e_1)) = 1, \quad \xi(F(e_2)) = 2, \quad \xi(F(e_3)) = 0, \quad \xi(F(e_4)) = 3, \quad \xi(F(e_5)) = 2.$$
That is, $e_2 \sim e_5$ and, by Definition 10, the IRSS of $(F, E)$ is given by:
$$(F^*, [E]) = \{F^*([e_1]) = \{u_1\},\ F^*([e_2]) = \{u_4\},\ F^*([e_3]) = \emptyset,\ F^*([e_4]) = \{u_2, u_3, u_4\}\}.$$
The tabular representation of the IRSS ( F * , [ E ] ) is given by Table 5.
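The IRSS construction of Definition 10 can be sketched as follows (helper names are ours); on the data of Example 1 it reproduces the classes above:

```python
from collections import defaultdict

U = ["u1", "u2", "u3", "u4"]
F = {"e1": {"u1"}, "e2": {"u3", "u4"}, "e3": set(),
     "e4": {"u2", "u3", "u4"}, "e5": {"u1", "u4"}}

def irss(F, U):
    """Intersectional reduced soft set: parameters with equal measure
    xi(F(e)) = |F(e)| are indiscernible; each class [e] is assigned the
    intersection of the sets of its members."""
    classes = defaultdict(list)
    for e, members in F.items():
        classes[len(members)].append(e)      # group by cardinality
    reduced = {}
    for params in classes.values():
        rep = params[0]                      # class representative [e]
        inter = set(U)
        for e in params:
            inter &= F[e]                    # F*([e]) = intersection over the class
        reduced[f"[{rep}]"] = inter
    return reduced

print(irss(F, U))   # [e2] collapses e2 and e5 into {u3,u4} ∩ {u1,u4} = {u4}
```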

Limitations of the IRSS Method

By analyzing Example 3, we observe that the IRSS method does not overcome the problems of suboptimal choices and updated parameter sets. To verify this, we consider the decision partition for $(F^*, [E])$ as given by:
$$C_{[E]} = \{\{u_4\}_2, \{u_1, u_2, u_3\}_1\}. \qquad (5)$$
If we compare (5) with (1), then we see that the optimal choice for the soft set ( F , E ) and its IRSS ( F * , [ E ] ) is the same, but their suboptimal choices are different from each other. This shows that the IRSS method does not overcome the problem of suboptimal choices.
Next, we show that the IRSS method does not solve the problem of updated parameter sets. For this, we add the new parameter set $\hat{E}$ to $[E]$ and obtain another soft set $(F, [E] \cup \hat{E})$, which is represented by Table 6. From Table 6, the decision partition for the soft set $(F, [E] \cup \hat{E})$ is given by:
$$C_{[E] \cup \hat{E}} = \{\{u_2\}_4, \{u_4\}_3, \{u_1, u_3\}_2\}. \qquad (6)$$
By comparing (6) with (3), we observe that the decision partitions for the soft sets ( F , E E ^ ) and ( F , [ E ] E ^ ) are different from each other. This shows that the IRSS method does not overcome the problem of updated parameter sets.
From the above discussion, we conclude that although the IRSS method is a simple approach towards soft set reduction, it does not overcome the problems of suboptimal choices and updated parameter sets, that is, it does not maintain the entire ranking or decision order of alternatives after the reduction process. On the other hand, NPR can solve the above-mentioned problems, but estimating the parameter importance degree in Algorithm 1 is a complex process which makes the algorithm very difficult to understand and requires a great amount of computations. Therefore, we need to develop a new approach for soft set reduction that can solve the above-mentioned drawbacks of the two approaches. The next section will present our new approach to NPR that not only resolves the problems of decision order but also reduces the processing time of NPR.

5. An Approach towards Normal Parameter Reduction Using σ-Algebraic Soft Sets

It was mentioned in the last section that every soft set can be regarded as a σ-algebraic soft set (see Example 3). Therefore, from now on, every soft set will be considered a σ-algebraic soft set over the measurable universe $(U, P(U), \xi)$, where U is the initial universe, $P(U)$ is the power set of U and ξ is the cardinality of sets defined as a measure on $P(U)$.
Definition 11.
For any soft set $(F, E)$ over the measurable universe $(U, P(U), \xi)$, the impact of a parameter $e \in E$ is defined by:
$$\gamma_e = \frac{\xi(F(e))}{|U|}.$$
For any nonempty subset $A \subseteq E$, we have $\gamma_A = \sum_{e \in A} \gamma_e$.
Definition 12.
For any soft set $(F, E)$ over the measurable universe $(U, P(U), \xi)$, a parameter $e \in E$ is called a universal parameter, denoted by $e^U$, if $\gamma_e = 1$. Similarly, $e \in E$ is called a null parameter, denoted by $e^{\phi}$, if $\gamma_e = 0$.
Proposition 1.
For any soft set $(F, E)$ over the measurable universe $(U, P(U), \xi)$, if $\gamma_{e_1} \le \gamma_{e_2}$ for any $e_1, e_2 \in E$, then $\xi(F(e_1)) \le \xi(F(e_2))$.
Proof. 
The proof is straightforward by using Definition 11. □
Proposition 2.
For a soft set $(F, E)$ over the measurable universe $(U, P(U), \xi)$, $0 \le \gamma_{e_j} \le 1$ for all $e_j \in E$.
Proof. 
According to the definition of ξ, $\xi(F(e_j)) \ge 0$ for all $e_j \in E$. This implies that $\frac{\xi(F(e_j))}{|U|} \ge 0$, which shows that $\gamma_{e_j} \ge 0$. Again, by the definition of ξ, we have $\xi(F(e_j)) \le |U|$ for all $e_j \in E$. This implies that $\frac{\xi(F(e_j))}{|U|} \le 1$, and so $\gamma_{e_j} \le 1$. Hence, $0 \le \gamma_{e_j} \le 1$ for all $e_j \in E$. □
Definition 13.
For a soft set $(F, E)$ over the measurable universe $(U, P(U), \xi)$, the impacts of $e_1 \vee e_2$ and $e_1 \wedge e_2$ are defined by:
$$\gamma_{e_1 \vee e_2} = \frac{1}{|U|}\,\xi(F(e_1 \vee e_2)) \quad \text{and} \quad \gamma_{e_1 \wedge e_2} = \frac{1}{|U|}\,\xi(F(e_1 \wedge e_2)),$$
respectively, where $F(e_1 \vee e_2) = F(e_1) \cup F(e_2)$ and $F(e_1 \wedge e_2) = F(e_1) \cap F(e_2)$.
Proposition 3.
For a soft set ( F , E ) over the measurable universe ( U , P ( U ) , ξ ) , the following results hold:
(i) $\gamma_{e_1 \vee e_2} \ge \max\{\gamma_{e_1}, \gamma_{e_2}\}$.
(ii) $\gamma_{e_1 \wedge e_2} \le \min\{\gamma_{e_1}, \gamma_{e_2}\}$.
Proof. 
(i) We know that for any $e_1, e_2 \in E$, $F(e_1) \subseteq F(e_1) \cup F(e_2)$ and $F(e_2) \subseteq F(e_1) \cup F(e_2)$. This implies that $F(e_1) \subseteq F(e_1 \vee e_2)$ and $F(e_2) \subseteq F(e_1 \vee e_2)$. By the monotonicity of the measure ξ, we can write:
$$\xi(F(e_1)) \le \xi(F(e_1 \vee e_2)), \quad \xi(F(e_2)) \le \xi(F(e_1 \vee e_2)),$$
$$\frac{1}{|U|}\,\xi(F(e_1)) \le \frac{1}{|U|}\,\xi(F(e_1 \vee e_2)), \quad \frac{1}{|U|}\,\xi(F(e_2)) \le \frac{1}{|U|}\,\xi(F(e_1 \vee e_2)),$$
$$\gamma_{e_1} \le \gamma_{e_1 \vee e_2}, \quad \gamma_{e_2} \le \gamma_{e_1 \vee e_2}, \quad \text{and hence } \max\{\gamma_{e_1}, \gamma_{e_2}\} \le \gamma_{e_1 \vee e_2}.$$
The second part can be proved in a similar way. □
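A quick numerical check of Definition 13 and Proposition 3 on the parameters $e_1$ and $e_2$ of Example 1 (a sketch; γ is the impact of Definition 11 with ξ the cardinality measure):

```python
U = ["u1", "u2", "u3", "u4"]
F = {"e1": {"u1"}, "e2": {"u3", "u4"}}   # beautiful, new (from Table 1)

gamma = lambda S: len(S) / len(U)        # impact: xi(F(e)) / |U|

g1, g2 = gamma(F["e1"]), gamma(F["e2"])
g_join = gamma(F["e1"] | F["e2"])        # gamma of e1 v e2 (union)
g_meet = gamma(F["e1"] & F["e2"])        # gamma of e1 ^ e2 (intersection)

print(g1, g2, g_join, g_meet)            # 0.25 0.5 0.75 0.0
assert g_join >= max(g1, g2)             # Proposition 3(i)
assert g_meet <= min(g1, g2)             # Proposition 3(ii)
```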
Theorem 2.
For a soft set $(F, E)$ over the measurable universe $(U, P(U), \xi)$, if $A = \{\bar{e}_1, \bar{e}_2, \ldots, \bar{e}_p\} \subset E$ is such that $E - A$ is the NPR of E, then $\gamma_A$ is a nonnegative integer and $\gamma_A = f_A(u_i)$.
Proof. 
Suppose that $E - A$ is the NPR of E. Then, by Definition 3, $f_A(u_1) = f_A(u_2) = \cdots = f_A(u_n)$, and obviously $f_A(u_i)$ is a nonnegative integer.
Case 1. Suppose that $f_A(u_i) = 0$. Then for all $\bar{e} \in A$, we have:
$$F(\bar{e}) = \emptyset \;\Rightarrow\; \xi(F(\bar{e})) = 0 \;\Rightarrow\; \frac{\xi(F(\bar{e}))}{|U|} = 0 \;\Rightarrow\; \gamma_{\bar{e}} = 0 \;\Rightarrow\; \sum_{\bar{e} \in A}\gamma_{\bar{e}} = 0 \;\Rightarrow\; \gamma_A = 0 = f_A(u_i).$$
Case 2. Let $f_A(u_i) = k$, where k is any natural number. Then $f_A(u_1) = f_A(u_2) = \cdots = f_A(u_n) = k$, which further implies that:
$$\sum_{i=1}^{n} f_A(u_i) = nk \;\Rightarrow\; \frac{1}{n}\sum_{i=1}^{n} f_A(u_i) = k \;\Rightarrow\; \frac{1}{|U|}\sum_{i=1}^{n} f_A(u_i) = k$$
$$\Rightarrow\; \frac{1}{|U|}\sum_{i=1}^{n}\Big(\sum_{j=1}^{p} u_{ij}\Big) = k \;\Rightarrow\; \frac{1}{|U|}\sum_{j=1}^{p}\Big(\sum_{i=1}^{n} u_{ij}\Big) = k$$
$$\Rightarrow\; \frac{1}{|U|}\sum_{j=1}^{p} \xi(F(\bar{e}_j)) = k \;\Rightarrow\; \sum_{j=1}^{p}\gamma_{\bar{e}_j} = k \;\Rightarrow\; \gamma_A = k = f_A(u_i).$$
This completes the proof. □
Based on the result of Theorem 2, we propose a new algorithm for NPR of soft sets as labeled by Algorithm 2. To illustrate the idea of Algorithm 2, we present the following example.
Algorithm 2: The proposed algorithm
Step 1. Input $(F, E)$ and its parameter set E;
Step 2. Compute $\gamma_{e_j}$ for all $e_j \in E$, where $1 \le j \le m$;
Step 3. Identify the universal parameters $e_j^U$ and null parameters $e_j^{\phi}$ in E and put them into the reduced parameter set Z;
Step 4. Find $A \subseteq \bar{E} = E - Z$ such that $\gamma_A$ is a nonnegative integer, and put A into the FPRS;
Step 5. If the condition $f_A(u_1) = f_A(u_2) = \cdots = f_A(u_n)$ is satisfied for a subset A in the FPRS, then A is saved; otherwise, delete A from the FPRS;
Step 6. Calculate $E - (A \cup Z)$ as the optimal NPR of $(F, E)$, where A has the maximum cardinality in the FPRS.
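The six steps above can be sketched as a short brute-force Python routine (function and variable names are ours; candidate sets A range over proper subsets of $\bar{E}$, mirroring Step 4). The data again come from Table 1:

```python
from itertools import combinations

def npr_algorithm2(F, U):
    """Normal parameter reduction via parameter impacts (Algorithm 2 sketch)."""
    n = len(U)
    gamma = {e: len(F[e]) / n for e in F}                 # Step 2: impacts
    Z = {e for e in F if gamma[e] in (0.0, 1.0)}          # Step 3: null/universal
    E_bar = [e for e in F if e not in Z]
    best = set()
    for r in range(1, len(E_bar)):                        # proper subsets of E_bar
        for A in combinations(E_bar, r):                  # Step 4: candidate A
            if abs(sum(gamma[e] for e in A) % 1) > 1e-9:
                continue                                  # gamma_A not an integer
            f_A = [sum(1 for e in A if u in F[e]) for u in U]
            if len(set(f_A)) == 1 and len(A) > len(best): # Step 5: f_A constant
                best = set(A)
    return {e: F[e] for e in F if e not in best | Z}      # Step 6: E - (A u Z)

U = ["u1", "u2", "u3", "u4"]
F = {"e1": {"u1"}, "e2": {"u3", "u4"}, "e3": set(),
     "e4": {"u2", "u3", "u4"}, "e5": {"u1", "u4"}}
print(sorted(npr_algorithm2(F, U)))    # ['e2', 'e5'], as in Example 4
```

The modulo check treats $\gamma_A$ as an integer up to floating-point tolerance; for exact arithmetic one could use `fractions.Fraction` instead.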
Example 4.
Once again, we consider the same soft set ( F , E ) over U as given by Table 1. According to Algorithm 2:
Step 1. Take ( F , E ) and its parameter set E as an input.
Step 2. From Example 3, we can write:
$$\xi(F(e_1)) = 1, \quad \xi(F(e_2)) = 2, \quad \xi(F(e_3)) = 0, \quad \xi(F(e_4)) = 3, \quad \xi(F(e_5)) = 2.$$
Using Definition 11, we can compute $\gamma_{e_j}$ for all $e_j \in E$; the values are listed in the last row of Table 7.
Step 3. From Table 7, $\gamma_{e_3} = 0$, so $e_3$ is put into the reduced parameter set Z.
Step 4. Find $A \subseteq \bar{E} = E - Z$ such that $\gamma_A$ is a nonnegative integer and put it into the FPRS. In this way, we obtain only two subsets, namely $A_1 = \{e_1, e_4\}$ and $A_2 = \{e_2, e_5\}$.
Step 5. After filtering the FPRS, we observe that only $A_1$ satisfies the condition $f_{A_1}(u_1) = f_{A_1}(u_2) = f_{A_1}(u_3) = f_{A_1}(u_4) = 1$.
Step 6. Finally, $B = E - (A_1 \cup Z) = \{e_2, e_5\}$ is the required NPR of $(F, E)$, which is the same as that obtained by Algorithm 1 in Example 2.
It is evident from the last example that our proposed algorithm has greatly reduced the computational complexity of Algorithm 1 by computing the impact of parameters rather than parameter importance degrees. This shows that the proposed algorithm not only overcomes the existing problems of the IRSS method (already verified in Example 2), but also minimizes the workload of the NPR process.

6. Comparative Analysis

In this section, we compare the proposed algorithm with Algorithm 1 in terms of computational complexity. We also provide some experimental results to show that the proposed algorithm is more efficient than Algorithm 1 in capturing the NPR of soft sets.

6.1. Computational Complexity

We compare the computational complexity of both algorithms from the following three aspects.
1. Estimating the parameter importance degrees and the impact of parameters: It is clear from Algorithms 1 and 2 that both algorithms follow the same overall steps to reach the NPR of a soft set. However, Algorithm 1 uses parameter importance degrees, while Algorithm 2 uses the impact of parameters to calculate the FPRS. To estimate the parameter importance degrees, Algorithm 1 first needs to obtain the decision partition C_E and all deleted-decision partitions C_{E−{e_j}} for e_j ∈ E. In this process, the total number of accessed elements is m^2 n + mn + n. Then, estimating α_{k, e_j} and r_{e_j} for each e_j ∈ E requires access to 2n elements; since there are m parameters in E in total, the number of accessed elements in this step is 2mn. That is, to compute all parameter importance degrees, Algorithm 1 needs to access m^2 n + mn + n + 2mn = m^2 n + 3mn + n elements. On the other hand, to estimate the impact of parameters, Algorithm 2 first computes ξ(F(e_j)) for each e_j ∈ E and then obtains γ_ej for all e_j ∈ E. The number of accessed elements in this whole process is mn + n, which is much smaller than m^2 n + 3mn + n.
2. Estimating the FPRS: To compute the FPRS, Algorithm 1 needs to test the sums over all possible combinations of parameter importance degrees, from combination-1 to combination-m; that is, the number of accessed parameter importance degrees is C(m,1) + C(m,2) + ... + C(m,m) = 2^m − 1. On the other hand, Algorithm 2 first puts the parameters with F(e_j) = U or F(e_j) = ∅ into the reduced parameter set Z. Suppose that the number of such parameters is z. Then, Algorithm 2 tests the sums over all possible combinations of the parameter impacts from combination-1 to combination-m′, where m′ = m − z; that is, the number of accessed parameter impacts is C(m′,1) + C(m′,2) + ... + C(m′,m′) = 2^(m−z) − 1. This shows that as z increases, the number of entries accessed by Algorithm 2 decreases.
3. Filtering the FPRS: Suppose that there are k FPRSs for Algorithm 2 and that z is the total number of parameters with F(e_j) = U or F(e_j) = ∅. Then, the total number of FPRSs for Algorithm 1 is given by k(2^z) + z (cf. Table 8), so the difference between the FPRSs of the two algorithms is k(2^z − 1) + z. Thus, once again, a large value of z causes a large gap between the numbers of FPRSs of Algorithms 1 and 2.
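As a sanity check on these counts, the closed-form expressions above can be evaluated for the experimental setting of Section 6.2 (m = 20 parameters, n = 8 objects, z = 3). This illustrative sketch reproduces the entry and combination counts reported in Table 8:

```python
from math import comb

m, n, z = 20, 8, 3  # parameters, objects, and |Z| for the soft set in Table 9

# Elements accessed while estimating parameter importance degrees
# (Algorithm 1) versus parameter impacts (Algorithm 2).
alg1_entries = m**2 * n + 3 * m * n + n
alg2_entries = m * n + n
print(alg1_entries, alg2_entries)  # -> 3688 168

# Candidate combinations examined while building the FPRS:
# sum_{i=1..m} C(m, i) = 2^m - 1 for Algorithm 1, and 2^(m-z) - 1
# for Algorithm 2, which skips the z parameters already placed in Z.
alg1_combs = sum(comb(m, i) for i in range(1, m + 1))
alg2_combs = sum(comb(m - z, i) for i in range(1, m - z + 1))
print(alg1_combs, alg2_combs)  # -> 1048575 131071
```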

6.2. Experimental Results and Discussion

Here, we report experimental results comparing the computational complexity of Algorithm 1 with that of Algorithm 2. We apply both algorithms to the same soft set (F, E), whose tabular representation is given by Table 9. The results obtained from both algorithms are summarized in Table 8. According to Table 8, the optimal NPR of (F, E) obtained from both algorithms is the same, and is given by Table 10. However, Algorithm 1 accesses 3688 entries to estimate the parameter importance degrees, while Algorithm 2 accesses just 168 entries to estimate all parameter impacts. Similarly, Algorithm 1 accesses 1,048,575 parameter importance degrees to estimate the FPRS, while Algorithm 2 accesses only 131,071 parameter impacts for the same FPRS. Furthermore, Algorithm 1 checks 122,879 FPRSs for the dispensability condition f_A(u_1) = f_A(u_2) = ... = f_A(u_n), while Algorithm 2 checks only 16,383 FPRSs for the same condition. This shows that Algorithm 2 reduces the computational complexity at every stage of the NPR process and provides better results than Algorithm 1.

7. Application in Multi-Attribute Decision Making

In this section, we present an application of the proposed algorithm to a multi-attribute decision-making problem. We consider the scholarship selection problem of the Kano State Scholarship Board (KSB), Nigeria. The KSB works under the Ministry of Education of Kano State and awards scholarships to indigenes of the state whose parents are of Kano State origin and who obtain admission into tertiary institutions in Nigeria (or, in some cases, overseas). The board is responsible for:
  • Awarding the scholarship and improving the welfare of the state-sponsored students for foreign training;
  • Formulation and review of policies governing the awarding of scholarships;
  • Providing guidance and counseling for students;
  • Contacting government establishments, institutes of learning and foreign universities;
  • Applying the selection criteria to all the applicants;
  • Providing a formal recommendation of suitably qualified applicants for overseas training to the governor of the state through the commissioner of education.
Here, we take the dataset of 35 students sponsored with a foreign scholarship by the KSB (available in [29]). Each student is evaluated with respect to 16 decision attributes (or parameters). Let U = { u_1, u_2, ..., u_35 } denote the set of all students and E = { e_1, e_2, ..., e_16 } represent the set of parameters, where each e_i for 1 ≤ i ≤ 16 stands for English proficiency, mathematics, physics, chemistry, biology, agricultural sciences, Hausa language, Islamic studies, having attended public school, being above 17 years, having leadership potential, having ambassadorial potential, being an indigene of the state, being healthy, scoring a 2.1 in their undergraduate education and having completed NYSC, respectively. The views of the selection board are described by the soft set (F, E), whose tabular representation is given by Table 11. It is clear from Table 11 that the students { u_8, u_11, u_12, u_15, u_22, u_23, u_24, u_25, u_27, u_31, u_32 } have the highest choice values in the table, so they can be recommended by the KSB as the best candidates for the scholarship awards, while the students with suboptimal choice values, namely { u_4, u_16, u_21 }, can be considered as the second-best choice for the scholarship awards if the total number of scholarships exceeds the number of first-priority students.
Now, our goal is to find those parameters in E which do not take any part in the decision-making process, and to eliminate them without changing the decision order of the alternatives (students). In other words, we have to find those parameters in E which are jointly sufficient and individually necessary for the decision order of the students. For this, we apply the proposed algorithm to the given soft set (F, E). Initially, we compute γ_ej for all e_j ∈ E; the values are listed in the last row of Table 11. From Table 11, we see that γ_e1 = γ_e2 = γ_e4 = γ_e16 = 1. Thus, these parameters can be put into the reduced parameter set Z. Next, we search for those A ⊆ E − Z for which γ_A is a nonnegative integer. As a result, we obtain subsets such as { e_3, e_6, e_7, e_15 }, { e_5, e_7, e_10, e_14 }, { e_3, e_6, e_8, e_9, e_14 }, { e_5, e_6, e_7, e_9, e_10, e_14 }, and so on, which are put into the FPRS. After filtering the FPRS, we observe that A = { e_3, e_6, e_8, e_9, e_14 } is the maximum-cardinality subset of E − Z that satisfies the condition f_A(u_1) = f_A(u_2) = ... = f_A(u_35) = 3. Therefore, by the proposed algorithm, R = E − (A ∪ Z) = { e_5, e_7, e_10, e_11, e_12, e_13, e_15 } is the optimal NPR of (F, E), as given by Table 12. It is clear from Table 12 that the optimal choices and all levels of suboptimal choices of the reduced soft set (F, R) are the same as those of (F, E). Thus, instead of taking the whole parameter set E, the selection board can use only the seven parameters in (F, R) to decide whether a student is suitable for the scholarship award. This shows that our proposed algorithm is helpful for minimizing the workload and processing time in decision-making problems.
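The integrality test at the heart of this reduction can be verified with exact rational arithmetic. The following sketch checks, using the impact values read from the last row of Table 11, that γ_A is an integer for A = { e_3, e_6, e_8, e_9, e_14 }:

```python
from fractions import Fraction as F

# Impacts of the parameters in A = {e3, e6, e8, e9, e14}, read from the
# last row of Table 11 (gamma values for a universe of 35 students).
gamma_A = [F(26, 35), F(3, 5), F(23, 35), F(2, 5), F(3, 5)]

total = sum(gamma_A)
print(total, total.denominator == 1)  # -> 3 True
```

The sum equals 3, matching the constant choice-value contribution f_A(u_i) = 3 observed for every student.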

8. Conclusions and Future Work

Parameter reduction is a key step in soft set-based decision-making problems, which eliminates unnecessary and redundant information without changing the decision ability of the decision-making problem. To date, various methods have been developed for soft set reduction; however, the problems of suboptimal choices and updated parameter sets are only addressed by Kong et al. [25]. They introduced the concept of normal parameter reduction (NPR), which can reduce any soft set-based decision-making system without changing the decision order of the decision alternatives. In this paper, we developed a new algorithm for NPR using the concept of σ-algebraic soft sets. Kandemir [36] also used the concept of σ-algebraic soft sets for soft set reduction, but Kandemir's method fails to maintain the entire decision order of the decision alternatives. It was therefore desirable to modify this approach and develop a method that does not suffer from the above-mentioned problems. For this reason, we applied the concept of σ-algebraic soft sets to NPR and proposed a new algorithm that not only overcomes the existing problems of Kandemir's method, but also reduces the computational complexity of the NPR process. We compared the proposed algorithm with Kong et al.'s algorithm in terms of computational complexity and provided some experimental results. It is evident from the experimental results that the proposed algorithm greatly reduces the computational complexity and workload of NPR as compared to Kong et al.'s algorithm. At the end of the paper, we presented an application of the proposed algorithm to a real-life decision-making problem.
Soft set-based decision making is a hot topic for researchers, but very limited literature is available on soft set reduction. Thus, additional attention from researchers is required to develop new reduction methods for soft sets. Some specific future research directions can be suggested as follows.
  • More general and efficient approaches for soft set-based decision making continue to be presented, and thus, we need to develop new reduction methods for these new decision criteria.
  • We need to study parameter reduction of some useful extended models of soft sets, such as picture fuzzy soft sets, probabilistic soft sets, neutrosophic soft sets and so on.
  • At present, very limited applications of soft set reduction can be found in the literature. Therefore, applications of soft set reduction require more attention and should be explored further.

Author Contributions

Conceptualization, A.K. and M.H.; methodology, A.K., A.A.S. and M.A.; validation, A.K., M.-S.Y. and A.A.S.; investigation, M.-S.Y., M.H. and A.A.S.; resources, A.A.S. and M.A.; writing—original draft preparation, A.K., M.H. and M.A.; writing—review and editing, A.K. and M.-S.Y.; visualization, M.-S.Y. and M.H.; supervision, A.K. and M.-S.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors acknowledge funding by the Ministry of Science and Technology of Taiwan under Grant MOST-110-2118-M-033-003.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kolmogorov, A. Foundations of the Theory of Probability, 2nd English ed.; Courier Dover Publications: Mineola, NY, USA, 2018.
  2. Zadeh, L. Fuzzy sets. Inf. Control 1965, 8, 338–353.
  3. Atanassov, K.T. Intuitionistic fuzzy sets. In Intuitionistic Fuzzy Sets; Springer: Berlin/Heidelberg, Germany, 1999; pp. 1–137.
  4. Gorzałczany, M.B. A method of inference in approximate reasoning based on interval-valued fuzzy sets. Fuzzy Sets Syst. 1987, 21, 1–17.
  5. Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356.
  6. Molodtsov, D. Soft set theory—First results. Comput. Math. Appl. 1999, 37, 19–31.
  7. Maji, P.K.; Roy, A.R.; Biswas, R. An application of soft sets in a decision making problem. Comput. Math. Appl. 2002, 44, 1077–1083.
  8. Çağman, N.; Enginoğlu, S. Soft matrix theory and its decision making. Comput. Math. Appl. 2010, 59, 3308–3314.
  9. Çağman, N.; Karataş, S. Intuitionistic fuzzy soft set theory and its decision making. J. Intell. Fuzzy Syst. 2013, 24, 829–836.
  10. Xu, W.; Xiao, Z.; Dang, X.; Yang, D.; Yang, X. Financial ratio selection for business failure prediction using soft set theory. Knowl.-Based Syst. 2014, 63, 59–67.
  11. Khameneh, A.Z.; Kılıçman, A.; Salleh, A.R. An adjustable approach to multi-criteria group decision-making based on a preference relationship under fuzzy soft information. Int. J. Fuzzy Syst. 2017, 19, 1840–1865.
  12. Ali, Z.; Mahmood, T.; Aslam, M.; Chinram, R. Another view of complex intuitionistic fuzzy soft sets based on prioritized aggregation operators and their applications to multiattribute decision making. Mathematics 2021, 9, 1922.
  13. Ali, M.; Kılıçman, A. On interval-valued fuzzy soft preordered sets and associated applications in decision-making. Mathematics 2021, 9, 3142.
  14. Ali, G.; Alolaiyan, H.; Pamučar, D.; Asif, M.; Lateef, N. A novel MADM framework under q-rung orthopair fuzzy bipolar soft sets. Mathematics 2021, 9, 2163.
  15. Saqlain, M.; Riaz, M.; Saleem, M.A.; Yang, M.S. Distance and similarity measures for neutrosophic hypersoft set (NHSS) with construction of NHSS-TOPSIS and applications. IEEE Access 2021, 9, 30803–30816.
  16. Atagün, A.O.; Kamacı, H.; Oktay, O. Reduced soft matrices and generalized products with applications in decision making. Neural Comput. Appl. 2018, 29, 445–456.
  17. Kamacı, H.; Atagün, A.O.; Sönmezoğlu, A. Row-products of soft matrices with applications in multiple-disjoint decision making. Appl. Soft Comput. 2018, 62, 892–914.
  18. Riaz, M.; Razzaq, A.; Aslam, M.; Pamucar, D. M-parameterized N-soft topology-based TOPSIS approach for multi-attribute decision making. Symmetry 2021, 13, 748.
  19. Ali, S.; Kousar, M.; Xin, Q.; Pamučar, D.; Hameed, M.S.; Fayyaz, R. Belief and possibility belief interval-valued N-soft set and their applications in multi-attribute decision-making problems. Entropy 2021, 23, 1498.
  20. Mahmood, T.; Rehman, U.U.; Jaleel, A.; Ahmmad, J.; Chinram, R. Bipolar complex fuzzy soft sets and their applications in decision-making. Mathematics 2022, 10, 1048.
  21. Ali, Z.; Mahmood, T.; Ullah, K.; Khan, Q. Einstein geometric aggregation operators using a novel complex interval-valued pythagorean fuzzy setting with application in green supplier chain management. Rep. Mech. Eng. 2021, 2, 105–134.
  22. Ahmad, M.R.; Saeed, M.; Afzal, U.; Yang, M.S. A novel MCDM method based on plithogenic hypersoft sets under fuzzy neutrosophic environment. Symmetry 2020, 12, 1855.
  23. Zhan, J.; Zhu, K. Reviews on decision making methods based on (fuzzy) soft sets and rough soft sets. J. Intell. Fuzzy Syst. 2015, 29, 1169–1176.
  24. Chen, D.; Tsang, E.; Yeung, D.S.; Wang, X. The parameterization reduction of soft sets and its applications. Comput. Math. Appl. 2005, 49, 757–763.
  25. Kong, Z.; Gao, L.; Wang, L.; Li, S. The normal parameter reduction of soft sets and its algorithm. Comput. Math. Appl. 2008, 56, 3029–3037.
  26. Ma, X.; Sulaiman, N.; Qin, H.; Herawan, T.; Zain, J.M. A new efficient normal parameter reduction algorithm of soft sets. Comput. Math. Appl. 2011, 62, 588–598.
  27. Renukadevi, V.; Sangeetha, G. Characterizations of normal parameter reductions of soft sets. Ann. Fuzzy Math. Inform. 2015, 11, 415–424.
  28. Kong, Z.; Jia, W.; Zhang, G.; Wang, L. Normal parameter reduction in soft set based on particle swarm optimization algorithm. Appl. Math. Model. 2015, 39, 4808–4820.
  29. Danjuma, S.; Ismail, M.A.; Herawan, T. An alternative approach to normal parameter reduction algorithm for soft set theory. IEEE Access 2017, 5, 4732–4746.
  30. Ma, X.; Qin, H. Soft set based parameter value reduction for decision making application. IEEE Access 2019, 7, 35499–35511.
  31. Khan, A.; Zhu, Y. An improved algorithm for normal parameter reduction of soft set. J. Intell. Fuzzy Syst. 2019, 37, 2953–2968.
  32. Akram, M.; Ali, G.; Alcantud, J.C.; Fatimah, F. Parameter reductions in N-soft sets and their applications in decision-making. Expert Syst. 2021, 38, e12601.
  33. Zhan, J.; Alcantud, J.C.R. A survey of parameter reduction of soft sets and corresponding algorithms. Artif. Intell. Rev. 2018, 52, 1839–1872.
  34. Khan, A.; Zhu, Y. New algorithms for parameter reduction of intuitionistic fuzzy soft sets. Comput. Appl. Math. 2020, 39, 232.
  35. Akram, M.; Ali, G.; Alcantud, J.C.R. Parameter reduction analysis under interval-valued m-polar fuzzy soft information. Artif. Intell. Rev. 2021, 54, 5541–5582.
  36. Kandemir, M.B. The concept of σ-algebraic soft set. Soft Comput. 2018, 22, 4353–4360.
  37. Emelyanov, E. Introduction to Measure Theory and Lebesgue Integration; Middle East Technical University Press: Ankara, Turkey, 2007.
Table 1. Tabular form of (F, E) in Example 1.

U/E   e1  e2  e3  e4  e5   f_E(u_i)
u1    1   0   0   0   1    2
u2    0   0   0   1   0    1
u3    0   1   0   1   0    2
u4    0   1   0   1   1    3
Table 2. NPR of (F, E) in Example 3.

U/B   e2  e5   f_B(u_i)
u1    0   1    1
u2    0   0    0
u3    1   0    1
u4    1   1    2
Table 3. Tabular form of (F, E ∪ Ê).

U/(E ∪ Ê)   e1  e2  e3  e4  e5  ê1  ê2  ê3   f_(E∪Ê)(u_i)
u1          1   0   0   0   1   0   1   0    3
u2          0   0   0   1   0   1   1   1    4
u3          0   1   0   1   0   0   1   0    3
u4          0   1   0   1   1   1   0   0    4
Table 4. Tabular form of (F, B ∪ Ê).

U/(B ∪ Ê)   e1  e2  ê1  ê2  ê3   f_(B∪Ê)(u_i)
u1          0   1   0   1   0    2
u2          0   0   1   1   1    3
u3          1   0   0   1   0    2
u4          1   1   1   0   0    3
Table 5. Tabular form of the IRSS (F*, [E]).

U/[E]   [e1]  [e2]  [e3]  [e4]   f_[E](u_i)
u1      1     0     0     0      1
u2      0     0     0     1      1
u3      0     0     0     1      1
u4      0     1     0     1      2
Table 6. Tabular form of (F, [E] ∪ Ê).

U/([E] ∪ Ê)   [e1]  [e2]  [e3]  [e4]  ê1  ê2  ê3   f_([E]∪Ê)(u_i)
u1            1     0     0     0     0   1   0    2
u2            0     0     0     1     1   1   1    4
u3            0     0     0     1     0   1   0    2
u4            0     1     0     1     1   0   0    3
Table 7. Tabular form of (F, E) with the impact of parameters.

U/E    e1   e2   e3  e4   e5    f_E(u_i)
u1     1    0    0   0    1     2
u2     0    0    0   1    0     1
u3     0    1    0   1    0     2
u4     0    1    0   1    1     3
γ_ej   1/4  1/2  0   3/4  1/2
Table 8. Comparison table.

Comparison | Algorithm 1 | Algorithm 2 | Improvement (in %)
Optimal normal parameter reduction | {e1, e3, e5, e6, e8, e9, e11, e13, e15, e16, e18, e19, e20} | {e1, e3, e5, e6, e8, e9, e11, e13, e15, e16, e18, e19, e20} | The same
Accessed entries | 3688 (for estimating parameter importance degrees) | 168 (for estimating the impact of parameters) | 95.44%
Estimating the FPRS | 1,048,575 parameter importance degrees accessed | 131,071 parameter impacts accessed | 87.50%
Total number of FPRSs | 122,879 | 16,383 | 86.66%
Involved operations | Addition, subtraction, division and classification of objects | Only addition and division | Algorithm 2 requires fewer operations than Algorithm 1
Table 9. Tabular form of the soft set (F, E).

U/E    e1   e2  e3   e4  e5   e6   e7   e8   e9   e10  e11  e12  e13  e14  e15  e16  e17  e18  e19  e20   f_E(u_i)
u1     0    1   0    0   0    0    0    1    0    0    0    1    1    1    0    1    0    0    0    0     6
u2     0    1   0    0   0    0    0    0    0    1    0    1    0    0    1    1    0    1    0    0     6
u3     1    1   0    0   0    1    1    1    0    0    1    0    0    1    0    1    0    0    1    0     9
u4     0    1   0    0   1    0    0    0    0    1    1    1    1    0    0    1    0    1    0    1     9
u5     1    1   0    0   0    1    1    0    1    0    1    1    1    0    1    0    0    0    1    0     10
u6     0    1   0    0   1    0    1    1    0    1    0    0    0    0    0    0    0    1    0    1     7
u7     0    1   0    0   1    0    0    1    1    1    1    1    0    0    1    0    0    1    0    1     10
u8     0    1   1    0   1    0    1    0    1    0    1    1    1    0    1    1    0    0    0    1     11
γ_ej   1/4  1   1/8  0   1/2  1/4  1/2  1/2  3/8  1/2  5/8  3/4  1/2  1/4  1/2  5/8  0    1/2  1/4  1/2
Table 10. NPR of the soft set (F, E).

U/B   e1  e3  e5  e6  e8  e9  e11  e13  e15  e16  e18  e19  e20   f_B(u_i)
u1    0   0   0   0   1   0   0    1    0    1    0    0    0     3
u2    0   0   0   0   0   0   0    0    1    1    1    0    0     3
u3    1   0   0   1   1   0   1    0    0    1    0    1    0     6
u4    0   0   1   0   0   0   1    1    0    1    1    0    1     6
u5    1   0   0   1   0   1   1    1    1    0    0    1    0     7
u6    0   0   1   0   1   0   0    0    0    0    1    0    1     4
u7    0   0   1   0   1   1   1    0    1    0    1    0    1     7
u8    0   1   1   0   0   1   1    1    1    1    0    0    1     8
Table 11. Tabular form of (F, E) in the scholarship award problem.

U/E    e1  e2  e3     e4  e5   e6   e7   e8     e9   e10  e11    e12    e13  e14  e15  e16   f_E(u_i)
u1     1   1   1      1   1    1    0    0      1    1    1      1      0    0    1    1     12
u2     1   1   1      1   0    0    1    1      1    0    0      0      0    0    1    1     9
u3     1   1   0      1   1    1    1    0      1    1    1      0      0    1    1    1     12
u4     1   1   1      1   1    1    0    1      0    1    1      1      1    0    1    1     13
u5     1   1   1      1   1    1    0    0      1    1    1      0      0    0    1    1     11
u6     1   1   0      1   1    1    1    0      1    1    0      1      0    1    1    1     12
u7     1   1   1      1   0    1    1    1      0    0    0      1      1    0    0    1     10
u8     1   1   0      1   1    1    1    1      0    1    1      1      1    1    1    1     14
u9     1   1   1      1   1    1    0    0      0    1    1      0      1    1    1    1     12
u10    1   1   0      1   0    1    1    1      0    0    1      0      1    1    1    1     11
u11    1   1   1      1   1    0    1    1      0    1    1      1      1    1    1    1     14
u12    1   1   0      1   1    1    1    1      0    1    1      1      1    1    1    1     14
u13    1   1   0      1   0    1    1    1      0    0    1      0      1    1    1    1     11
u14    1   1   1      1   1    1    1    1      0    1    1      0      0    0    0    1     11
u15    1   1   1      1   1    0    1    1      0    1    1      1      1    1    1    1     14
u16    1   1   1      1   1    1    1    0      0    1    1      1      1    1    0    1     13
u17    1   1   1      1   1    0    1    1      0    1    1      0      0    1    1    1     12
u18    1   1   1      1   1    1    0    1      0    1    1      0      0    0    1    1     11
u19    1   1   1      1   1    0    1    1      0    1    1      0      0    1    1    1     12
u20    1   1   1      1   1    0    1    0      1    1    0      0      0    1    1    1     11
u21    1   1   1      1   1    1    1    0      0    1    1      1      1    1    0    1     13
u22    1   1   0      1   1    1    1    1      0    1    1      1      1    1    1    1     14
u23    1   1   1      1   1    0    1    1      1    1    1      1      1    0    1    1     14
u24    1   1   1      1   1    0    1    1      1    1    1      1      1    0    1    1     14
u25    1   1   1      1   1    0    1    0      1    1    1      1      1    1    1    1     14
u26    1   1   1      1   1    0    1    1      0    1    0      0      1    1    1    1     12
u27    1   1   1      1   1    0    1    0      1    1    1      1      1    1    1    1     14
u28    1   1   0      1   0    1    1    1      1    0    0      1      1    0    1    1     11
u29    1   1   1      1   1    1    1    0      1    1    0      0      0    0    0    1     10
u30    1   1   1      1   1    0    1    1      0    1    0      0      0    1    1    1     11
u31    1   1   1      1   1    0    1    1      0    1    1      1      1    1    1    1     14
u32    1   1   0      1   1    1    1    1      0    1    1      1      1    1    1    1     14
u33    1   1   1      1   0    1    0    0      1    0    0      0      0    0    1    1     8
u34    1   1   1      1   1    0    1    1      1    1    0      0      0    0    1    1     11
u35    1   1   1      1   0    1    0    1      0    0    0      0      0    0    1    1     8
γ_ej   1   1   26/35  1   4/5  3/5  4/5  23/35  2/5  4/5  24/35  18/35  4/7  3/5  6/7  1
Table 12. NPR of (F, E) in the scholarship award problem.

U/R   e5  e7  e10  e11  e12  e13  e15   f_R(u_i)
u1    1   0   1    1    1    0    1     5
u2    0   1   0    0    0    0    1     2
u3    1   1   1    1    0    0    1     5
u4    1   0   1    1    1    1    1     6
u5    1   0   1    1    0    0    1     4
u6    1   1   1    0    1    0    1     5
u7    0   1   0    0    1    1    0     3
u8    1   1   1    1    1    1    1     7
u9    1   0   1    1    0    1    1     5
u10   0   1   0    1    0    1    1     4
u11   1   1   1    1    1    1    1     7
u12   1   1   1    1    1    1    1     7
u13   0   1   0    1    0    1    1     4
u14   1   1   1    1    0    0    0     4
u15   1   1   1    1    1    1    1     7
u16   1   1   1    1    1    1    0     6
u17   1   1   1    1    0    0    1     5
u18   1   0   1    1    0    0    1     4
u19   1   1   1    1    0    0    1     5
u20   1   1   1    0    0    0    1     4
u21   1   1   1    1    1    1    0     6
u22   1   1   1    1    1    1    1     7
u23   1   1   1    1    1    1    1     7
u24   1   1   1    1    1    1    1     7
u25   1   1   1    1    1    1    1     7
u26   1   1   1    0    0    1    1     5
u27   1   1   1    1    1    1    1     7
u28   0   1   0    0    1    1    1     4
u29   1   1   1    0    0    0    0     3
u30   1   1   1    0    0    0    1     4
u31   1   1   1    1    1    1    1     7
u32   1   1   1    1    1    1    1     7
u33   0   0   0    0    0    0    1     1
u34   1   1   1    0    0    0    1     4
u35   0   0   0    0    0    0    1     1