1. Introduction
As one of the basic activities of human society, decision making exists in almost every aspect of modern life. It is usually defined as a mental process that involves judging multiple options or alternatives in order to select one, so as to best fulfill the aims or goals of the decision makers. As the real world is complex and changeable, the relationships between things are mostly random, imprecise and fuzzy, and these are the main sources of uncertainty in our daily-life decision making. A number of mathematical theories have been introduced to address the problem of uncertainty, such as Kolmogorov's theory of probability [1], Zadeh's theory of fuzzy sets (FSs) [2], Atanassov's theory of intuitionistic FSs [3], Gorzalzany's theory of interval FSs [4], Pawlak's theory of rough sets [5] and so on. As a result of these theories, different mathematical models have been developed to deal with different kinds of uncertainty. However, due to the lack of parametrization tools, these theories fail to describe some uncertainty problems successfully, which greatly limits their applications in decision making [6]. In order to cover the blind spots of the above-mentioned theories, Molodtsov [6] described things from the perspective of parametrization and introduced the concept of soft set theory. Thanks to its sufficient parametrization tools, a real-world object can easily be described by soft sets in a more realistic and simple way. The applications of soft set theory have been explored in many directions, especially in decision making. Maji et al. [7] applied soft set theory to solve a decision-making problem using the rough set approach. Cagman and Enginoglu [8] introduced a matrix representation for soft sets and defined some product operations in soft set theory; using these product operations, they proposed the uni-int decision-making method and applied it to a decision-making problem based on soft sets. Cagman and Karatas [9] introduced a novel algorithm for intuitionistic fuzzy soft set (IFSS)-based decision making. Xu and Xiao [10] proposed a financial ratio selection model to predict business failure by using soft set theory. Khameneh et al. [11] introduced an adjustable approach to multi-criteria group decision making based on a preference relationship under FS information. Ali et al. [12] presented a novel approach to multi-attribute decision making using complex intuitionistic fuzzy soft sets (CIFSSs) based on prioritized aggregation operators. Ali et al. [13] introduced the idea of the interval-valued fuzzy soft preordered (IVFS-preordered) set and discussed its applications in decision making. Ali et al. [14] developed a new multi-attribute decision-making framework under q-rung orthopair fuzzy bipolar soft sets (q-ROFBSSs). Saqlain et al. [15] discussed distance and similarity measures for neutrosophic hypersoft sets (NHSSs) and presented an NHSS-TOPSIS method for decision making. Some useful decision-making approaches were also presented using the concepts of soft matrices [16,17] and N-soft sets [18,19]. Recently, Mahmood et al. [20] introduced the concept of the bipolar complex fuzzy soft set (BCFSS) and discussed its applications in decision making. For more on the applications of soft set theory and its extended models, we refer to [21,22,23].
It is evident from the last paragraph that the soft set and its extended models have many applications, especially in decision making. However, with the increasing amount of data in these decision-making problems, there is increasingly more useless or redundant data that needs to be excluded; otherwise, the decision-making goals become increasingly complex. Parameter reduction is therefore a useful process for eliminating such unnecessary or redundant information from soft set-based decision-making problems without changing their decision abilities. Generally, a reduction is a minimal subset of parameters that provides the same descriptive or decision ability as the entire set of parameters. Several successful implementations of parameter reduction of soft sets have been presented by different researchers. The first attempt was made by Maji et al. [7], who displayed soft sets in the form of an information table and gave the idea of a reduction of soft sets using the rough set approach. Chen et al. [24] showed that Maji's reduction method can be used for attribute reduction in rough set theory, but it cannot be used for parameter reduction in soft set theory; therefore, they introduced a new method for parameter reduction of soft sets. However, Chen et al.'s method fails to maintain the entire decision order of the decision alternatives after the reduction process. Later, Kong et al. [25] addressed the problems of suboptimal choices and of added parameters in soft set theory. According to Kong et al. [25], most of the methods related to soft set reduction, such as those of Maji et al. [7] and Chen et al. [24], consider only the optimal choice and ignore the suboptimal choices at the time of decision making. However, in many real-world decision-making problems, such as the evaluation of supply chain partners, scholarship evaluation, etc., we consider the entire ranking order of the alternatives rather than only the optimal one. Thus, much time is needed to make new reductions in most datasets where the data of the optimal choice are deleted. Similarly, in some cases, we do not have sufficient parameters to fully characterize the alternatives in decision-making problems, so we need to add some more parameters to the existing parameter sets. However, in this case, we may also need a new reduction, as the new parameters may change the decision order of the decision-making problems. To overcome these two drawbacks, Kong et al. [25] introduced the concept of normal parameter reduction (NPR) of (fuzzy) soft sets and presented a heuristic algorithm for it. NPR can reduce the number of parameters without changing the entire ranking (or decision) order of the decision alternatives. However, the algorithm of NPR, as proposed in [25], is based on the parameter importance degree, which is hard to compute and involves a great amount of computation. Therefore, Ma et al. [26] proposed the new efficient normal parameter reduction (NENPR) algorithm for soft sets to reduce the computational complexity of the NPR process. Renukadevi and Sangeetha [27] discussed some interesting characterizations of the NPR of soft sets. Kong et al. [28] used the particle swarm optimization algorithm to provide a proper mathematical representation of the problem of NPR of soft sets. Danjuma et al. [29] considered the case of repeated columns in soft set reduction and proposed the alternative normal parameter reduction (ANPR) algorithm for soft sets. Ma and Qin [30] introduced soft set-based parameter value reduction, which keeps the entire decision ability of the decision alternatives with a high success rate of finding reductions and a low amount of computation. Khan and Zhu [31] developed another improved algorithm for the NPR of soft sets. Akram et al. [32] proposed four different algorithms for parameter reduction of N-soft sets and discussed their application in decision making. For more on soft set reduction and its applications, we refer to [33,34,35].
Kandamir [36] introduced the concept of σ-algebraic soft sets by taking the cardinality of sets as a measure on all subsets of the universal set. Furthermore, he defined two different relations (i.e., the preferability and indiscernibility relations) on the parameter set, which further led to the idea of intersectional reduced soft sets (IRSSs). However, in this study, we show that the IRSS method is unable to maintain the entire decision order of the alternatives. The main contributions of the study are summarized below:
We present some useful examples to show that the IRSS method does not keep the decision order invariant.
We propose a new algorithm for NPR using σ-algebraic soft sets that not only overcomes the existing problems of the IRSS method, but also makes the reduction process simpler and more convenient.
We provide a comparative study to show that the proposed algorithm has less computational complexity and workload as compared to the previous algorithm of Kong et al. [25].
We present an application of the proposed algorithm in a real-life decision-making problem.
The rest of the paper is organized as follows. Section 2 recalls some basic definitions and results related to soft set theory. In Section 3, we discuss the basic idea of the NPR of soft sets and give its initial algorithm as proposed by Kong et al. [25]. Section 4 highlights some setbacks of Kandamir's approach to soft set reduction. In Section 5, we first derive some useful results and then develop a new algorithm for the NPR of soft sets. In Section 6, we compare our new algorithm with Kong et al.'s algorithm in terms of computational complexity. Section 7 provides an application of the proposed algorithm in a real-world decision-making problem. Finally, Section 8 presents the conclusions of the paper.
2. Preliminaries
This section briefly reviews some basic definitions and results related to soft set theory. Let U denote a finite universe of objects, E represent the set of parameters which describe the properties of the objects in U, and $P(U)$ denote the power set of U.
Definition 1 ([6]). A pair $(F, E)$ is called a soft set over U, where F is a mapping given by $F : E \to P(U)$.
The following example clarifies the concept of soft sets.
Example 1. Suppose that $U = \{h_1, h_2, h_3, h_4\}$ is the set of four houses under consideration, and $E = \{e_1, e_2, e_3, e_4, e_5\}$ is the set of parameters, where each $e_i$ for $i = 1, \ldots, 5$ stands for beautiful, new, cheap, reliable and well-furnished, respectively. A soft set $(F, E)$ can then be defined to describe "the attractiveness of the houses" by specifying the subset $F(e_i) \subseteq U$ for each parameter $e_i$; the resulting sets are displayed in Table 1.
Maji et al. [7] represented a soft set by a binary table in order to store it in computer memory. The choice value of each $h_i \in U$ is defined by $c_i = \sum_{j} h_{ij}$, where the $h_{ij}$ are the entries in the table of the soft set $(F, E)$, i.e., $h_{ij} = 1$ if $h_i \in F(e_j)$ and $h_{ij} = 0$ otherwise, for $h_i \in U$ and $e_j \in E$. For example, the tabular representation of the soft set $(F, E)$ defined in Example 1 is given by Table 1, where the last column shows the choice values of all $h_i \in U$. From Table 1, the house with the maximum choice value is immediately identified; therefore, by the choice value criterion, it is the optimal choice object and can be selected as the best house among the four houses.
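To make the tabular representation and choice-value computation concrete, the following is a minimal Python sketch. The 0/1 entries below are hypothetical illustrations and need not match Table 1.

```python
# Minimal sketch of a soft set's binary table and its choice values
# (hypothetical 0/1 entries; they need not match Table 1).

houses = ["h1", "h2", "h3", "h4"]
params = ["e1", "e2", "e3", "e4", "e5"]

# table[i][j] = 1 if house h_i belongs to F(e_j), and 0 otherwise
table = [
    [1, 0, 1, 1, 0],  # h1
    [0, 1, 1, 0, 1],  # h2
    [1, 1, 1, 0, 1],  # h3
    [0, 0, 1, 1, 0],  # h4
]

# The choice value c_i is the row sum of the binary table
choice_values = {h: sum(row) for h, row in zip(houses, table)}
print(choice_values)            # {'h1': 3, 'h2': 3, 'h3': 4, 'h4': 2}

# By the choice value criterion, the optimal object maximizes c_i
print(max(choice_values, key=choice_values.get))   # 'h3' here
```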
The last example shows that soft sets can be applied to decision-making problems in uncertain environments. However, sometimes these decision-making problems involve parameters that play no part in the decision-making process. For example, Table 1 contains a parameter whose deletion leaves the decision order unchanged; that is, the remaining parameters provide the same decision ability (order) as the entire set of parameters. Therefore, it is necessary to remove such useless parameters from E to minimize the workload and processing time of the decision-making process. Several successful implementations of soft set reduction have been made by different researchers. Normal parameter reduction is one of them, and it is described in the next section.
3. Normal Parameter Reduction of Soft Sets
Normal parameter reduction is a good approach to soft set reduction, introduced by Kong et al. [25]. It eliminates unnecessary parameters from E without changing the decision order of the decision alternatives.
Definition 2 ([25]). For any nonempty subset $A \subseteq E$, an indiscernibility relation is defined by $IND(A) = \{(h_i, h_j) \in U \times U : c_i(A) = c_j(A)\}$, where $c_i(A) = \sum_{e_k \in A} h_{ik}$.
Using the above indiscernibility relation, a decision partition is obtained for $(F, E)$, which ranks or classifies the objects in U according to their choice values $c_i$. If we delete a parameter $e$ from E, then we obtain a new decision partition for $E - \{e\}$. For simplicity, we denote these two partitions by $C_E$ and $C_{E-\{e\}}$, respectively.
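As an illustration of how such partitions can be formed, the following Python fragment groups objects with equal choice values and orders the classes by decreasing choice value (the class ordering is an assumption consistent with how the partition is used for ranking). It reuses the hypothetical houses, table and params from the previous sketch.

```python
from collections import defaultdict

def decision_partition(objects, table, param_indices):
    """Group objects by their choice values restricted to the given
    parameter indices; classes are ordered by decreasing choice value."""
    groups = defaultdict(list)
    for obj, row in zip(objects, table):
        c = sum(row[j] for j in param_indices)
        groups[c].append(obj)
    return [groups[c] for c in sorted(groups, reverse=True)]

# Assumes houses, table, params as in the previous sketch:
print(decision_partition(houses, table, range(len(params))))
# [['h3'], ['h1', 'h2'], ['h4']]
```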
Definition 3 ([25]). For a soft set $(F, E)$ over U, if $A \subset E$ is such that $C_{E-A} = C_E$, then A is said to be dispensable; otherwise, it is called indispensable in E. $B \subseteq E$ is called the normal parameter reduction (NPR) of E if B is indispensable and $C_B = C_E$.
Definition 4 ([25]). For a soft set $(F, E)$ over U, if $C_E$ and $C_{E-\{e\}}$ are the decision partition and the decision partition after deleting e, respectively, then for each parameter $e \in E$, the parameter importance degree $r_e$ is defined in terms of the deviation between $C_E$ and $C_{E-\{e\}}$, where $|\cdot|$ denotes the cardinality of a set.
Theorem 1 ([25]). For a soft set $(F, E)$ over U, if $A \subset E$ is such that $E - A$ is the NPR of E, then $\sum_{e \in A} r_e$ is zero or a positive integer, and $C_{E-A} = C_E$.
Based on Theorem 1, an algorithm for the NPR of soft sets was proposed by Kong et al. [25], which is given as Algorithm 1. The following example illustrates Algorithm 1.
Algorithm 1: NPR algorithm proposed by Kong et al. [25]
Step 1. Input the soft set $(F, E)$ and its parameter set E;
Step 2. Calculate $r_{e_i}$ for all $e_i \in E$, where $E = \{e_1, e_2, \ldots, e_m\}$;
Step 3. Find $A \subset E$ such that $\sum_{e \in A} r_e$ is a nonnegative integer and put it into the feasible parameter reduction set (FPRS);
Step 4. If the condition $C_{E-A} = C_E$ is satisfied for a subset A in the FPRS, then A is saved; otherwise, it is deleted;
Step 5. Calculate $E - A$ as the optimal NPR of $(F, E)$, where A has the maximum cardinality in the FPRS.
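For concreteness, here is a brute-force Python sketch of parameter reduction. Instead of computing Kong et al.'s importance degrees, it relies on the constant-row-sum condition: if every object has the same choice-value sum over a subset A, then deleting A shifts all choice values by one constant and leaves the decision order unchanged. This is a simplified illustration of the idea behind NPR, not the authors' exact procedure.

```python
from itertools import combinations

def npr(table):
    """Brute-force sketch of normal parameter reduction on a binary table.

    A candidate subset A of column indices is treated as dispensable when
    every row has the same sum over A; the reduction keeps the rest."""
    m = len(table[0])
    best = ()
    for size in range(m, 0, -1):                  # prefer the largest A
        for A in combinations(range(m), size):
            if len({sum(row[j] for j in A) for row in table}) == 1:
                best = A                          # constant row sums
                break
        if best:
            break
    return [j for j in range(m) if j not in best]

# Hypothetical table: columns 0 and 1 together contribute exactly 1 to
# every row, so {0, 1} is dispensable and the reduction keeps column 2.
demo = [
    [1, 0, 1],
    [0, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
]
print(npr(demo))  # [2]
```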
Example 2. Consider the soft set $(F, E)$ given by Table 1. Then, according to Algorithm 1:
Step 1. Take $(F, E)$ and its parameter set E as the input.
Step 2. Compute the choice values $c_i$ for all $h_i \in U$ and obtain the decision partition $C_E$. Similarly, obtain the deleted-decision partitions $C_{E-\{e_i\}}$ for all $e_i \in E$, and compute the importance degrees $r_{e_i}$ for all $e_i \in E$ by Definition 4.
Step 3. Find $A \subset E$ such that $\sum_{e \in A} r_e$ is a nonnegative integer and put A into the FPRS; in this way, several subsets are obtained.
Step 4. Filter the FPRS: if $C_{E-A} = C_E$ is satisfied for a subset A, then it is saved; otherwise, it is deleted from the FPRS. In this way, we obtain only three subsets which satisfy the given condition.
Step 5. Finally, select the subset A with the maximum cardinality in the FPRS. Thus, $E - A$ is the optimal NPR of $(F, E)$, as given by Table 2.
We can verify that NPR solves the problems of suboptimal choices and of updated parameter sets. For this, we consider the decision partition of $(F, E)$ as given by (1), and the decision partition of the reduced soft set as given by (2). By comparing (1) with (2), we observe that the optimal choice and all the levels of suboptimal choices are invariant after the NPR. This shows that NPR not only maintains the optimal choice but also keeps the entire ranking order of the decision alternatives invariant.
We next discuss the problem of updated parameter sets. We assume that the character of the objects (houses) in U cannot be completely embodied by the given parameter set E, and that we add three new parameters to the existing parameter set E, standing for good color, in a hilly area and near the road, respectively. The updated soft set is represented by Table 3, and from Table 3, its decision partition is given by (3). Similarly, if we add the new parameter set to the reduced soft set, then we obtain another soft set, which is represented by Table 4; from Table 4, its decision partition is given by (4). It is clear from (3) and (4) that, after adding the new parameters, we obtain the same decision partition for $(F, E)$ and for its NPR. This implies that one can use the reduced soft set instead of $(F, E)$, as the new parameters have the same effect on both soft sets. This shows that NPR can support the case of updated parameter sets.
5. An Approach towards Normal Parameter Reduction Using σ-Algebraic Soft Sets
It was mentioned in the last section that every soft set can be regarded as a σ-algebraic soft set (see Example 3). Therefore, from now onwards, every soft set will be considered as a σ-algebraic soft set over the measurable universe $(U, P(U), \mu)$, where U is the initial universe, $P(U)$ is the power set of U and $\mu$ is the cardinality measure defined on $P(U)$.
Definition 11. For any soft set $(F, E)$ over the measurable universe $(U, P(U), \mu)$, the impact of a parameter $e \in E$ is defined by $imp(e) = \mu(F(e))/\mu(U) = |F(e)|/|U|$. For any nonempty subset $A \subseteq E$, we have $imp(A) = \sum_{e \in A} imp(e)$.
Definition 12. For any soft set $(F, E)$ over the measurable universe $(U, P(U), \mu)$, a parameter $e \in E$ is called a universal parameter, denoted by $e_U$, if $F(e) = U$. Similarly, $e \in E$ is called a null parameter, denoted by $e_\emptyset$, if $F(e) = \emptyset$.
Proposition 1. For any soft set $(F, E)$ over the measurable universe $(U, P(U), \mu)$, if $F(e_1) \subseteq F(e_2)$ for any $e_1, e_2 \in E$, then $imp(e_1) \le imp(e_2)$.
Proof. The proof is straightforward by using Definition 11. □
Proposition 2. For a soft set $(F, E)$ over the measurable universe $(U, P(U), \mu)$, $imp(e_U) = 1$ and $imp(e_\emptyset) = 0$.
Proof. According to the definition of $e_U$, $h \in F(e_U)$ for all $h \in U$. This implies that $\mu(F(e_U)) = \mu(U)$, which shows that $imp(e_U) = 1$. Again, by the definition of $e_\emptyset$, we have $h \notin F(e_\emptyset)$ for all $h \in U$. This implies that $\mu(F(e_\emptyset)) = 0$ and $imp(e_\emptyset) = 0$. Hence, the result holds. □
Definition 13. For a soft set $(F, E)$ over the measurable universe $(U, P(U), \mu)$, the impacts of $e_1 \wedge e_2$ and $e_1 \vee e_2$ are defined by $imp(e_1 \wedge e_2) = \mu(F(e_1) \cap F(e_2))/\mu(U)$ and $imp(e_1 \vee e_2) = \mu(F(e_1) \cup F(e_2))/\mu(U)$, respectively, where $F(e_1 \wedge e_2) = F(e_1) \cap F(e_2)$ and $F(e_1 \vee e_2) = F(e_1) \cup F(e_2)$.
Proposition 3. For a soft set $(F, E)$ over the measurable universe $(U, P(U), \mu)$, the following results hold:
(i) $imp(e_1 \wedge e_2) \le \min\{imp(e_1), imp(e_2)\}$;
(ii) $imp(e_1 \vee e_2) \ge \max\{imp(e_1), imp(e_2)\}$.
Proof. (i) We know that for any $e_1, e_2 \in E$, $F(e_1) \cap F(e_2) \subseteq F(e_1)$ and $F(e_1) \cap F(e_2) \subseteq F(e_2)$. This implies that $\mu(F(e_1) \cap F(e_2)) \le \mu(F(e_1))$ and $\mu(F(e_1) \cap F(e_2)) \le \mu(F(e_2))$. By using the definition of the measure $\mu$, we can write $imp(e_1 \wedge e_2) \le \min\{imp(e_1), imp(e_2)\}$. The second part can be proved in a similar way. □
Theorem 2. For a soft set $(F, E)$ over the measurable universe $(U, P(U), \mu)$, if $A \subset E$ is such that $E - A$ is the NPR of E, then $\sum_{e \in A} imp(e)$ is a nonnegative integer and $C_{E-A} = C_E$.
Proof. Suppose that $E - A$ is the NPR of E. Then, by Definition 3, A is dispensable and $C_{E-A} = C_E$; hence, the choice-value sum over A is the same, say k, for every object $h_i \in U$.
Case 1. Suppose that $k = 0$. Then, every $F(e)$ with $e \in A$ is empty, so for all $h_i \in U$ we have $\sum_{e \in A} imp(e) = 0$.
Case 2. Let k be any natural number. Then, $\sum_{e \in A} |F(e)| = \sum_{h_i \in U} \sum_{e \in A} h_i(e) = k|U|$, which further implies that $\sum_{e \in A} imp(e) = \frac{k|U|}{|U|} = k$.
This completes the proof. □
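A quick numeric check of Theorem 2, under the assumed impact definition $imp(e) = |F(e)|/|U|$: if every object's choice-value sum over A equals k, then the column sums over A total $k|U|$, so the impact sum is exactly k. The table below is hypothetical.

```python
# Hypothetical binary table; over columns A = [0, 1], every row sums to 1.
table = [
    [1, 0, 1],
    [0, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
]
A = [0, 1]
n = len(table)

row_sums = {sum(row[j] for j in A) for row in table}
impact_sum = sum(sum(row[j] for row in table) for j in A) / n

print(row_sums)    # {1}  -> the common value k = 1
print(impact_sum)  # 1.0  -> a nonnegative integer, as Theorem 2 asserts
```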
Based on the result of Theorem 2, we propose a new algorithm for the NPR of soft sets, which is given as Algorithm 2. To illustrate the idea of Algorithm 2, we present the following example.
Algorithm 2: The proposed algorithm
Step 1. Input the soft set $(F, E)$ and its parameter set E;
Step 2. Compute $imp(e_i)$ for all $e_i \in E$, where $E = \{e_1, e_2, \ldots, e_m\}$;
Step 3. Identify the universal parameters $e_U$ and the null parameters $e_\emptyset$ in E and put them into the reduced parameter set Z;
Step 4. Find $A \subseteq E - Z$ such that $\sum_{e \in A} imp(e)$ is a nonnegative integer, and put A into the FPRS;
Step 5. If the condition $C_{E-(A \cup Z)} = C_E$ is satisfied for a subset A in the FPRS, then A is saved; otherwise, delete A from the FPRS;
Step 6. Calculate $E - (A \cup Z)$ as the optimal NPR of $(F, E)$, where A has the maximum cardinality in the FPRS.
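The following Python sketch illustrates the flow of Algorithm 2 under the assumptions made above: the impact of a parameter is taken to be $|F(e)|/|U|$ (consistent with Proposition 2), universal and null columns are set aside first, and candidate subsets are screened by the integer-sum condition of Theorem 2 before the decision order is verified. All helper names are hypothetical.

```python
from itertools import combinations

def impacts(table):
    """imp(e_j) = |F(e_j)| / |U|, assuming the cardinality measure."""
    n = len(table)
    return [sum(row[j] for row in table) / n for j in range(len(table[0]))]

def ranking(table, cols):
    """Decision partition: objects grouped by choice value over cols."""
    sums = [sum(row[j] for j in cols) for row in table]
    return [sorted(i for i, s in enumerate(sums) if s == v)
            for v in sorted(set(sums), reverse=True)]

def proposed_npr(table):
    m = len(table[0])
    imp = impacts(table)
    # Step 3: universal (imp = 1) and null (imp = 0) columns go into Z
    Z = {j for j in range(m) if imp[j] in (0.0, 1.0)}
    rest = [j for j in range(m) if j not in Z]
    base = ranking(table, range(m))
    best = ()
    # Steps 4-5: screen subsets by the integer-sum condition, then keep
    # those whose deletion leaves the decision order unchanged
    for size in range(len(rest), 0, -1):
        for A in combinations(rest, size):
            s = sum(imp[j] for j in A)
            remaining = [j for j in rest if j not in A]
            if abs(s - round(s)) < 1e-9 and ranking(table, remaining) == base:
                best = A
                break
        if best:
            break
    # Step 6: the optimal reduction keeps the columns outside A and Z
    return [j for j in rest if j not in best]

# For the hypothetical 4 x 3 demo table used earlier,
# proposed_npr(demo) returns [2], matching the brute-force sketch.
```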
Example 4. Once again, we consider the same soft set $(F, E)$ over U as given by Table 1. According to Algorithm 2:
Step 1. Take $(F, E)$ and its parameter set E as the input.
Step 2. From Example 3 and Definition 11, compute $imp(e_i)$ for all $e_i \in E$; these values are listed in the last row of Table 7.
Step 3. From Table 7, one parameter qualifies as a universal or null parameter, so it is put into the reduced parameter set Z.
Step 4. Find A such that $\sum_{e \in A} imp(e)$ is a nonnegative integer and put it into the FPRS; in this way, we obtain only two subsets.
Step 5. After filtering the FPRS, we observe that only one of the two subsets satisfies the condition $C_{E-(A \cup Z)} = C_E$.
Step 6. Finally, $E - (A \cup Z)$ is the required NPR of $(F, E)$, which is the same as that obtained by Algorithm 1 in Example 2.
It is evident from the last example that our proposed algorithm greatly reduces the computational complexity of Algorithm 1 by computing the impacts of parameters rather than the parameter importance degrees. This shows that the proposed algorithm not only overcomes the existing problems of the IRSS method (already verified in Example 2), but also minimizes the workload of the NPR process.
6. Comparative Analysis
In this section, we compare the proposed algorithm with Algorithm 1 in terms of computational complexity. We also provide some experimental results to show that the proposed algorithm is more efficient than Algorithm 1 in capturing the NPR of soft sets.
6.1. Computational Complexity
We compare the computational complexity of both algorithms from the following three aspects.
1. Estimating the parameter importance degrees and the impacts of parameters: It is clear from Algorithms 1 and 2 that both algorithms follow the same general steps to reach the NPR of soft sets. However, Algorithm 1 uses the parameter importance degrees, while Algorithm 2 uses the impacts of parameters to calculate the FPRS. For estimating the parameter importance degrees, Algorithm 1 first needs to obtain the decision partition $C_E$ and all the deleted-decision partitions $C_{E-\{e_i\}}$ for every $e_i \in E$, and it must then access further table entries to estimate the degree of each of the m parameters in E. On the other hand, to estimate the impacts of the parameters, Algorithm 2 only computes $|F(e_i)|$ and then $imp(e_i)$ for each $e_i \in E$, so the number of accessed entries in this whole process is much smaller.
2. Estimating the FPRS: To compute the FPRS, Algorithm 1 needs to test the sums of all possible combinations of parameter importance degrees, from combinations of size 1 up to size m; that is, the number of tested combinations is $2^m - 1$. On the other hand, Algorithm 2 first puts the universal parameters $e_U$ and the null parameters $e_\emptyset$ into the reduced parameter set Z. Suppose that the number of parameters in $e_U$ and $e_\emptyset$ is z. Then, Algorithm 2 tests the sums of all possible combinations of the parameter impacts from size 1 up to size $m - z$; that is, the number of tested combinations is $2^{m-z} - 1$. This shows that as the value of z increases, the number of accessed entries for Algorithm 2 decreases.
3. Filtering the FPRS: Suppose that there are k feasible subsets in the FPRS for Algorithm 2 and that z is the total number of parameters in $e_U$ and $e_\emptyset$. Then, the total number of feasible subsets for Algorithm 1 is correspondingly larger (as can be verified from Table 8), and the difference between the FPRS sizes of the two algorithms grows with z. Thus, once again, a large value of z causes a large difference between the FPRSs of Algorithms 1 and 2.
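As a worked check of the combination counts above, suppose, consistently with the experimental figures reported in Section 6.2, that the dataset has $m = 20$ parameters, of which $z = 3$ are universal or null. Then:

\[
2^{m} - 1 = 2^{20} - 1 = 1{,}048{,}575
\qquad \text{versus} \qquad
2^{m-z} - 1 = 2^{17} - 1 = 131{,}071,
\]

an almost eight-fold reduction (a factor of $2^{z} = 8$), which matches the numbers of accessed parameter importance degrees and parameter impacts reported in Section 6.2.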
6.2. Experimental Results and Discussion
Here, we consider some experimental results to compare the computational complexity of Algorithm 1 with that of Algorithm 2. We apply both algorithms to the same soft set, whose tabular representation is given by Table 9. The results obtained from both algorithms are summarized in Table 8. According to Table 8, the optimal NPR of E obtained from both algorithms is the same, as given by Table 10. However, Algorithm 1 accesses 3688 entries to estimate the parameter importance degrees, while Algorithm 2 accesses just 168 entries to estimate all the parameter impacts. Similarly, Algorithm 1 accesses 1,048,575 parameter importance degrees to estimate the FPRS, while Algorithm 2 accesses only 131,071 parameter impacts for the same FPRS. Furthermore, Algorithm 1 checks 122,879 FPRS subsets for the dispensability condition, while Algorithm 2 checks only 16,383 FPRS subsets for the same condition. This shows that Algorithm 2 reduces the computational complexity at every stage of the NPR process and provides better results than Algorithm 1.
7. Application in Multi-Attribute Decision Making
In this section, we present an application of the proposed algorithm to a multi-attribute decision-making problem. We consider the scholarship selection problem of the Kano State Scholarship Board (KSB), Nigeria. The KSB works under the Ministry of Education of Kano State and awards scholarship positions to indigenes of the state whose parents are of Kano State origin and who obtain admission into tertiary institutions in Nigeria (or, in some cases, overseas). The board is responsible for:
Awarding the scholarships and improving the welfare of the state-sponsored students for foreign training;
Formulating and reviewing the policies governing the awarding of scholarships;
Providing guidance and counseling for students;
Contacting government establishments, institutes of learning and foreign universities;
Applying the selection criteria to all the applicants;
Providing a formal recommendation of suitably qualified applicants for overseas training to the governor of the state through the commissioner of education.
Here, we take the dataset of 35 students sponsored with a foreign scholarship by the KSB (available in [29]). Each student is evaluated with respect to 15 decision attributes (or parameters). Let U denote the set of all students and E represent the set of parameters, where the parameters stand for English proficiency, mathematics, physics, chemistry, biology, agricultural sciences, Hausa language, Islamic studies, having attended a public school, being above 17 years of age, having leadership potential, having ambassadorial potential, being an indigene of the state, being healthy, scoring a 2.1 in their undergraduate education and having completed NYSC, respectively. The views of the selection board are described by the soft set $(F, E)$, whose tabular representation is given by Table 11. From Table 11, the students with the highest choice values in the table can be recommended as the best candidates for the scholarship awards by the KSB, while the students with suboptimal choice values can be considered as the second-best choices if the total number of scholarships exceeds the number of first-priority students.
Now, our goal is to find those parameters in E which play no part in the decision-making process and to eliminate them without changing the decision order of the alternatives (students). In other words, we have to find those parameters in E which are jointly sufficient and individually necessary for the decision order of the students. For this, we apply the proposed algorithm to the given soft set $(F, E)$. Initially, we compute $imp(e_i)$ for all $e_i \in E$; these values are listed in the last row of Table 11. From Table 11, we identify the universal and null parameters, and these are put into the reduced parameter set Z. Next, we search for those subsets A for which $\sum_{e \in A} imp(e)$ is a nonnegative integer; the resulting subsets are put into the FPRS. After filtering the FPRS, we observe which maximum subset satisfies the condition $C_{E-(A \cup Z)} = C_E$. Therefore, by the proposed algorithm, the optimal NPR of $(F, E)$ is obtained as given by Table 12. It is clear from Table 12 that the optimal choices and all the levels of suboptimal choices of the reduced soft set are the same as those of $(F, E)$. Thus, instead of taking the whole parameter set E, the selection board can take only the seven parameters in the reduction to decide whether a student is suitable for the scholarship award or not. This shows that our proposed algorithm is helpful in minimizing the workload and processing time in decision-making problems.
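In the same spirit, the proposed_npr sketch from Section 5 could be run directly on a 35 x 15 binary table of the KSB dataset to recover the reduction; the file name and loading step below are hypothetical.

```python
import csv

# ksb_table: 35 rows (students) x 15 columns (parameters), entries 0/1;
# "ksb_students.csv" is a hypothetical file holding the dataset of [29].
with open("ksb_students.csv") as f:
    ksb_table = [[int(x) for x in row] for row in csv.reader(f)]

kept = proposed_npr(ksb_table)           # sketch from Section 5
print(len(kept), "parameters retained")  # expected: 7, per Table 12
```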
8. Conclusions and Future Work
Parameter reduction is a key step in soft set-based decision-making problems; it eliminates unnecessary and redundant information without changing the decision ability of the decision-making problem. To date, various methods have been developed for soft set reduction; however, the problems of suboptimal choices and updated parameter sets were only addressed by Kong et al. [25]. They introduced the concept of normal parameter reduction (NPR), which can reduce any soft set-based decision-making system without changing the decision order of the decision alternatives. In this paper, we developed a new algorithm for NPR using the concept of σ-algebraic soft sets. Kandamir [36] also used the concept of σ-algebraic soft sets for soft set reduction, but his method failed to maintain the entire decision order of the decision alternatives. Thus, it was desirable to modify that approach and develop a method which does not suffer from the above-mentioned problems. For this reason, we applied the concept of σ-algebraic soft sets to NPR and proposed a new algorithm that not only overcomes the existing problems of Kandamir's method, but also reduces the computational complexity of the NPR process. We compared the proposed algorithm with Kong et al.'s algorithm in terms of computational complexity and provided some experimental results. It is evident from the experimental results that the proposed algorithm greatly reduces the computational complexity and workload of NPR compared to Kong et al.'s algorithm. At the end of the paper, we presented an application of the proposed algorithm in a real-life decision-making problem.
Soft set-based decision making is a hot topic for researchers, but very limited literature can still be found regarding soft set reduction. Thus, additional attention from researchers is required to develop new reduction methods for soft sets. Some specific future research directions are suggested as follows.
More general and efficient approaches to soft set-based decision making are presented day by day; thus, we need to develop new reduction methods that match these new decision criteria.
We need to study parameter reduction of some useful extended models of soft sets, such as picture fuzzy soft sets, probabilistic soft sets, neutrosophic soft sets and so on.
At present, very limited applications of soft set reduction can be found in the literature. Therefore, applications of soft set reduction require more attention and should be explored further.