Article

A Fuzzy Multi-Criteria Approach for Selecting Sustainable Power Systems Simulation Software in Undergraduate Education

by Olubayo Babatunde 1,2,*, Michael Emezirinwune 2, John Adebisi 3, Khadeejah A. Abdulsalam 2, Busola Akintayo 1 and Oludolapo Olanrewaju 1
1 Department of Industrial Engineering, Durban University of Technology, Durban 4001, South Africa
2 Department of Electrical and Electronics Engineering, University of Lagos, Lagos 100213, Nigeria
3 Division of Engineering & Technology, The University of West Alabama, Livingston, AL 35470, USA
* Author to whom correspondence should be addressed.
Sustainability 2024, 16(20), 8994; https://doi.org/10.3390/su16208994
Submission received: 19 September 2024 / Revised: 10 October 2024 / Accepted: 14 October 2024 / Published: 17 October 2024

Abstract

Selecting the most preferred software for teaching power systems engineering at the undergraduate level is a complex problem in developing countries; it requires an informed decision that compromises across various criteria. This study proposes a multi-criteria framework to determine the most preferred software solution for instructing undergraduate power system modules using the Fuzzy-ARAS (additive ratio assessment) method and expert opinions. Twelve evaluation criteria were used to evaluate eight widely used software packages. A questionnaire was designed to capture the views of professionals in academia and industry on the criteria weights and the ranking of the software options. Linguistic terms were used to represent the experts’ judgments, and weights were assigned to each criterion. The Fuzzy-ARAS multi-criteria decision approach was applied to obtain ratings for each software alternative. Based on the results, MATLAB emerged as the most preferred software for instructing power systems analysis, whereas MATPOWER (V 8.0) was rated as the least preferred choice. In addition, the Fuzzy-TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) approach was used and produced a different ranking: the most preferred software was MATPOWER, while the least preferred software was NEPLAN (V 360 10.5.1). A new coefficient that combines the findings of the two approaches was proposed to reconcile the ranks. The combined ranking returns MATLAB as the most preferred software, consistent with the Fuzzy-ARAS result, while NEPLAN remains the least preferred, consistent with the Fuzzy-TOPSIS result. This study contributes to the choice of software for undergraduate power systems analysis instruction by providing direction to educators and institutions looking for software solutions to improve undergraduate power systems analysis education.

1. Introduction

The proliferation of computer applications has given rise to educational software packages, such as simulation packages deployed to complement teaching and research. Simulation packages have enhanced laboratory experiences, especially when dangerous chemicals are involved or when studying electrical phenomena. Deploying simulation software is cost-effective and helps compensate for the shortage of teachers [1]. Simulation is a design process in which a system’s behavior is represented by a model of the design requirements in terms of input and output behavior; the model may be a mathematical representation or a physical prototype. Simulation is used to represent system reality and to predict system behavior for various inputs and outputs, and it provides a formal way of summarizing a system’s specification [2]. With augmented reality, these simulation models can be overlaid onto real laboratory equipment and spaces, allowing students to visualize the simulated phenomena in context [3]. Students wearing AR headsets can see virtual electric currents overlaid on physical circuit boards or animated chemical reactions projected onto real beakers and test tubes.
The availability of electricity is the backbone of modern life; hence, power systems are an essential aspect of human life. Because of its interconnectedness across long distances, the electricity grid is highly complex [4]. Carrying out live, comprehensive experimental testing on a power grid may be impossible because of the safety of personnel and equipment, and shutting down electrical power systems for fault tracing or testing for a considerable time may negatively impact socio-economic activities. Simulation is therefore a safer way to test power systems, especially real-time simulation [2]. Many simulation tools are available for studying and teaching power systems engineering, and choosing the one that suits a particular problem can be daunting, especially considering cost, user-friendliness, and what students stand to gain. Moreover, simulators use a black-box approach, so understanding is critical before application, and no single simulation tool can answer all of a designer’s questions, especially in modern electrical power systems, which are complicated and dynamic. Therefore, in selecting the most preferred power system simulation tool for teaching, it is essential to consider various criteria to make an informed decision. This study attempts to determine the most preferred software package for instructing undergraduate power system modules using a multi-criteria decision-making method and expert opinions.

2. Literature Review

Power systems analysis and simulation can be conducted with various software tools. These tools are capable of processing structured instructions at various levels. Power systems analysis is integral to undergraduate education in electrical, electronics, and computer engineering programs. Utilizing these tools facilitates the precise and effective transmission of knowledge to students. Due to continuous advancements in technology and the competitive landscape among software vendors, numerous options are available for power systems analysis. Power systems analysis tools commonly used for teaching power systems simulations include the Electrical Transient Analyzer Program (ETAP), NEPLAN, POWER World, MATLAB, DIgSILENT, PSAT, PSCAD/EMTDC, and MATPOWER. Identifying the most suitable software solution entails assessing numerous criteria. This section examines prior research relevant to the objective of this paper. The literature examines several important attributes when evaluating software packages: assessment competence, usability, data storage quality, graphical interface, processing time, memory requirements, deployment ease, functionality, scalability, vendor support and training, long-term vendor viability, and cost. Several streams of research have been devoted to building efficient methods for ranking and selecting the tools used to teach various modules at different levels in universities.
Ref. [5] proposed a method that university-level renewable energy educators can use to choose an HRES simulation software for teaching and learning. The study utilizes multi-criteria decision-making methods to create a framework that combines fuzzy entropy and Fuzzy VIKOR methodologies. This framework is used to assess HRES simulation software, providing significant insights for educators and academics working on scaling renewable energy systems. The results show that Fuzzy VIKOR prefers the BCHP screening tool, COPRAS prefers HOMER, and RETScreen (VIKOR) and ORCED (COPRAS) are least preferred. These findings can help educators and academics find appropriate renewable energy system sizing and optimization tools.
Ref. [6] developed a comprehensive metric for choosing supply chain management software. The authors investigated decision-support systems, software solutions, and the elements that affect the choice of IT applications. The study examined the components and evolution of supply chain management software and described the operation of various modules within supply chain packages. Furthermore, they propose using a percentage-based weighted Tree technique to pick appropriate supply chain solutions, providing significant insights for practitioners in the industry. Ref. [7] proposed a systematic approach for defining specific criteria and sub-criteria for enterprise resource planning (ERP) software selection, particularly emphasizing manufacturing enterprises in developing nations. The study highlighted the limitations of existing research in connecting ERP software selection to real-world decision-making processes such as Multiple-Criteria Decision Making (MCDM) and Fuzzy MCDM. Through three interconnected phases, the study discovered suitable criteria from previous literature, validated them with expert feedback, and ranked them using a Fuzzy Analytic Hierarchy Process (FAHP) method. Security, investment, software features, maintainability, a support center, and report features have all been cited as essential criteria for selecting ERP software in manufacturing environments.
Ref. [8] presented a technique for selecting a blockchain platform to construct enterprise solutions, recognizing the constraints given by the abundance of accessible platforms. Demand for varied industry applications increased as Blockchain 3.0 expanded beyond Bitcoin transactions. The methodology, which included four stages—identification, selection, evaluation, and validation—used a multi-criteria decision-making method such as the Simple Multi-Attribute Rating Technique (SMART) to choose a suitable platform. The detailed examination considered system architecture, tools, domain-specific applications, and capability analysis. Validation through the development of a blockchain-based enterprise solution demonstrated the methodology’s effectiveness and scalability, assisting stakeholders in selecting acceptable blockchain platforms. Ref. [9] addressed the essential challenge of determining the best GIS software package for a project, referring to it as a multi-criteria decision-making (MCDM) problem. The success of a GIS project is significantly dependent on this decision, which requires considering a wide range of elements and balancing multiple goals. The study proposed the Analytic Hierarchy Process (AHP) as a solution, providing a structured approach to help system developers make educated judgments. The practicality of the AHP decision model is proven using a hypothetical case study, demonstrating its potential to streamline decision-making and speed up GIS software selection.
Ref. [10] used a multi-criteria decision-making technique to evaluate the performance of computer programming languages (CPLs) in higher education. They acknowledge the challenge of selecting acceptable languages amid the proliferation of programming packages. They emphasize the necessity of minimizing learning time and effort while discussing the essential characteristics of programming languages and using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). The study presents a Mathematica function implementing TOPSIS to assess traditional CPLs across seven criteria. It compares TOPSIS results with Analytic Hierarchy Process (AHP) methods, offering insights into educational software selection regarding technical characteristics and learning efficiency. Ref. [11] examined the challenges associated with choosing advanced planning and scheduling (APS) software from a wide range of alternatives, focusing on the software’s critical nature in resource allocation and operational vulnerabilities. By integrating the fuzzy quality function deployment (QFD), analytic hierarchy process (AHP), and VIKOR techniques, it unveiled an innovative APS software selection methodology. By integrating APS criteria and company requirements, this methodical approach utilized triangular fuzzy numbers and the house of quality to reduce uncertainties. The simplicity and efficacy of the concept are demonstrated through its application to a case involving an aero-derivative gas turbine.
The literature review offers an in-depth assessment of the challenges and approaches involved in choosing software tools for different fields, such as renewable energy, supply chain management, enterprise resource planning, blockchain development, GIS, computer programming, and advanced planning and scheduling [12,13,14]. Each study examines the intricacy of decision-making while choosing the most suitable software solution, presenting strategies that range from multi-criteria decision-making approaches to fuzzy analytic techniques. These approaches seek to increase effectiveness, decrease ambiguity in software selection, and streamline the decision-making process. This study will add to this body of knowledge by presenting a fuzzy multi-criteria method for selecting simulation software for teaching power systems at the undergraduate level. By incorporating existing research insights and leveraging literature methodologies, the paper aims to provide a structured and reproducible approach to guiding educators and decision-makers in selecting the most appropriate simulation software for teaching power systems, ultimately improving the learning experience and effectiveness of power systems education at the undergraduate level.

Contributions to Knowledge

The importance of selecting suitable software for undergraduate power systems analysis instruction in developing countries cannot be overemphasized. Universities in these nations often face resource constraints, so allocating funds and technology effectively is crucial for maximizing educational impact. Although MCDM techniques have been extended to software selection in many domains, little systematic evaluation has been carried out on software alternatives for teaching power systems. This research fills this gap by using the Fuzzy-ARAS and Fuzzy-TOPSIS methods to rank power systems education software. An important contribution of this work is that it identifies and defines 12 evaluation criteria based on expert domain input. These 12 criteria comprehensively cover key considerations in selecting educational power systems software, spanning technical capabilities, usability, vendor support, and cost. This research employs two fuzzy MCDM techniques, ARAS and TOPSIS, to rank the software. To ensure the robustness and reliability of the results, we introduce a new index, the ‘combined rank coefficient,’ which harmonizes the rankings obtained from these two techniques. This dual methodology and the rank combination technique enhance the credibility of this study’s findings. This study offers valuable insights and clear directions, presenting a fresh, evidence-based framework for power systems educators and program administrators. This framework aids in selecting the most suitable software based on their specific needs and priorities, thereby enhancing power system pedagogy and educational outcomes in a more practical and effective manner.

3. Methodology

Figure 1 shows the framework used in this research; the suggested framework begins by identifying the relevant power systems simulation software used for teaching in Nigerian universities. An extensive survey is conducted among the relevant stakeholders to identify the tools and important criteria to consider when selecting the tools. The next step is to build a fuzzy-based system and specify the importance of the criteria using linguistic expressions. After that, a questionnaire is developed so that experts can evaluate the identified software based on the previously established linguistic terms.
Existing fuzzy multi-criteria decision-making (MCDM) techniques provide a variety of strategies for dealing with the uncertainties and imprecisions inherent in decision-making processes. The Fuzzy Analytic Hierarchy Process (FAHP) is a method that facilitates hierarchical decision-making by enabling decision-makers to articulate their preferences using language concepts [15]. This enhances the comprehensibility of the outcomes [16]. Another method, the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (Fuzzy-TOPSIS), allows for an in-depth evaluation of alternatives by considering both positive and negative aspects [17]. This approach provides valuable insights into the relative performance of alternatives in situations of uncertainty. The Fuzzy Additive Ratio Assessment approach (Fuzzy-ARAS) enables an objective assessment of alternatives by considering their relative ratios. This method offers an explicit assessment procedure that considers uncertainties [18].
Meanwhile, the Multi-Attribute Ratio Comparison Optimization System (MARCOS) is an approach that combines ratio comparison and optimization approaches [19]; it enables decision-makers to balance conflicting objectives successfully. The Multi-Attribute Border Approximation Area Comparison (MABAC) method provides a dynamic strategy to deal with incomplete and inaccurate information by estimating the limits of the decision space, enabling robust decision making in uncertain environments [20]. The Multi-Attribute Ideal-Real Comparative Analysis (MAIRCA) offers an extensive structure for evaluating alternatives based on ideal and real benchmarks [21]. This allows decision-makers to examine the relative performance of alternatives in dynamic choice situations. These fuzzy MCDM techniques provide decision-makers with useful tools to help negotiate challenging decision-making environments. However, carefully considering their advantages and limitations is necessary for their effective application. Each technique has its unique strengths and drawbacks, as outlined in Table 1.
Two MCDM methods, namely the Additive Ratio Assessment System (ARAS) and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), were used to rank the software, and the results were compared. The Fuzzy-ARAS and Fuzzy-TOPSIS approaches were selected to evaluate the most suitable power system software based on their technical capabilities, which aligned with this study’s specific objectives. Specifically, the utilization of fuzzy set theory played a crucial role in enabling these strategies to effectively manage the uncertainty and imprecision that naturally arise from expert criteria weights and software ratings [24,25]. Furthermore, Fuzzy-ARAS and Fuzzy-TOPSIS were selected because they adapted to quantitative and qualitative criteria and demonstrated the capacity to rank alternatives, simple computational processes, and successful prior implementations in software evaluation challenges. Additionally, using two complementary methodologies enabled a comparison of outcomes and yielded a further understanding of software preferences.
More recently, the Fuzzy-TOPSIS method has been successfully used in the selection of suppliers for speech recognition products in IT projects by combining techniques [26], the evaluation and selection of open-source EMR software packages [27], analysis of agricultural production [28,29], and the selection of wind turbine sites [30]. Also, various researchers have adopted TOPSIS for conveyor equipment evaluation and selection [24,31], the evaluation of mobile banking services [32], the selection of vendors for wind farms [33], and sustainable recycling partner selection [34]. The successful prior implementations of Fuzzy-TOPSIS and Fuzzy-ARAS in various software evaluation challenges, as evidenced by their use in supplier selection, analysis of agricultural production, wind turbine site selection, and other domains, underscores their suitability for the task at hand.
To leverage the strengths of each method, this study developed a combined ranking coefficient that harmonizes the differences in rankings from Fuzzy-ARAS and Fuzzy-TOPSIS [24,25]. This coefficient was derived by combining the normalized outcomes from both methods, ensuring that the final ranking captures the balanced performance highlighted by Fuzzy-ARAS and the ideal-solution-oriented assessment of Fuzzy-TOPSIS. By merging these two techniques, the combined coefficient provides a more dependable and thorough ranking of the alternatives. This approach to combining rankings reduces the potential inconsistencies that can occur when relying on a single method and enhances the overall reliability of the study. The outcome is a more robust decision-making framework that considers both the overall performance and the unique advantages of each software alternative.

3.1. Fuzzy Additive Ratio Assessment (Fuzzy-ARAS)

The Fuzzy-ARAS method is a multi-criteria decision method used for ranking different alternatives. The decision process is based on different experts’ opinions, and the aggregation of these opinions is used to determine the respective criterion weights and performance ratings for optimal decision making. Several studies on Fuzzy-ARAS are available [31,35,36,37,38,39,40]. To perform a multi-criteria analysis using the Fuzzy-ARAS method, the following steps were followed:
Step 1: Decide the linguistic variables for the criteria weights and performance ratings.
Step 2: Convert the experts’ opinions to interval-valued triangular fuzzy numbers.
$l' = \min_k (l_k)$
$l = \left( \prod_{k=1}^{K} l_k \right)^{1/K}$
$m = \left( \prod_{k=1}^{K} m_k \right)^{1/K}$
$u = \left( \prod_{k=1}^{K} u_k \right)^{1/K}$
$u' = \max_k (u_k)$
where $\tilde{x} = [l', l, m, u, u']$ denotes the corresponding interval-valued triangular fuzzy number, $\tilde{x}_k = (l_k, m_k, u_k)$ denotes the triangular fuzzy number obtained from the opinion of the $k$-th participant (decision-maker), $k = 1, \ldots, K$, and $K$ is the number of participants.
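To make Step 2 concrete, the short Python sketch below aggregates the triangular fuzzy numbers supplied by K experts into a single interval-valued triangular fuzzy number, assuming the min/geometric-mean/max rule reconstructed above; the function name, data layout, and example ratings are illustrative rather than taken from this study.

```python
import math

def aggregate_ivtfn(expert_tfns):
    """Aggregate K expert triangular fuzzy numbers (l_k, m_k, u_k) into an
    interval-valued TFN [l', l, m, u, u'] (Step 2, as reconstructed above)."""
    K = len(expert_tfns)
    ls = [t[0] for t in expert_tfns]
    ms = [t[1] for t in expert_tfns]
    us = [t[2] for t in expert_tfns]
    geo_mean = lambda xs: math.prod(xs) ** (1.0 / K)
    return (min(ls), geo_mean(ls), geo_mean(ms), geo_mean(us), max(us))

# Three hypothetical expert ratings expressed as triangular fuzzy numbers
print(aggregate_ivtfn([(6, 7, 8), (7, 8, 9), (5, 6, 7)]))
```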
Step 3: Form the decision-making matrix.
$D = [x_{ij}]_{m \times n}$
where $D$ is the decision matrix, $x_{ij}$ is the performance rating of the $i$-th alternative with respect to the $j$-th criterion, and $i = 1, 2, \ldots, m$, where $m$ is the number of alternatives.
$W = [w_j]_{1 \times n}$
where $W$ is the weight vector, $w_j$ is the weight of the $j$-th criterion, and $j = 1, 2, \ldots, n$, where $n$ is the number of criteria.
$D = [x_{ij}^{k}]_{m \times n \times K}$
$W = [w_{j}^{k}]_{n \times K}$
where $x_{ij}^{k}$ is the performance rating of the $i$-th alternative with respect to the $j$-th criterion given by the $k$-th decision-maker, $k = 1, 2, \ldots, K$, where $K$ is the number of decision-makers and/or experts in the Multi-Criteria Group Decision Making (MCGDM) process.
Step 4: Determine the optimal performance rating for each criterion.
$x_{0j} = \begin{cases} \max_i x_{ij}, & j \in \Omega_{\max} \\ \min_i x_{ij}, & j \in \Omega_{\min} \end{cases}$
where $x_{0j}$ is the optimal performance rating for the $j$-th criterion, $\Omega_{\max}$ is the set of benefit criteria (the higher the value, the better), and $\Omega_{\min}$ is the set of non-beneficial criteria (the lower the value, the better).
$\tilde{x}_{0j} = \left( l'_{0j}, l_{0j}, m_{0j}, u_{0j}, u'_{0j} \right)$
where
$l'_{0j} = \begin{cases} \max_i l'_{ij}, & j \in \Omega_{\max} \\ \min_i l'_{ij}, & j \in \Omega_{\min} \end{cases}$
$l_{0j} = \begin{cases} \max_i l_{ij}, & j \in \Omega_{\max} \\ \min_i l_{ij}, & j \in \Omega_{\min} \end{cases}$
$m_{0j} = \begin{cases} \max_i m_{ij}, & j \in \Omega_{\max} \\ \min_i m_{ij}, & j \in \Omega_{\min} \end{cases}$
$u_{0j} = \begin{cases} \max_i u_{ij}, & j \in \Omega_{\max} \\ \min_i u_{ij}, & j \in \Omega_{\min} \end{cases}$
$u'_{0j} = \begin{cases} \max_i u'_{ij}, & j \in \Omega_{\max} \\ \min_i u'_{ij}, & j \in \Omega_{\min} \end{cases}$
If a criterion is beneficial, the maximum is chosen; if it is non-beneficial, the minimum is chosen.
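The selection of the optimal performance rating in Step 4 amounts to a component-wise maximum for benefit criteria and a component-wise minimum for cost criteria. A minimal sketch, assuming the interval-valued ratings are stored as 5-tuples, is given below; the helper name and example values are hypothetical.

```python
def optimal_ivtfn(column, beneficial=True):
    """Step 4: component-wise optimum of a criterion column of interval-valued
    TFNs [l', l, m, u, u'] (max for benefit criteria, min for cost criteria)."""
    pick = max if beneficial else min
    return tuple(pick(rating[i] for rating in column) for i in range(5))

# Hypothetical column of three alternatives rated on one benefit criterion
column = [(5, 5.9, 7.0, 8.0, 9), (6, 6.9, 8.0, 9.0, 10), (4, 4.8, 6.0, 7.1, 8)]
print(optimal_ivtfn(column, beneficial=True))   # (6, 6.9, 8.0, 9.0, 10)
```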
Step 5: Normalize the decision-making matrix.
$\tilde{x}_{ij} = \dfrac{x_{ij}}{\sum_{i=0}^{m} x_{ij}}$
The criteria whose preferable values are minima (cost criteria) are normalized through a two-stage process. First, the reciprocal is taken:
$x_{ij} = \dfrac{1}{x^{*}_{ij}}$
$\tilde{r}_{ij} = \begin{cases} \left[ \dfrac{a'_{ij}}{c_j^{+}},\ \dfrac{a_{ij}}{c_j^{+}},\ \dfrac{b_{ij}}{c_j^{+}},\ \dfrac{c_{ij}}{c_j^{+}},\ \dfrac{c'_{ij}}{c_j^{+}} \right], & j \in \Omega_{\max} \\ \left[ \dfrac{1/a'_{ij}}{a_j^{-}},\ \dfrac{1/a_{ij}}{a_j^{-}},\ \dfrac{1/b_{ij}}{a_j^{-}},\ \dfrac{1/c_{ij}}{a_j^{-}},\ \dfrac{1/c'_{ij}}{a_j^{-}} \right], & j \in \Omega_{\min} \end{cases}$
where $\tilde{r}_{ij}$ is the normalized interval-valued fuzzy performance rating of the $i$-th alternative in relation to the $j$-th criterion, $i = 0, 1, \ldots, m$,
$c_j^{+} = \sum_{i=0}^{m} c_{ij}$
and
$a_j^{-} = \sum_{i=0}^{m} \dfrac{1}{a_{ij}}$
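A sketch of the sum-based normalization in Step 5 is shown below for one criterion column. It assumes the ratings are stored as 5-tuples and, for cost criteria, that the component order is flipped when the reciprocal is taken so the resulting fuzzy number remains ordered; the function name and example values are illustrative.

```python
def normalize_column(column, beneficial=True):
    """Step 5: sum-based normalization of one criterion column of
    interval-valued TFN 5-tuples."""
    if beneficial:
        c_plus = sum(x[-1] for x in column)        # sum of the largest components
        return [tuple(v / c_plus for v in x) for x in column]
    # Cost criteria: two-stage process -- reciprocal first, then sum-based scaling
    a_minus = sum(1.0 / x[0] for x in column)      # sum of reciprocals of smallest components
    return [tuple((1.0 / v) / a_minus for v in reversed(x)) for x in column]

column = [(5, 5.9, 7.0, 8.0, 9), (6, 6.9, 8.0, 9.0, 10)]
print(normalize_column(column, beneficial=True))
```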
Step 6: Weight the interval-valued normalized fuzzy decision matrix.
$\tilde{v}_{ij} = \tilde{w}_j \cdot \tilde{r}_{ij}$
where $\tilde{v}_{ij}$ is the weighted normalized interval-valued fuzzy performance rating of the $i$-th alternative in relation to the $j$-th criterion, $i = 0, 1, \ldots, m$.
Step 7: Determine the overall interval-valued fuzzy performance rating.
$\tilde{S}_i = \sum_{j=1}^{n} \tilde{v}_{ij}$
where $\tilde{S}_i$ is the overall interval-valued fuzzy performance rating of the $i$-th alternative, $i = 0, 1, \ldots, m$.
Step 8: Carry out defuzzification of $\tilde{S}_i$.
$\mathrm{gm}(\tilde{B}) = \dfrac{l' + l + m + u + u'}{5}$
where $\tilde{B}$ is an interval-valued triangular fuzzy number of the form $[l', l, m, u, u']$.
Step 9: Determine the degree of utility $Q_i$ for each of the alternatives.
$\tilde{Q}_i = \dfrac{\tilde{S}_i}{\tilde{S}_0}$
Step 10: Rank the alternatives based on the defuzzified $Q_i$ values.
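Steps 6 to 10 can be tied together in a few lines: the normalized ratings are weighted component-wise, summed over the criteria, defuzzified as the mean of the five components, and expressed relative to the optimal alternative. The sketch below assumes row 0 of the matrix holds the optimal alternative $A_0$; the names and data layout are illustrative, not part of the original formulation.

```python
def defuzzify(ivtfn):
    """Step 8: crisp value of an interval-valued TFN [l', l, m, u, u']."""
    return sum(ivtfn) / 5.0

def aras_scores(norm_matrix, weights):
    """Steps 6-10: weight, aggregate, defuzzify, and compute utility degrees.
    norm_matrix[i][j] and weights[j] are interval-valued TFN 5-tuples;
    row 0 of norm_matrix is the optimal alternative A_0."""
    S = []
    for row in norm_matrix:
        total = [0.0] * 5
        for rating, weight in zip(row, weights):
            for k in range(5):                       # component-wise product (Step 6)
                total[k] += rating[k] * weight[k]    # summed over the criteria (Step 7)
        S.append(defuzzify(total))
    Q = [s / S[0] for s in S[1:]]                    # Step 9: utility relative to S_0
    ranking = sorted(range(len(Q)), key=lambda i: Q[i], reverse=True)   # Step 10
    return Q, ranking
```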
Fuzzy Technique for Order of Preference by Similarity to Ideal Solution (Fuzzy-TOPSIS) is a multi-criteria decision-making (MCDM) method used to evaluate and rank alternatives based on multiple criteria [41]. It is an extension of the well-known Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) method, which uses fuzzy logic to deal with uncertainty and imprecision in the evaluation data. In the Fuzzy-TOPSIS method, the criteria are first converted into fuzzy numbers to represent their relative importance. The performance of each alternative is then rated on each criterion using fuzzy numbers to represent the degree to which the alternative satisfies each criterion. The similarity between each alternative and the ideal solution (the best alternative) and the worst solution (the worst alternative) is then calculated using fuzzy set theory. The final ranking of the alternatives is based on the distance between each alternative and the ideal solution, with the closest alternative being ranked as the best. The Fuzzy-TOPSIS method comprehensively evaluates the alternatives based on multiple criteria and considers the uncertainty and imprecision in the evaluation data [42].
The Fuzzy-TOPSIS method has been applied in various fields, such as engineering, environmental management, and the social sciences, and has been found to be an effective tool for solving complex decision-making problems. To perform a multi-criteria analysis using the Fuzzy-TOPSIS method, the following steps are followed [29]:
Step 1: Define the problem and decision criteria.
Define the problem and the criteria that are relevant to the decision-making process.
Step 2: Develop the evaluation matrix.
Create a matrix containing the fuzzy numbers that represent the performance of each alternative on each criterion, $\tilde{x}_{ij} = (a_{ij}, b_{ij}, c_{ij})$, where
$a_{ij} = \min_k \left( a_{ij}^{k} \right)$
$b_{ij} = \dfrac{1}{K} \sum_{k=1}^{K} b_{ij}^{k}$
$c_{ij} = \max_k \left( c_{ij}^{k} \right)$
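A minimal sketch of this aggregation step is given below, assuming each expert's rating is stored as a triangular fuzzy number (a, b, c); the function name and example ratings are illustrative.

```python
def aggregate_tfn(expert_tfns):
    """Step 2: combine K expert ratings (a, b, c) into one triangular fuzzy
    number using the min / arithmetic mean / max rule above."""
    a = min(t[0] for t in expert_tfns)
    b = sum(t[1] for t in expert_tfns) / len(expert_tfns)
    c = max(t[2] for t in expert_tfns)
    return (a, b, c)

print(aggregate_tfn([(6, 7, 8), (7, 8, 9), (5, 6, 7)]))   # (5, 7.0, 9)
```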
Step 3: Normalize the evaluation matrix.
Normalize the evaluation matrix by dividing each fuzzy rating by the appropriate extreme value for its criterion using the following:
$\tilde{r}_{ij} = \left( \dfrac{a_{ij}}{c_j^{+}},\ \dfrac{b_{ij}}{c_j^{+}},\ \dfrac{c_{ij}}{c_j^{+}} \right), \quad c_j^{+} = \max_i \{ c_{ij} \} \quad \text{(beneficial criteria)}$
$\tilde{r}_{ij} = \left( \dfrac{a_j^{-}}{c_{ij}},\ \dfrac{a_j^{-}}{b_{ij}},\ \dfrac{a_j^{-}}{a_{ij}} \right), \quad a_j^{-} = \min_i \{ a_{ij} \} \quad \text{(non-beneficial criteria)}$
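The normalization of Step 3 can be written compactly per criterion column, as in the hedged sketch below (TFNs stored as (a, b, c) tuples; the function name is illustrative).

```python
def normalize_tfn_column(column, beneficial=True):
    """Step 3: linear normalization of one criterion column of TFNs (a, b, c)."""
    if beneficial:
        c_plus = max(x[2] for x in column)
        return [(a / c_plus, b / c_plus, c / c_plus) for a, b, c in column]
    a_minus = min(x[0] for x in column)
    return [(a_minus / c, a_minus / b, a_minus / a) for a, b, c in column]

print(normalize_tfn_column([(5, 6, 7), (7, 8, 9)], beneficial=True))
```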
Step 4: Assign criteria weights.
Assign weights to each criterion to represent the relative importance of each criterion. The weights are expressed as fuzzy numbers $\tilde{w}_j$, and the weighted normalized ratings are obtained using the following:
$\tilde{v}_{ij} = \tilde{r}_{ij} \times \tilde{w}_j$
Step 5: Weighted normalization of the evaluation matrix.
Multiply the normalized evaluation matrix by the criteria weights to get a weighted normalization matrix using the following:
$\tilde{A}_1 \times \tilde{A}_2 = (a_1, b_1, c_1) \times (a_2, b_2, c_2) = (a_1 \times a_2,\ b_1 \times b_2,\ c_1 \times c_2)$
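Steps 4 and 5 reduce to applying the component-wise product above to every cell of the normalized matrix, as in this short sketch (assumed data layout: lists of (a, b, c) tuples; names illustrative).

```python
def weight_matrix(norm_matrix, fuzzy_weights):
    """Steps 4-5: multiply each normalized TFN by its fuzzy criterion weight,
    component by component, following the product rule above."""
    return [[(r[0] * w[0], r[1] * w[1], r[2] * w[2])
             for r, w in zip(row, fuzzy_weights)]
            for row in norm_matrix]
```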
Step 6: Identify ideal and negative-ideal solutions.
Identify the ideal solution as the alternative that has the highest values in the weighted normalization matrix and the negative-ideal solution as the alternative that has the lowest values in the weighted normalization matrix using the following:
$A^{+} = \left\{ \tilde{v}_1^{+}, \tilde{v}_2^{+}, \ldots, \tilde{v}_n^{+} \right\}, \quad \text{where } \tilde{v}_j^{+} = \max_i \{ v_{ij3} \}$
$A^{-} = \left\{ \tilde{v}_1^{-}, \tilde{v}_2^{-}, \ldots, \tilde{v}_n^{-} \right\}, \quad \text{where } \tilde{v}_j^{-} = \min_i \{ v_{ij1} \}$
Step 7: Perform distance calculation.
Calculate the distance between each alternative and the ideal solution and the distance between each alternative and the negative-ideal solution using the following:
$d(\tilde{x}, \tilde{y}) = \sqrt{ \dfrac{1}{3} \left[ (a_1 - a_2)^2 + (b_1 - b_2)^2 + (c_1 - c_2)^2 \right] }$
$d_i^{+} = \sum_{j=1}^{n} d\left( \tilde{v}_{ij},\ \tilde{v}_j^{+} \right)$
$d_i^{-} = \sum_{j=1}^{n} d\left( \tilde{v}_{ij},\ \tilde{v}_j^{-} \right)$
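Steps 6 and 7 can be sketched as follows, with the fuzzy ideal and negative-ideal solutions represented as degenerate (crisp) triangular numbers built from the third and first components, respectively, as in the definitions above; the helper names are illustrative.

```python
import math

def vertex_distance(x, y):
    """Step 7: vertex distance between two triangular fuzzy numbers."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / 3.0)

def separation_measures(weighted):
    """Step 6: ideal (A+) and negative-ideal (A-) solutions per criterion;
    Step 7: distance of every alternative to each of them."""
    n = len(weighted[0])
    a_pos = [(max(row[j][2] for row in weighted),) * 3 for j in range(n)]
    a_neg = [(min(row[j][0] for row in weighted),) * 3 for j in range(n)]
    d_pos = [sum(vertex_distance(row[j], a_pos[j]) for j in range(n)) for row in weighted]
    d_neg = [sum(vertex_distance(row[j], a_neg[j]) for j in range(n)) for row in weighted]
    return d_pos, d_neg
```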
Step 8: Perform relative closeness calculation.
Calculate the relative closeness of each alternative to the ideal solution using the following:
$cc_i = \dfrac{d_i^{-}}{d_i^{-} + d_i^{+}}$
Step 9: Determine the final ranking.
Rank the alternatives in decreasing order of their relative closeness to the ideal solution.
Step 10: Interpret the results.
Finally, interpret the results of the analysis and make a decision based on the rankings.
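In code form, Steps 8 to 10 reduce to computing the closeness coefficient from the separation measures and sorting, as in the brief sketch below; the function name is illustrative.

```python
def closeness_ranking(d_pos, d_neg):
    """Steps 8-9: closeness coefficient cc_i and ranking (largest cc_i first)."""
    cc = [dn / (dn + dp) for dp, dn in zip(d_pos, d_neg)]
    ranking = sorted(range(len(cc)), key=lambda i: cc[i], reverse=True)
    return cc, ranking
```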

3.2. Empirical Illustrations

To illustrate the efficacy of the proposed Fuzzy-ARAS framework in selecting the most preferred software for teaching power systems analysis at the undergraduate level, eight popular software packages used in teaching power systems simulations were selected and compared based on 12 criteria. Figure 2 shows the hierarchical structure of the MCDM problem to be solved.

3.3. Software Selection Criteria

The criteria selected for this study were based on insights from the literature. Ref. [5] placed importance on criteria related to evaluation capacity (assessment competence), ease of deployment (implementation ease), memory requirements, storage quality, graphical interface, computational or processing time, and usability (ease of use). Ref. [43] emphasized scalability as an important criterion when selecting software. Ref. [27] also validated the importance of functionality, support and training, and vendor viability, while Ref. [44] stressed the need to include the cost criterion when evaluating software selection.
The primary criteria for assessing educational software solutions are educational quality, which evaluates how well the software supports student learning; usability, which assesses how user-friendly the interfaces and workflows are for learners; and technical capability, encompassing aspects like memory requirements, storage capacity, and scalability. Others include performance, particularly the application’s processing speed and responsiveness; capability, or the range of features and functions offered; vendor quality, which considers the quality of training and support offered by the vendor as well as the viability of the vendor; and cost regarding the software’s affordability and educational value proposition. Technical capability, performance, and capability cover complementary aspects of evaluating educational software. Technical capability refers to the underlying technical infrastructure that enables software functionality. Performance focuses on processing speed and how quickly the software can deliver results, while capability relates to the breadth and depth of features and functions provided by the software. Based on this, technical capability provides the technical foundations, performance measures speed, and capability assesses the functional scope. These seven categories, taken together, offer a thorough framework for evaluating the merits and demerits of any particular educational software platform in terms of its influence on education, ease of use, technical underpinnings, speed, functionality, vendor standing, and affordability.
Assessment Competence (C1): The kind of scrutiny/evaluation the software can handle while impacting a wide range of knowledge in the students.
Usability (C2): How fast and easily students will be able to use the software in the learning procedure.
Storage Quality (C3): How much capacity the solution has to store information without crashing or slowing down the learning process.
Graphical Interface (C4): How simple and presentable the software’s interfaces are in handling charts and complex simulation results in a student-friendly manner.
Process time (C5): How fast the software can process power systems solutions for large data.
Memory Prerequisites (C6): This considers the memory size required to install the software solution and process study-related tasks for the students.
Ease of Deployment (C7): How the results of simulation can be deployed and implemented easily.
Functionality (C8): How many functions are available to complement the power system knowledge required by the students.
Scalability (C9): How scalable the software is in multisystem and advanced environments where multiple instances of the system are required.
Support and Training (C10): How much support is available from the software vendors and trainer program availability for training.
Vendor Viability (C11): How sustainable the vendor of the software system is.
Cost (C12): How cost-effective the software is to be affordable for educational purposes.

3.4. Expert Opinion Collection

A questionnaire was designed to collect the experts’ opinions on the weights of the selected criteria and the ranking of the software based on these criteria. Five experts, all with more than five years of experience, were consulted: four hold a doctoral degree in electrical engineering, while one holds a master’s degree in electrical engineering. The questionnaire was divided into two sections. Section one focuses on obtaining experts’ opinions on software solutions for power systems analysis (on their adoption for teaching at the undergraduate level) against the criteria, while the second section consists of questions that rate the importance of the criteria in the selection of power systems analysis software for teaching and learning. The scale used for obtaining the experts’ opinions was a 10-point linguistic scale, as shown in Figure 3.
This study followed ethical guidelines; participants were made aware that their involvement was voluntary and that their answers would only be used for research purposes, with all data kept anonymous. By returning the completed questionnaires, participants implied their consent. Since the study did not include any physical or psychological interventions, formal ethical approval from the university or the Ministry of Education was not required. The study adhered to the principles outlined in the Declaration of Helsinki and complied with relevant data protection regulations to ensure the confidentiality and proper management of all data.

4. Results and Discussion

A survey of academics across Nigerian institutions shows that the most common software packages instructors use in teaching power systems analysis include ETAP (M1), NEPLAN (M2), POWER World (M3), MATLAB (M4), DIgSILENT (M5), PSAT (M6), PSCAD/EMTDC (M7), and MATPOWER (M8). To select the criteria, software and power systems engineers in industry and academia were consulted; they suggested that the most relevant criteria for the present study are assessment competence (C1), usability (C2), storage quality (C3), graphical interface (C4), process time (C5), memory prerequisites (C6), ease of deployment (C7), functionality (C8), scalability (C9), support and training (C10), vendor viability (C11), and cost (C12). These criteria are described in Section 3.3. According to the experts’ responses, validating the software used to teach power systems analysis at the undergraduate level is essential. The experts also agreed that various criteria should be considered when selecting software solutions for power systems analysis. The opinions of the five experts on the importance of the 12 criteria are presented in Table 2 and Table 3, while their opinions on the ranking of the software based on the selected criteria are presented in Appendix A.
Following steps 1–7 of the ARAS method, the overall interval-valued fuzzy performance ratings were obtained and are given in Table 4. Defuzzification was then performed on these values to obtain the following overall performance scores:
$S_0 = 16.7099$
$S_1 = 14.9927$
$S_2 = 14.7223$
$S_3 = 16.0213$
$S_4 = 16.4243$
$S_5 = 15.8728$
$S_6 = 15.4216$
$S_7 = 14.8636$
$S_8 = 7.24401$
By dividing the value of $S_i$ for each alternative by $S_0$, the degree of utility $Q_i$ for each alternative is obtained. The $Q_i$ values are used to rank the alternatives: the alternative with the lowest value is the least preferred, while the one with the highest value is the most preferred.
$Q_1 = 0.8972$
$Q_2 = 0.8810$
$Q_3 = 0.9587$
$Q_4 = 0.9829$
$Q_5 = 0.9499$
$Q_6 = 0.9229$
$Q_7 = 0.8895$
$Q_8 = 0.4335$
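The utility degrees above follow directly from the defuzzified scores; the arithmetic can be reproduced with a few lines of Python using the $S_i$ values reported above and the alternative labels M1 to M8 from the survey (the dictionary layout is illustrative).

```python
S0 = 16.7099
S = {"M1 ETAP": 14.9927, "M2 NEPLAN": 14.7223, "M3 POWER World": 16.0213,
     "M4 MATLAB": 16.4243, "M5 DIgSILENT": 15.8728, "M6 PSAT": 15.4216,
     "M7 PSCAD/EMTDC": 14.8636, "M8 MATPOWER": 7.24401}

Q = {name: s / S0 for name, s in S.items()}
for name, q in sorted(Q.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: Q = {q:.4f}")   # MATLAB ranks first, MATPOWER last
```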
The ARAS analysis shows that MATLAB is the software the experts consider most suitable for teaching power systems analysis, while MATPOWER is the least preferred. The TOPSIS method was also applied to the experts’ responses and produced a result different from that of the ARAS method: the most preferred software is MATPOWER, while the least preferred software is NEPLAN (Figure 4). The use of the Fuzzy-ARAS and Fuzzy-TOPSIS methods led to varying rankings for the software alternatives assessed, which might prompt questions about the consistency of the methodology. However, this variation is not a drawback; it highlights the distinct advantages and focus of each approach. Fuzzy-ARAS offers a thorough evaluation by computing additive ratios, allowing for a well-rounded comparison of each alternative across all criteria. On the other hand, Fuzzy-TOPSIS assesses alternatives based on their closeness to an ideal solution, giving greater weight to how close each alternative is to optimal performance for each criterion. The differences in rankings between the two methods emphasize the need to consider various viewpoints when making decisions, especially in complex scenarios like selecting software for educational use. While Fuzzy-ARAS offers a comprehensive perspective by consolidating all criteria into a single metric, Fuzzy-TOPSIS ensures that options excelling in specific key areas are not missed.
To bridge the gap between these two methods and enhance the study’s reliability, the study introduced a combined ranking coefficient. This coefficient aligns the results from both methods by combining their normalized scores. This approach guarantees that the final ranking reflects both the balanced performance captured by Fuzzy-ARAS and the ideal-solution-focused evaluation provided by Fuzzy-TOPSIS. The integration of these two methods results in a more thorough and nuanced assessment, minimizing the risk of any single method’s limitations or biases disproportionately influencing the final outcomes. The combined ranking coefficient highlights its importance by integrating the strengths of both methods, leading to a more dependable evaluation of the software options. This strategy guarantees that alternatives that consistently perform well across all criteria, along with those that excel in particular aspects, are both adequately represented in the final rankings.
Using the detailed results obtained from the ARAS and TOPSIS methods (Table 5), $cc_i$ and $Q_i$ are combined through the coefficient defined below, which is proposed for reconciling the ranks of the two methods used in this study. This new coefficient shows that MATLAB is still returned as the most preferred alternative for teaching power systems courses, while NEPLAN is the least preferred alternative (Figure 5).
$R_i^{f} = \dfrac{1}{\dfrac{1}{Q_i^{ARAS}} + \dfrac{1}{cc_i^{TOPSIS}}}$
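A small helper makes the combination rule explicit. The $Q_i$ value below is MATLAB's utility degree from the ARAS results, while the closeness coefficient is a hypothetical placeholder, since the individual $cc_i$ values are reported in Table 5 rather than in the text.

```python
def combined_rank_coefficient(q_aras, cc_topsis):
    """Combined rank coefficient R_i^f = 1 / (1/Q_i^ARAS + 1/cc_i^TOPSIS)."""
    return 1.0 / (1.0 / q_aras + 1.0 / cc_topsis)

# Q_4 (MATLAB) from the ARAS results above; the cc value is a hypothetical placeholder
print(combined_rank_coefficient(0.9829, 0.60))
```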
This study’s results are similar to what other research has reported about the ranking and selection of software using MCDM methods. However, there are also some noticeable differences because of the specific application and methods used. The results of this study agree with Refs. [6,7,10] on the importance of technical capabilities, usability, and vendor support as key criteria for software selection. Although these studies focused on different areas, such as ERP and computer programming languages, they all reported that technical features, user-friendliness, and vendor factors are crucial in choosing software. Additionally, this study supports the use of structured MCDM approaches for decision making in software selection; this aligns with the findings of Refs. [8,9,11]. Although those studies focused on different application areas, they show how fuzzy MCDM methods are practical for considering various factors and reducing uncertainty in the selection process.
Nonetheless, there are distinctions between this research and the examined literature. The MCDM methods employed in this paper (ARAS and TOPSIS) are not the same as those used in some of these studies. For example, Fuzzy VIKOR and COPRAS were utilized by Ref. [5], while Ref. [9] used AHP. This disparity in method might lead to different software being ranked highest in different studies. In addition, the context of application for which the present study was conducted (educational systems in electrical engineering) is different from what has been covered by other works under review, such as sustainable energy [5], ERPs [6,7], and blockchain technology [8]. This difference in domains may lead to variations in the specific software alternatives considered and the prioritization of certain criteria.
Despite these differences, the current study makes novel contributions to the existing body of knowledge on software selection using MCDM methods. To the best of the authors’ knowledge, this study is the first to apply MCDM methods, specifically ARAS and TOPSIS, to select software for power systems education. This extends the application of these MCDM techniques to a new domain and provides valuable insights for educators and decision-makers in this field. Furthermore, the introduction of a combined rank coefficient to reconcile the rankings obtained from ARAS and TOPSIS is a methodological innovation not observed in the reviewed literature. This approach offers a novel way to synthesize results from multiple MCDM methods and enhance the robustness of the software selection process.

5. Conclusions

This study used the Fuzzy-ARAS method to compare and evaluate various power systems analysis software for undergraduate instruction. A framework combining the Fuzzy-ARAS (Fuzzy Additive Ratio Assessment) approach and expert judgments was used to choose the most suitable software. Eight software products were evaluated using twelve criteria, including assessment competency, usability, storage quality, graphical interface, process time, memory requirements, ease of deployment, functionality, scalability, support and training, vendor viability, and cost.
This study makes notable contributions to the literature on power systems education and MCDM applications. To the best of our knowledge, it represents the first systematic application of fuzzy MCDM methods to evaluate and rank software alternatives for teaching power systems courses. Our rigorous evaluation approach, based on expert-defined criteria, provides a new methodological template for software selection in this domain. Secondly, by employing two different fuzzy MCDM methods (ARAS and TOPSIS) and proposing a novel ‘combined rank coefficient,’ this study demonstrates a robust and reliable approach to synthesizing rankings from multiple MCDM techniques. This methodological innovation enhances the trustworthiness of our software rankings.
Significantly, this work provides a practical, evidence-based framework for power systems educators and program leaders. By identifying the top-ranking software across a range of key criteria, these findings empower educational institutions to select tools that optimally support student learning, align with instructor priorities, and fit within budget constraints. This contribution has the potential to improve the quality and outcomes of power systems education substantially. This study builds upon existing MCDM methods, applying them in a novel context with a new set of expert-defined criteria. It also introduces a new rank combination approach, providing robust, practically useful power systems education software rankings. These contributions significantly advance the understanding of best practices in power systems pedagogy and underscore the value of MCDM methods for educational technology selection.
The results of the Fuzzy-ARAS method were compared with the TOPSIS method. The following are the study’s main findings:
  • Given its effectiveness and adaptability for educational objectives, MATLAB emerged as the most suitable software for teaching power systems analysis at the undergraduate level in Nigeria. NEPLAN was rated as the least preferred software, implying that it may struggle to meet the standards and needs of efficient teaching and learning in the Nigerian academic context.
  • In the area of power systems analysis education, the Fuzzy-ARAS method established its effectiveness in handling multi-criteria decision-making problems and provided a systematic methodology for software selection.
  • The study demonstrated the potential inconsistencies in rankings generated using various evaluation techniques, as seen by the discrepancy between the Fuzzy-ARAS and TOPSIS ranks. This emphasizes how important it is to consider various evaluation techniques to thoroughly understand software performance.
Furthermore, this study emphasizes the advantages of utilizing various methodologies to assess complex decision-making situations. By applying both Fuzzy-ARAS and Fuzzy-TOPSIS methods, the study was able to evaluate the software options from two complementary angles: one that looks at overall performance across all criteria and another that focuses on proximity to an ideal solution. The addition of a combined ranking coefficient enhances the reliability of the final results by integrating these two approaches. The combined ranking coefficient, which merges the outcomes from Fuzzy-ARAS and Fuzzy-TOPSIS, provides an additional layer of strength to the evaluation process. This method ensures that the final rankings are not disproportionately affected by the limitations or biases of any single approach. Consequently, educators and decision-makers can feel more assured in choosing the most appropriate software for teaching power systems analysis, as the ranking reflects both balanced performance and alignment with the ideal solution.
The findings of this study can help educators, universities, and curriculum developers choose appropriate software for instructing power systems analysis. Institutions can improve the standard of undergraduate instruction in power systems analysis, resulting in enhanced learning outcomes and better preparing students for real-world issues in the area, by considering the preferences and opinions of experts and implementing evaluation criteria.

Author Contributions

Conceptualization, O.B., M.E., J.A. and K.A.A.; methodology, O.B. and M.E.; software, J.A.; validation, O.B.; formal analysis, K.A.A. and M.E.; investigation, O.B. and J.A.; resources, O.B., M.E., J.A. and K.A.A.; data curation, O.B., M.E., J.A. and K.A.A.; writing—original draft preparation, O.B., M.E., J.A., B.A., O.O. and K.A.A.; writing—review and editing, O.B., M.E., J.A., B.A., O.O. and K.A.A.; visualization, O.B. and J.A.; project administration, O.B. and J.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted in accordance with the principles of the Declaration of Helsinki, and data were handled according to the General Data Protection Regulation.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors used AI to articulate the ideas and for grammar checks. We acknowledge the affiliations of the authors for providing enabling environments for research activities.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Expert 1
Software | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
ETAP | Very High | Very Very High | Very Very High | Very Very High | Very Very High | Above Average | Very Very High | Very Very High | Very Very High | Above Average | Below Average | Very Very High
NEPLAN | Above Average | Above Average | Above Average | Above Average | Above Average | Average | Average | Average | Average | Above Average | Average | Average
POWER World | Very High | Very High | High | Above Average | Very Very High | Above Average | Very High | High | Above Average | Average | Below Average | Very High
MATLAB | Very Very High | Very Very High | Very Very High | Above Average | Very High | Above Average | Very Very High | Very Very High | Very Very High | Very Very High | Very Very High | Low
DIgSILENT | High | High | High | Above Average | High | Above Average | Very High | Very High | Average | Below Average | Below Average | High
PSAT | Above Average | Above Average | Above Average | Above Average | Above Average | Average | Average | Average | Average | Average | Average | Average
PSCAD/EMTDC | Average | Average | Average | Average | Average | Average | Average | Average | Average | Average | Average | Average
MATPOWER | Low | Low | Low | Low | Low | Below Average | Above Average | Average | Average | Above Average | Above Average | Average
Expert 2
Software | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
ETAP | Very Very High | Very Very High | Very High | Extremely High | Very Very High | Extremely High | Extremely High | Extremely High | Extremely High | High | Very High | High
NEPLAN | Very Very High | Very High | Very Very High | Very Very High | Very Very High | Extremely High | Extremely High | Extremely High | Extremely High | High | Very High | Above Average
POWER World | Extremely High | Very Very High | Very Very High | Extremely High | Very Very High | Very Very High | Extremely High | Extremely High | Extremely High | Very Very High | Very Very High | High
MATLAB | Extremely High | Very High | Extremely High | Extremely High | Extremely High | Extremely High | Very High | Extremely High | Extremely High | Extremely High | Extremely High | Very Very High
DIgSILENT | Very Very High | Very Very High | Extremely High | Extremely High | Very Very High | Very Very High | Very Very High | Extremely High | Very Very High | High | Very High | Above Average
PSAT | Extremely High | Very Very High | Very Very High | Extremely High | Extremely High | Extremely High | Extremely High | Extremely High | Extremely High | Extremely High | Very Very High | Very High
PSCAD/EMTDC | Very Very High | Very Very High | Very Very High | Very Very High | Very Very High | Very Very High | Extremely High | Very Very High | Very Very High | High | High | Average
MATPOWER | Extremely High | Extremely High | Extremely High | Very Very High | Extremely High | Extremely High | Very Very High | Very Very High | Very Very High | Very Very High | Very Very High | High
Expert 3
Software | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
ETAP | High | Above Average | High | Above Average | High | Above Average | Low | Very High | High | Very Very High | Low | Low
NEPLAN | High | Above Average | Average | Above Average | Above Average | Above Average | High | Very Very High | High | Very Very High | High | High
POWER World | High | Above Average | Average | Above Average | Extremely High | Low | High | Very Very High | High | Very Very High | Very High | Very High
MATLAB | Extremely High | Above Average | Very Very High | Extremely High | High | Extremely High | High | Extremely High | Extremely High | Very Very High | Extremely Low | Extremely Low
DIgSILENT | High | Above Average | Average | Above Average | Above Average | Above Average | High | Very Very High | High | Very Very High | Extremely Low | Extremely Low
PSAT | Extremely High | Above Average | Very Very High | Extremely High | High | Extremely High | High | Very Very High | High | Very Very High | Very Low | Very Low
PSCAD/EMTDC | Extremely High | Very High | Very Very High | Extremely High | Extremely High | Below Average | High | Extremely High | High | Very Very High | Very Low | Very Low
MATPOWER | | | | | | | | | | | |
Expert 4
Software | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
ETAP | High | Average | High | Very High | Above Average | High | High | High | Above Average | Above Average | Choose an item. | Average
NEPLAN | Below Average | Low | Above Average | Average | Above Average | Low | Average | Average | Above Average | Low | Below Average | Average
POWER World | Above Average | High | Above Average | High | High | Above Average | High | High | Above Average | Above Average | Average | Average
MATLAB | Very Very High | Extremely High | Very High | Very High | Very High | High | Very High | Very High | Very High | Very High | High | Above Average
DIgSILENT | High | High | Above Average | High | Very High | High | High | High | Above Average | Above Average | Above Average | High
PSAT | High | Very High | Above Average | High | High | Very High | Above Average | Above Average | Above Average | Average | Average | Average
PSCAD/EMTDC | Very High | High | High | Very High | High | High | High | High | High | Above Average | Above Average | High
MATPOWER | Very High | High | High | Above Average | High | Above Average | High | High | Above Average | Average | Average | Above Average
Expert 5
Software | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
ETAP | Extremely High | Very High | High | Very High | High | High | Above Average | High | Very Very High | Very Very High | Very Very High | Average
NEPLAN | High | Above Average | Very High | Above Average | Above Average | Average | Above Average | Above Average | Above Average | Very High | Very Very High | Above Average
POWER World | Very Very High | Extremely High | Average | Very High | Average | Below Average | Below Average | Very Very High | Average | Very Very High | Extremely High | Very Low
MATLAB | Very High | High | Extremely High | Very Very High | Very High | Very Very High | Average | Extremely High | Average | Extremely High | Extremely High | Very High
DIgSILENT | Very High | High | Very Very High | Very High | High | Very Very High | Above Average | High | High | Very High | Very Very High | High
PSAT | Very Very High | Very High | Very High | Very High | High | High | Above Average | High | High | Extremely High | Very High | Above Average
PSCAD/EMTDC | Above Average | Average | High | High | Average | Average | Average | High | Above Average | Very High | Very High | Above Average
MATPOWER | Very High | Very High | Above Average | Very High | Below Average | Very High | Above Average | Very High | Below Average | Extremely High | Very Very High | Very Low

References

  1. Finkelstein, N.D.; Adams, W.K.; Keller, C.J.; Kohl, P.B.; Perkins, K.K.; Podolefsky, N.S.; Reid, S.; LeMaster, R. When learning about the real world is better done virtually: A study of substituting computer simulations for laboratory equipment. Phys. Rev. Spec. Top.-Phys. Educ. Res. 2005, 1, 010103. [Google Scholar] [CrossRef]
  2. De Carne, G.; Lauss, G.; Syed, M.H.; Monti, A.; Benigni, A.; Karrari, S.; Kotsampopoulos, P.; Faruque, M.O. On modeling depths of power electronic circuits for real-time simulation—A comparative analysis for power systems. IEEE Open Access J. Power Energy 2022, 9, 76–87. [Google Scholar] [CrossRef]
  3. Noah, N.; Das, S. Exploring evolution of augmented and virtual reality education space in 2020 through systematic literature review. Comput. Animat. Virtual Worlds 2021, 32, e2020. [Google Scholar] [CrossRef]
  4. Abdulsalam, K.A.; Adebisi, J.; Emezirinwune, M.; Babatunde, O. An overview and multicriteria analysis of communication technologies for smart grid applications. E-Prime-Adv. Electr. Eng. Electron. Energy 2023, 3, 100121. [Google Scholar] [CrossRef]
  5. Ighravwe, D.E.; Babatunde, M.O.; Mosetlhe, T.C.; Aikhuele, D.; Akinyele, D. A MCDM-based framework for the selection of renewable energy system simulation tool for teaching and learning at university level. Environ. Dev. Sustain. 2021, 24, 13035–13056. [Google Scholar] [CrossRef]
  6. Sahay, B.; Gupta, A. Development of software selection criteria for supply chain solutions. Ind. Manag. Data Syst. 2003, 103, 97–110. [Google Scholar] [CrossRef]
  7. Galankashi, M.R.; Ahmadshoar, A.; Helmi, S.A.; Arjmand, M.M. ERP Software Selection Criteria: A Fuzzy Analytic Hierarchy Process (FAHP) Approach. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Istanbul, Turkey, 7–10 March 2022. [Google Scholar]
  8. Nanayakkara, S.; Rodrigo, M.; Perera, S.; Weerasuriya, G.; Hijazi, A.A. A methodology for selection of a Blockchain platform to develop an enterprise system. J. Ind. Inf. Integr. 2021, 23, 100215. [Google Scholar] [CrossRef]
  9. Eldrandaly, K. GIS software selection: A multi criteria decision making approach. Appl. GIS 2007, 3, 1–17. [Google Scholar]
  10. Yıldızbaşı, A.; Rouyendegh, B.D. Multi-criteria decision making approach for evaluation of the performance of computer programming languages in higher education. Comput. Appl. Eng. Educ. 2018, 26, 1992–2001. [Google Scholar] [CrossRef]
  11. Piengang, F.C.N.; Beauregard, Y.; Kenné, J.P. An APS software selection methodology integrating experts and decisions-maker’s opinions on selection criteria: A case study. Cogent Eng. 2019, 6, 1594509. [Google Scholar] [CrossRef]
  12. Okegbile, S.D.; Maharaj, B.T.; Alfa, A.S. Malicious users control and management in cognitive radio networks with priority queues. In Proceedings of the 2020 IEEE 92nd Vehicular Technology Conference (VTC2020-Fall), Victoria, BC, Canada, 18 November–16 December 2020; pp. 1–7. [Google Scholar] [CrossRef]
  13. Okegbile, S.D.; Maharaj, B.T.; Alfa, A.S. A Multi-Class Channel Access Scheme for Cognitive Edge Computing-Based Internet of Things Networks. IEEE Trans. Veh. Technol. 2022, 71, 9912–9924. [Google Scholar] [CrossRef]
  14. Okegbile, S.D.; Cai, J.; Zheng, H.; Chen, J.; Yi, C. Differentially Private Federated Multi-Task Learning Framework for Enhancing Human-to-Virtual Connectivity in Human Digital Twin. IEEE J. Sel. Areas Commun. 2023, 41, 3533–3547. [Google Scholar] [CrossRef]
  15. Rouyendegh, B.D.; Erkan, T.E. Selecting the best supplier using analytic hierarchy process (AHP) method. Afr. J. Bus. Manag. 2012, 6, 1455. [Google Scholar]
  16. Chen, C.-T. Extensions of the TOPSIS for group decision-making under fuzzy environment. Fuzzy Sets Syst. 2000, 114, 1–9. [Google Scholar] [CrossRef]
  17. John, A.A.; Damilola, E.B.; Olubayo, M.B. A multicriteria framework for selecting information communication technology alternatives for climate change adaptation. Cogent Eng. 2022, 9, 2119537. [Google Scholar] [CrossRef]
  18. Liu, N.; Xu, Z. An overview of ARAS method: Theory development, application extension, and future challenge. Int. J. Intell. Syst. 2021, 36, 3524–3565. [Google Scholar] [CrossRef]
  19. Vinogradova, I. Multi-attribute decision-making methods as a part of mathematical optimization. Mathematics 2019, 7, 915. [Google Scholar] [CrossRef]
  20. Liu, P.; Pan, Q.; Xu, H. Multi-attributive border approximation area comparison (MABAC) method based on normal q-rung orthopair fuzzy environment. J. Intell. Fuzzy Syst. 2021, 40, 9085–9111. [Google Scholar] [CrossRef]
  21. Hadian, S.; Tabarestani, E.S.; Pham, Q.B. Multi attributive ideal-real comparative analysis (MAIRCA) method for evaluating flood susceptibility in a temperate Mediterranean climate. Hydrol. Sci. J. 2022, 67, 401–418. [Google Scholar] [CrossRef]
  22. Liu, Y.; Eckert, C.M.; Earl, C. A review of fuzzy AHP methods for decision-making with subjective judgements. Expert Syst. Appl. 2020, 161, 113738. [Google Scholar] [CrossRef]
  23. Turskis, Z.; Zavadskas, E.K. A new fuzzy additive ratio assessment method (ARAS-F). Case study: The analysis of fuzzy multiple criteria in order to select the logistic centers location. Transport 2010, 25, 423–432. [Google Scholar] [CrossRef]
  24. Sohaib, O.; Naderpour, M. Decision making on adoption of cloud computing in e-commerce using fuzzy TOPSIS. In Proceedings of the 2017 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Naples, Italy, 9–12 July 2017; pp. 1–6. [Google Scholar]
  25. Sohaib, O.; Arman, A.; Begum, V.; Arshi, T. Applying fuzzy logic to balanced scorecard for the performance evaluation of government e-services. J. Sci. Technol. Policy Manag. 2024, in press. [Google Scholar] [CrossRef]
  26. Taghipour, A.; Rouyendegh, B.D.; Ünal, A.; Piya, S. Selection of suppliers for speech recognition products in IT projects by combining techniques with an integrated fuzzy MCDM. Sustainability 2022, 14, 1777. [Google Scholar] [CrossRef]
  27. Zaidan, A.; Zaidan, B.; Al-Haiqi, A.; Kiah, M.; Hussain, M.; Abdulnabi, M. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS. J. Biomed. Inform. 2015, 53, 390–404. [Google Scholar] [CrossRef]
  28. Akhigbe, B.I.; Adebisi, J.; Asikhia, E.N.; Ejidokun, A.O. Modelling a Career Path Computer-based System: The Mechanism of Four Colour Codes. In Proceedings of the 1st International Conference on Applied Information and Communication Technology, Moscow, Russia, 20–22 September 2017; pp. 31–37. [Google Scholar]
  29. Adebisi, J.A.; Babatunde, O.M. Selection of Wireless Communication Technologies for Embedded Devices Using Multi-Criteria Approach and Expert Opinion. Niger. J. Technol. Dev. 2022, 19, 373–381. [Google Scholar] [CrossRef]
  30. Rouyendegh, B.D.; Yildizbasi, A.; Arikan, Z.B. Using intuitionistic fuzzy TOPSIS in site selection of wind power plants in Turkey. Adv. Fuzzy Syst. 2018, 2018, 1–14. [Google Scholar] [CrossRef]
  31. Nguyen, H.-T.; Dawal, S.Z.M.; Nukman, Y.; Rifai, A.P.; Aoyama, H. An integrated MCDM model for conveyor equipment evaluation and selection in an FMC based on a fuzzy AHP and fuzzy ARAS in the presence of vagueness. PLoS ONE 2016, 11, e0153222. [Google Scholar] [CrossRef]
  32. Ecer, F. An integrated Fuzzy AHP and ARAS model to evaluate mobile banking services. Technol. Econ. Dev. Econ. 2018, 24, 670–695. [Google Scholar] [CrossRef]
  33. Chatterjee, N.C.; Bose, G.K. Selection of vendors for wind farm under fuzzy MCDM environment. Int. J. Ind. Eng. Comput. 2013, 4, 535–546. [Google Scholar] [CrossRef]
  34. Mishra, A.R.; Rani, P. A q-rung orthopair fuzzy ARAS method based on entropy and discrimination measures: An application of sustainable recycling partner selection. J. Ambient Intell. Humaniz Comput. 2023, 14, 6897–6918. [Google Scholar] [CrossRef]
  35. Ulutas, A. Using of fuzzy SWARA and fuzzy ARAS methods to solve supplier selection problem. In Theoretical and Applied Mathematics in International Business; IGI Global: Hershey, PA, USA, 2020; pp. 136–148. [Google Scholar]
  36. Dahooie, J.H.; Zavadskas, E.K.; Abolhasani, M.; Vanaki, A.; Turskis, Z. A novel approach for evaluation of projects using an interval–valued fuzzy additive ratio assessment (ARAS) method: A case study of oil and gas well drilling projects. Symmetry 2018, 10, 45. [Google Scholar] [CrossRef]
  37. Zamani, M.; Rabbani, A.; Yazdani-Chamzini, A.; Turskis, Z. An integrated model for extending brand based on fuzzy ARAS and ANP methods. J. Bus. Econ. Manag. 2014, 15, 403–423. [Google Scholar] [CrossRef]
  38. Jocic, K.J.; Jocic, G.; Karabasevic, D.; Popovic, G.; Stanujkic, D.; Zavadskas, E.K.; Nguyen, P.T. A novel integrated piprecia–interval-valued triangular fuzzy aras model: E-learning course selection. Symmetry 2020, 12, 928. [Google Scholar] [CrossRef]
  39. Mishra, A.R.; Chandel, A.; Saeidi, P. Low-carbon tourism strategy evaluation and selection using interval-valued intuitionistic fuzzy additive ratio assessment approach based on similarity measures. Environ. Dev. Sustain. 2022, 24, 7236–7282. [Google Scholar] [CrossRef]
  40. Jovčić, S.; Simić, V.; Průša, P.; Dobrodolac, M. Picture fuzzy ARAS method for freight distribution concept selection. Symmetry 2020, 12, 1062. [Google Scholar] [CrossRef]
  41. Adebisi, J.; Babatunde, O. Green Information and Communication Technologies Implementation in Textile Industry Using Multicriteria Method. J. Niger. Soc. Phys. Sci. 2022, 4, 165–173. [Google Scholar] [CrossRef]
  42. Rouyendegh, B.D.; Yildizbasi, A.; Üstünyer, P. Intuitionistic Fuzzy TOPSIS method for green supplier selection problem. Soft Comput. 2020, 24, 2215–2228. [Google Scholar] [CrossRef]
  43. Srivastava, A.; Singh, B.; Chabra, A.; Majumdar, R. Application and use of MCDM technique in software industry. In Proceedings of the 2017 International Conference on Infocom Technologies and Unmanned Systems (Trends and Future Directions) (ICTUS), Dubai, United Arab Emirates, 18–20 December 2017; pp. 487–491. [Google Scholar]
  44. Puzovic, S.; Vasovic, J.V.; Radojicic, M.; Paunovic, V. An Integrated MCDM Approach to PLM Software Selection. Acta Polytech. Hung. 2019, 16, 45–65. [Google Scholar]
Figure 1. Method flowchart for ranking simulation software for teaching power systems at the undergraduate level.
Figure 2. The hierarchical structure of the MCDM problem in this study.
Figure 3. Ten-point linguistic term.
Figure 4. Comparison of TOPSIS and ARAS methods.
Figure 5. Rank based on combined coefficients of TOPSIS and ARAS.
Table 1. Benefits and drawbacks of selected fuzzy multi-criteria decision-making (MCDM) techniques.
Fuzzy MCDM Method | Advantages | Weaknesses
Fuzzy Analytic Hierarchy Process (FAHP) [22] | Enhanced interpretability; structured hierarchical decision making | Subjectivity in interpreting linguistic terms; potential inconsistencies in pairwise comparisons
Fuzzy Technique for Order Preference by Similarity to Ideal Solution (Fuzzy-TOPSIS) [16] | Comprehensive evaluation framework; considers distances to both the positive and negative ideal solutions | Sensitive to the choice of normalization method; may not capture non-linear relationships
Fuzzy Additive Ratio Assessment (Fuzzy-ARAS) [23] | Systematic comparison of alternatives; transparent evaluation process | Challenges in handling uncertainty in ratio assessments; may not adequately capture decision uncertainties
Measurement of Alternatives and Ranking according to Compromise Solution (MARCOS) [19] | Effective balance of conflicting objectives; optimization capabilities | Complexity in the optimization process; may require extensive computational resources
Multi-Attributive Border Approximation Area Comparison (MABAC) [20] | Flexible handling of incomplete information; robust to imprecise data | Challenges with boundary approximation accuracy; potential biases in boundary delineation
Multi-Attributive Ideal-Real Comparative Analysis (MAIRCA) [21] | Comprehensive evaluation against ideal and real benchmarks; dynamic decision-making framework | Difficulty in establishing ideal and real benchmarks; complexity in dynamic decision contexts
Table 2. Responses of experts.
Questions | Expert 1 | Expert 2 | Expert 3 | Expert 4 | Expert 5
Do you feel it is important to validate the software being used to teach power system analysis at the undergraduate level? | Yes | Yes | Yes | Yes | Yes
If yes, do you agree that different criteria should be considered in selecting software solutions with respect to Power System Analysis? | Yes | Yes | Yes | Yes | Yes
Highest Qualification | M.Sc | PhD | PhD | PhD | PhD
Years of Experience | 11–15 | 6–10 | 1–5 | 11–15 | 6–10
Table 3. Opinion of the 5 experts on the importance of the 12 criteria.
Criteria | Expert 1 | Expert 2 | Expert 3 | Expert 4 | Expert 5
C1 | VMI | EMI | VVMI | MI | VMI
C2 | VMI | EMI | VVMI | VMI | VMI
C3 | VMI | VVMI | MI | EMI | VMI
C4 | VVMI | EMI | MI | EMI | VVMI
C5 | VMI | EMI | VMI | EMI | VMI
C6 | VMI | EMI | I | I | VMI
C7 | VVMI | VVMI | VLI | MI | MI
C8 | VVMI | VVMI | EMI | MI | MI
C9 | VMI | VVMI | I | I | I
C10 | VVMI | VVMI | MI | VMI | VMI
C11 | VVMI | EMI | MI | MI | MI
C12 | VVMI | EMI | VVI | I | I
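For readers who wish to reproduce the weighting step, the sketch below illustrates one way the expert judgments in Table 3 can be aggregated into fuzzy criteria weights. The triangular fuzzy numbers assigned to the linguistic abbreviations and the component-wise averaging operator are illustrative assumptions only; the study's actual ten-point scale is the one defined in Figure 3 and would need to be substituted to recover the reported weights.

```python
# Illustrative aggregation of expert linguistic judgments (Table 3) into a fuzzy
# criterion weight. The triangular fuzzy numbers (TFNs) below are hypothetical;
# substitute the study's own ten-point scale (Figure 3) to reproduce the paper's
# weights. Only the abbreviations appearing in row C1 of Table 3 are mapped here.

TFN = {
    "MI":   (5.0, 6.0, 7.0),   # assumed TFN for this linguistic term
    "VMI":  (6.0, 7.0, 8.0),   # assumed TFN
    "VVMI": (7.0, 8.0, 9.0),   # assumed TFN
    "EMI":  (8.0, 9.0, 10.0),  # assumed TFN
}

def aggregate_weight(judgments):
    """Average the experts' TFNs component-wise to obtain one fuzzy weight."""
    lows, modes, highs = zip(*(TFN[term] for term in judgments))
    n = len(judgments)
    return (sum(lows) / n, sum(modes) / n, sum(highs) / n)

# Judgments of the five experts for criterion C1 (first row of Table 3)
c1 = ["VMI", "EMI", "VVMI", "MI", "VMI"]
print(aggregate_weight(c1))   # (6.4, 7.4, 8.4) with the assumed scale
```

Component-wise averaging is only one of several common aggregation operators; some fuzzy-ARAS studies instead keep the minimum lower bound and maximum upper bound across experts, so that disagreement widens the resulting fuzzy weight.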
Table 4. Overall interval-valued fuzzy performance ratings.
Alternative | Si (interval-valued fuzzy number components)
A0 | 4.3888 | 6.7257 | 8.7913 | 10.6187 | 13.3333
M1 | 3.5333 | 5.7123 | 7.6487 | 9.5747 | 13.2222
M2 | 2.4555 | 4.7703 | 6.5874 | 8.4591 | 13.0000
M3 | 2.8444 | 5.5120 | 7.4380 | 9.3223 | 13.2222
M4 | 3.6111 | 6.3660 | 8.2966 | 10.1604 | 13.3333
M5 | 3.6555 | 5.8475 | 7.7858 | 9.7321 | 13.3333
M6 | 3.0111 | 5.6876 | 7.6462 | 9.4425 | 13.2222
M7 | 2.7222 | 5.5735 | 7.5007 | 9.3018 | 13.0000
M8 | 2.1222 | 5.1045 | 7.0099 | 8.8723 | 13.1111
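Table 4 reports the overall fuzzy performance score Si of the optimal alternative A0 and of the eight software alternatives M1 to M8 as interval-valued fuzzy numbers. As a reminder of how such scores arise in the fuzzy-ARAS procedure [23], the sketch below walks through the standard steps (normalization, weighting, summation, defuzzification, and the utility degree Qi = Si/S0) on a small illustrative matrix of triangular fuzzy numbers. The example data, the component-wise fuzzy arithmetic, and the centroid defuzzification are simplifying assumptions, so this sketch is not expected to reproduce the interval-valued figures in Table 4.

```python
# Minimal fuzzy-ARAS sketch on an illustrative 2-alternative, 2-criterion problem.
# Both criteria are treated as benefit criteria (cost criteria would be inverted
# first). Component-wise TFN arithmetic and centroid defuzzification are
# simplifications of the interval-valued operators used in the study.

def tfn_add(a, b):  return tuple(x + y for x, y in zip(a, b))
def tfn_mul(a, b):  return tuple(x * y for x, y in zip(a, b))
def defuzzify(a):   return sum(a) / 3.0          # centroid of a TFN (assumed)

# Rows: optimal alternative A0 first, then alternatives A1, A2. Columns: C1, C2.
ratings = {
    "A0": [(8, 9, 10), (7, 8, 9)],
    "A1": [(6, 7, 8),  (7, 8, 9)],
    "A2": [(4, 5, 6),  (5, 6, 7)],
}
weights = [(0.5, 0.6, 0.7), (0.3, 0.4, 0.5)]     # illustrative fuzzy weights

# Step 1: column-wise normalization (divide each TFN component by the column sum)
col_sums = [tuple(sum(ratings[a][j][k] for a in ratings) for k in range(3))
            for j in range(2)]
norm = {a: [tuple(r[k] / col_sums[j][k] for k in range(3))
            for j, r in enumerate(ratings[a])] for a in ratings}

# Steps 2-3: weight the normalized ratings and sum them into the fuzzy score S_i
S = {a: (0.0, 0.0, 0.0) for a in ratings}
for a in ratings:
    for j in range(2):
        S[a] = tfn_add(S[a], tfn_mul(weights[j], norm[a][j]))

# Steps 4-5: defuzzify and compute the utility degree Q_i = S_i / S_0
S_crisp = {a: defuzzify(S[a]) for a in ratings}
Q = {a: S_crisp[a] / S_crisp["A0"] for a in ratings}
print(Q)   # A0 -> 1.0; the alternatives are ranked by descending Q
```

Because A0 takes the best rating on every criterion, its defuzzified score S0 serves as the reference, so Qi equals 1 for A0 and falls below 1 for every real alternative, which is the pattern visible in the Qi column of Table 5.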
Table 5. Details of the results obtained from the ARAS and TOPSIS methods.
Software | d_i+ | d_i− | cc_i | TOPSIS Rank | S_i | Q_i | ARAS Rank | Combined Coefficient | Combined Rank
ETAP | 49.2680 | 31.3547 | 0.3889 | 7 | 14.9927 | 0.8972 | 5 | 0.2713 | 7
NEPLAN | 70.1429 | 12.9976 | 0.1563 | 8 | 14.7223 | 0.8810 | 7 | 0.1327 | 8
POWER World | 26.4257 | 53.7633 | 0.6704 | 3 | 16.0213 | 0.9587 | 2 | 0.3945 | 2
MATLAB | 23.1348 | 56.8936 | 0.7109 | 2 | 16.4243 | 0.9829 | 1 | 0.4125 | 1
DIgSILENT | 29.2705 | 51.9587 | 0.6396 | 5 | 15.8728 | 0.9499 | 3 | 0.3822 | 3
PSAT | 28.7692 | 51.8175 | 0.6430 | 4 | 15.4216 | 0.9229 | 4 | 0.3789 | 4
PSCAD/EMTDC | 29.5105 | 51.1073 | 0.6339 | 6 | 14.8636 | 0.8895 | 6 | 0.3701 | 5
MATPOWER | 17.3108 | 65.8830 | 0.7919 | 1 | 7.2440 | 0.4335 | 8 | 0.2801 | 6
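The TOPSIS and ARAS columns of Table 5 can be cross-checked directly: the closeness coefficient is cc_i = d_i− / (d_i+ + d_i−) [16], and the ARAS utility degree is Q_i = S_i / S_0 [23]. The tabulated combined coefficients are reproduced to four decimal places by C_i = cc_i × Q_i / (cc_i + Q_i), i.e., half the harmonic mean of the two scores; we note this as the combination rule implied by the table rather than a formula quoted from the text. The short script below recomputes cc_i, C_i, and the final ranking from the tabulated values.

```python
# Cross-check of Table 5: recompute the TOPSIS closeness coefficient, the combined
# coefficient, and the final ranking from the tabulated values. The combination
# rule C_i = cc_i * Q_i / (cc_i + Q_i) is inferred from the table (it reproduces
# the published combined coefficients to four decimal places), not quoted from
# the text. Q_i is taken directly from Table 5 because S_0 is not tabulated.

rows = {  # software: (d_i+, d_i-, Q_i) from Table 5
    "ETAP":        (49.2680, 31.3547, 0.8972),
    "NEPLAN":      (70.1429, 12.9976, 0.8810),
    "POWER World": (26.4257, 53.7633, 0.9587),
    "MATLAB":      (23.1348, 56.8936, 0.9829),
    "DIgSILENT":   (29.2705, 51.9587, 0.9499),
    "PSAT":        (28.7692, 51.8175, 0.9229),
    "PSCAD/EMTDC": (29.5105, 51.1073, 0.8895),
    "MATPOWER":    (17.3108, 65.8830, 0.4335),
}

results = {}
for name, (d_plus, d_minus, q) in rows.items():
    cc = d_minus / (d_plus + d_minus)        # TOPSIS closeness coefficient [16]
    results[name] = (cc, cc * q / (cc + q))  # (cc_i, combined coefficient C_i)

# Rank the alternatives by descending combined coefficient
ranked = sorted(results.items(), key=lambda kv: -kv[1][1])
for rank, (name, (cc, c)) in enumerate(ranked, start=1):
    print(f"{rank}: {name}  cc_i = {cc:.4f}  C_i = {c:.4f}")
# Expected output: MATLAB first and NEPLAN last, matching the Combined Rank column.
```

Because C_i grows only when both cc_i and Q_i are high, an alternative that excels under one method but performs poorly under the other (such as MATPOWER, ranked 1st by TOPSIS but 8th by ARAS) is pulled toward the middle of the combined ranking.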