The decision fusion approach using Monte Carlo simulation and the Gray Analytic Hierarchy Process (G-AHP) has shown promising results in improving the selection of wastewater treatment systems. Decision-making involves identifying and selecting the course of action that best reflects the decision-maker's values and preferences, has the highest chance of success, and is most consistent with the stated goals. The large number of criteria, both quantitative and qualitative, that must be taken into account increases the complexity of the decision-making process [1]. Consequently, numerous decision-making methodologies and tools have emerged over the last few decades, including multidisciplinary approaches, multi-objective techniques, hierarchical systems, and data fusion strategies. Among these, data fusion has gained considerable popularity because of its high reliability, its ability to suppress interference from extraneous factors, and its capacity to reduce the incidence of unreliable data. First introduced and documented in the United States in 1984 [2], data fusion has developed into a comprehensive framework applied across many scientific disciplines, including water pollution and water quality assessment, climate change analysis, decision-making and risk analysis, image processing and tracking, accounting and auditing, data mining, artificial intelligence, facial recognition, and medical research. Several methods have been employed to solve the data fusion problems arising in these fields, including the Dempster–Shafer evidence approach, Yager's theory, simple and weighted averaging, Kalman filters, and Monte Carlo (MC) algorithms [2].
Among these, the Dempster–Shafer evidence approach is generally regarded as the most practical method for complex data fusion problems at the decision-making level. In 1967, Dempster introduced the concept that would later become evidence theory in his publication on upper and lower probability bounds. In 1976, Shafer refined and expanded the theory, addressing its limitations and enabling the analysis of incomplete and ambiguous information; the result became known as the “Dempster–Shafer evidence theory” [3]. One obstacle within this framework, however, is its failure to adequately handle uncertainty when expert opinions conflict. The issue of inter-evidence conflict was first highlighted by Lotfi Zadeh with an illustrative counterexample, which showed that the Dempster–Shafer combination rule depends strongly on inter-evidence consistency while disregarding conflicts in the evidence. When multiple pieces of evidence are consistent, combination reduces the uncertainty of the result; when the evidence conflicts substantially, the combined outcome becomes illogical and cannot be accepted.
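To make the conflict problem concrete, Dempster's rule for two basic probability assignments $m_1$ and $m_2$ over a frame of discernment $\Theta$ can be written in the standard notation (our paraphrase of the textbook formulation, not a quotation from [3]) as

\[
K=\sum_{B\cap C=\emptyset} m_1(B)\,m_2(C), \qquad
(m_1\oplus m_2)(A)=\frac{1}{1-K}\sum_{B\cap C=A} m_1(B)\,m_2(C), \quad A\neq\emptyset,
\]

where $K$ is the conflict coefficient. As $K\to 1$, the normalization factor $1/(1-K)$ inflates whatever little agreement remains, which is exactly the counterintuitive behavior exposed by Zadeh's example.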
Among the many solutions proposed for this problem is the Transferable Belief Model (TBM), in which the masses remain unnormalized and the conflicting mass is assigned to the empty set [4]. Another suggestion comes from Dubois and Prade, who propose assigning the mass of each conflicting pair of focal elements to the union of that pair rather than discarding it. Finally, Yager's theory, presented in 1987, also tackles the problems associated with Dempster–Shafer evidence theory [5].
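In outline (again our paraphrase rather than a quotation from [5]), Yager's rule first forms the unnormalized “ground” assignment and then transfers the conflicting mass to the whole frame:

\[
q(A)=\sum_{B\cap C=A} m_1(B)\,m_2(C), \qquad
m(A)=q(A)\ \ (A\neq\Theta,\ A\neq\emptyset), \qquad
m(\Theta)=q(\Theta)+q(\emptyset).
\]

Instead of renormalizing by $1/(1-K)$, the conflict $q(\emptyset)$ is treated as ignorance and absorbed by $\Theta$.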
Although Yager's method resolves the conflict-handling concerns of Dempster–Shafer theory, it has its own drawbacks, such as computational complexity and a limited ability to draw accurate conclusions when the evidence indicates a lack of information about the system. Researchers have therefore presented various methods to refine the fusion rules of Dempster–Shafer theory. Among these is Wang et al.'s hybrid method based on evidence validity, which is unaffected by the quantity of evidence provided and handles irrational cases. The method was evaluated on a decision-making problem, the selection of wind turbines: it assessed the validity of the evidence and employed a ranking technique based on similarity to the ideal solution, and the results demonstrated its effectiveness in identifying the optimal design from an array of offshore wind turbines [6].
Addressing the issue of conflicting evidence, Liu et al. introduced a reliability estimation method rooted in failure modes and effects analysis (FMEA). Their approach incorporated the experts' individual characteristics, accounted for dependencies among factors, and managed contradictory pieces of information; using the proposed discounting methodology, they resolved a critical problem in a supercritical water gasification (SCWG) system in a case study, and their analysis demonstrated both the efficiency and the superiority of the method [7]. Lia et al. presented an improved fusion algorithm based on the weighted average of the evidence conflict probability, which was used to forecast the risk of water inrush during the various phases of subaqueous tunnel excavation. While conventional methods yielded lower risk estimates, this algorithm predicted a heightened degree of hazard in the twelfth stage of the boring process, a prognosis that aligned well with the severe seepage observed in experimental trials. In short, the algorithm can deliver more precise predictions of catastrophic flooding events and serves as a valuable reference for similar engineering problems [8]. Because Dempster–Shafer evidence theory alone does not provide a satisfactory mathematical framework for managing uncertainty when evaluating risk parameters and prioritizing failure modes, Sazer et al. combined it with failure mode, effects, and criticality analysis (FMECA) to assess potential system failures and their underlying causes [9]. Similarly, Wang et al. used this approach with GIS data to identify groundwater zones in arid basins. Their study showed that the assessment methods derived from this integration reliably predicted the presence of groundwater in the region, providing a solid scientific basis for groundwater security and effective management [10].
In the area of data analysis and prediction, the Dempster–Shafer evidence theory has a notable advantage: it allows all relevant parameters of a given problem to be considered simultaneously, without restriction. In contrast to prevailing statistical methods and machine learning algorithms, which may not account for all relevant factors or may offer limited accuracy, the theory is a highly viable alternative. Nevertheless, few studies have explored the combined effect of gathering data from multiple sources, whether sensors or expert opinions, on the prediction of various phenomena, even though such investigations could enhance both understanding and forecasting accuracy. Jiang et al. presented a deep learning paradigm that integrates diverse data sources to analyze and predict urban drainage water quality in southern China. Their approach merges environmental and social indicators with measurements of water quantity and quality using multi-source data fusion techniques. Compared with linear methods such as multiple linear regression and traditional algorithms such as the multilayer perceptron, their deep learning algorithm, which incorporates recurrent neural networks and long short-term memory (LSTM) mechanisms, showed superior predictive performance [11]. To predict the water depths of lakes in São Paulo, Brazil, Manzione and Castrignanò employed remote sensing and the co-kriging data fusion method. Through comparison and cross-validation, they demonstrated that the uncertainty in water depth estimates obtained with the data fusion method was significantly lower than with the univariate method; in addition, the data fusion method allows vegetation and soil characteristics to be examined, a capability the univariate approach lacks [12]. To predict potential underground water reserves accurately and support optimal management strategies, Obeidavi et al. recommended combining the Dempster–Shafer learning model with remote sensing data, applying the methodology in the northern Khuzestan region for precise measurement and effective administration of groundwater resources [13]. Chen et al. employed Bayesian inference to integrate remote sensing data with on-site observations in order to determine water quality. Comparing several data fusion methods, including linear regression, nonlinear regression, the cumulative distribution function, and Bayesian approaches, they found that the Bayesian method yielded lower errors and higher correlations, and it therefore shows promise for determining drinking water quality [14].
In addition to the Dempster–Shafer evidence method discussed above, there are alternative data fusion methodologies such as the Kalman filter. The Kalman filter is a powerful tool for synthesizing information under uncertainty: it is an estimator that combines the previous state estimate with the current observation to compute an estimate of the present state. Its ability to extract precise information has made it a long-standing choice for a multitude of data-tracking and prediction tasks [15].
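As a minimal illustration of this predict-and-update cycle, the following one-dimensional sketch fuses a stream of noisy sensor readings into a state estimate; the random-walk state model and all parameter values are assumptions made for the example, not the second-order formulation used in [16].

```python
# Minimal 1-D Kalman filter: fuse noisy readings into a state estimate.
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: random-walk state, uncertainty grows
        k = p / (p + r)           # Kalman gain: weight given to the new observation
        x = x + k * (z - x)       # update: blend prediction and observation
        p = (1.0 - k) * p         # updated (reduced) estimate uncertainty
        estimates.append(x)
    return estimates

readings = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2]  # hypothetical dissolved-oxygen data
print(kalman_1d(readings))
```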
Using a second-order Kalman filter error method and soft algorithms for the probabilistic analysis of multi-sensor data, Gunia et al. merged various sensor inputs for water quality monitoring at the Finnish Environment Institute. They calculated the uncertainties and spatiotemporal correlations in the observational data, obtaining accurate and realistic results; importantly, their findings showed that the Kalman filter is a suitable approach for enhancing water-quality monitoring programs while adhering to established environmental standards [16].
Other techniques for data fusion include the Monte Carlo (MC) algorithm. First introduced in this context in 1988 by Kampke and Pearl to address challenges in the Dempster–Shafer–Yager evidence theory, the approach employed a straightforward MC algorithm to assess uncertainty within Dempster–Shafer evidence theory, acknowledging its connection with classical probability theory, although it fell short of satisfactory convergence [17,18]. In subsequent years, Moral and Wilson made significant contributions to refining MC algorithms for this purpose. In 1994, they incorporated Markov chains into their MC algorithms to calculate the Dempster–Shafer belief function more efficiently, and their research demonstrated that the methodology remained effective even under high levels of inter-evidence conflict [19]. They continued to advance the technique and consolidated its efficacy through empirical analysis in 1996, comparing the refined method against both the simple MC algorithm and its Markov chain variants in an experimental example [20]. Salehy and Okten devised Monte Carlo and quasi-Monte Carlo methods to tackle the time complexity of the combined Dempster–Shafer rule. They demonstrated that, by incorporating techniques such as variance reduction and low-discrepancy sequences, MC algorithms can broaden the scope of application of Dempster–Shafer theory to problems previously considered intractable, and they presented empirical data on the efficacy and convergence rate of selected algorithms. Notably, their investigation showed that, using the Dempster rule, a satisfactorily accurate approximation of a merged belief function can be obtained in about half a minute for a problem whose exact solution previously required more than six days [21].
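The core idea of such samplers can be conveyed in a few lines. The sketch below is a simple rejection-sampling estimator of the combined belief under Dempster's rule for two mass functions; the frame, the mass assignments, and the sample size are illustrative assumptions, and the code is a schematic of the general approach rather than the specific algorithms of [17–21].

```python
import random

def mc_dempster(m1, m2, query, n=100_000, seed=0):
    """Estimate the combined belief Bel(query) by rejection sampling."""
    rng = random.Random(seed)
    f1, w1 = zip(*m1.items())
    f2, w2 = zip(*m2.items())
    hits = accepted = 0
    for _ in range(n):
        a = rng.choices(f1, weights=w1)[0]   # sample a focal element of m1
        b = rng.choices(f2, weights=w2)[0]   # sample a focal element of m2
        c = a & b
        if not c:        # conflicting pair: reject (this realizes the 1/(1-K) term)
            continue
        accepted += 1
        if c <= query:   # the intersection supports the queried hypothesis
            hits += 1
    return hits / accepted if accepted else float("nan")

A, B, C = "A", "B", "C"
m1 = {frozenset({A}): 0.6, frozenset({A, B}): 0.3, frozenset({A, B, C}): 0.1}
m2 = {frozenset({A}): 0.5, frozenset({B, C}): 0.4, frozenset({A, B, C}): 0.1}
print(mc_dempster(m1, m2, frozenset({A})))   # approximate combined Bel({A})
```

The convergence issues noted above show up here directly: when the conflict K is large, most joint samples are rejected, so far more draws are needed for a stable estimate.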
In addition to choosing the best wastewater treatment system, determining the most effective and important sub-criteria is also of particular importance. To determine the effect of different factors in a system, methods such as TOPSIS, AHP, and G-AHP are used. TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) is, like AHP, a multi-criteria decision-making method for evaluating and prioritizing options according to their distance from the positive and negative ideal solutions. The method was proposed by Hwang and Yoon in 1981 and soon found its place in multi-criteria decision-making. Anaokar et al. evaluated the effective indicators in urban wastewater treatment using TOPSIS; in their study, the relative importance of the criteria was determined by the decision-makers. The performance of six urban sewage treatment plants was evaluated and ranked by similarity to the ideal solution, with efficiency based on the characteristics of the wastewater and performance based on temperature, total suspended solids, total dissolved solids, biochemical oxygen demand (BOD), chemical oxygen demand (COD), and pH [22].
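A compact sketch of the TOPSIS calculation is given below: alternatives are scored by their relative closeness to the ideal solution. The decision matrix, weights, and criterion directions are hypothetical values chosen for illustration, not data from [22].

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if larger is better."""
    m = np.asarray(matrix, dtype=float)
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
    pis = np.where(benefit, v.max(axis=0), v.min(axis=0))   # positive ideal
    nis = np.where(benefit, v.min(axis=0), v.max(axis=0))   # negative ideal
    d_pos = np.linalg.norm(v - pis, axis=1)
    d_neg = np.linalg.norm(v - nis, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness coefficient; higher is better

# Three hypothetical plants scored on BOD removal (%), cost, and pH deviation.
closeness = topsis([[92, 1.8, 0.4], [88, 1.2, 0.6], [95, 2.5, 0.3]],
                   weights=[0.5, 0.3, 0.2],
                   benefit=[True, False, False])
print(closeness.argsort()[::-1])   # plant indices ranked best-first
```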
In another study, Golfam et al. presented a gray-system-based method to select the best alternative for the reuse of treated wastewater, with the criteria selected by the Analytic Hierarchy Process (AHP). Their results show that reuse of wastewater in the environmental sector has the highest priority among several alternative applications [1]. Perez et al. discussed a risk assessment method for reducing the operational risk of facilities in a domestic wastewater treatment plant system based on constructed wetlands. Their approach is a three-dimensional risk matrix, a simplified version of probabilistic risk assessment that makes the method more accessible and allows wider application. The results show that human factors emerge as the main risk factors in wetland operations [23].
The aim of this research is to advance the Dempster–Shafer evidence theory and Yager theory by introducing an application of the Monte Carlo (MC) algorithm. Existing methods require the experts to reach a common consensus while simultaneously selecting a system in order to calculate the associated probability, and they falter as the amount of evidence or the number of sensors grows: accurate probability estimation then requires an excessive number of random samples, increasing computation time and sampling requirements, while the likelihood that all evidence or sensors point to a single option at any given time decreases. Although the Dempster–Shafer evidence theory is effective in many contexts, it often struggles to deal efficiently with uncertainty and conflicting expert opinions, and existing methods are computationally intensive and may not integrate large amounts of evidence or sensor data well. Furthermore, a notable gap exists in methodologies that combine advanced decision algorithms with structured frameworks for the comprehensive assessment and prioritization of decision criteria.
The approach presented in this study addresses this dilemma by constructing a robust probability space. This space accounts for the uncertainty inherent in each individual expert's opinion and facilitates decision-making in scenarios involving a larger number of experts. By integrating the MC algorithm, the approach overcomes the computational challenges associated with large-scale evidence or sensor inputs and improves the reliability and efficiency of decision-making in complex scenarios; a schematic sketch of the idea is given below.
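To fix ideas, the following sketch treats each expert's opinion as a discrete distribution over the candidate treatment systems, samples one vote per expert, and estimates each system's probability of emerging as the consensus choice. The expert distributions and option names are hypothetical, and the code is a schematic of the general idea rather than the exact algorithm developed in this paper.

```python
import random
from collections import Counter

def mc_consensus(expert_opinions, n=50_000, seed=1):
    """Estimate, per option, the probability of being the majority choice."""
    rng = random.Random(seed)
    options = list(expert_opinions[0])
    wins = Counter()
    for _ in range(n):
        votes = Counter(
            rng.choices(options, weights=[op[o] for o in options])[0]
            for op in expert_opinions   # one sampled vote per expert
        )
        wins[votes.most_common(1)[0][0]] += 1
    return {o: wins[o] / n for o in options}

experts = [  # hypothetical expert opinions over three candidate systems
    {"activated sludge": 0.6, "constructed wetland": 0.3, "MBR": 0.1},
    {"activated sludge": 0.4, "constructed wetland": 0.4, "MBR": 0.2},
    {"activated sludge": 0.5, "constructed wetland": 0.2, "MBR": 0.3},
]
print(mc_consensus(experts))
```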
To further strengthen the framework, the Gray Analytic Hierarchy Process (G-AHP) is employed to rank the sub-criteria, offering a structured framework for evaluating and prioritizing the various factors that influence decision-making. G-AHP complements the MC algorithm by providing a clear and systematic approach to criteria evaluation, enhancing the comprehensiveness and reliability of the overall decision-making framework; a minimal sketch of the gray-weighting computation follows. Through this combined approach, the research provides a comprehensive solution to the challenge of selecting optimal wastewater treatment systems, filling a critical gap in current methodologies and paving the way for more effective decision-making in complex domains.
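In the sketch below, pairwise comparisons are interval (“gray”) numbers, interval row sums give gray weights, and a whitenization step collapses them to crisp values for ranking. The 3×3 comparison matrix and the equal-weight whitenization are hypothetical simplifications, not the full G-AHP procedure applied in this study.

```python
import numpy as np

# Gray pairwise judgments [lower, upper] for three hypothetical sub-criteria.
comparisons = np.array([
    [[1, 1],     [2, 3],     [4, 5]],
    [[1/3, 1/2], [1, 1],     [2, 3]],
    [[1/5, 1/4], [1/3, 1/2], [1, 1]],
])

row_sums = comparisons.sum(axis=1)    # interval row sums, shape (3, 2)
total = row_sums.sum(axis=0)          # interval grand total, shape (2,)
lower = row_sums[:, 0] / total[1]     # conservative weight bound
upper = row_sums[:, 1] / total[0]     # optimistic weight bound
crisp = (lower + upper) / 2           # equal-weight whitenization
weights = crisp / crisp.sum()
print(weights)                        # normalized sub-criterion weights
```

In the combined framework, sub-criterion weights of this kind complement the MC fusion stage by making the relative importance of the decision criteria explicit.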