Linear and Non-Linear Soft Sensors for Predicting the Research Octane Number (RON) through Integrated Synchronization, Resolution Selection and Modelling
Abstract
1. Introduction
2. The Continuous Catalyst Regeneration (CCR) Unit
3. Data Analysis Workflow
3.1. Data Collection
3.2. Data Cleaning
3.3. Pre-Processing
3.3.1. Selecting the Resolution for Data Analysis
3.3.2. Missing Data Imputation
3.4. Model Comparison Framework
4. Predictive Modelling Methodologies
5. Results
5.1. Data Acquisition and Inspection
5.2. Data Cleaning
5.3. Data Pre-Processing
5.4. Prediction Accuracy Assessment and Comparison
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. A Brief Overview of Regression Methods Studied in This Work
Appendix B. Details on Hyperparameters Selection for All Regression Methods
Method | Hyperparameter(s) | Possible Value(s) | Selection Strategy
---|---|---|---
MLR | - | - | -
FSR | significance level | 0.05; 0.10 | -
PCR | no. of components | - | 10-fold CV
PCR-FS | significance level; no. of components | 0.05; 0.10 | 10-fold CV
PLS | no. of latent variables | - | 10-fold CV
RR | λ (α = 0) | 0.001; 0.01; 0.1; 1; 10 | 10-fold CV
LASSO | λ (α = 1) | 0.001; 0.01; 0.1; 1; 10 | 10-fold CV
EN | α; λ | α: 0; 0.167; 0.333; 0.500; 0.667; 0.833; 1; λ: 0.001; 0.01; 0.1; 1; 10 | 10-fold CV
BRT | no. of trees | 50; 100; 500; 1000; 5000 | 10-fold CV
RF | no. of trees | 50; 100; 500; 1000; 5000 | 10-fold CV
BT | no. of trees | 50; 100; 500; 1000; 5000 | 10-fold CV
SVR-linear | | 0.001; 0.005; 0.01; 0.05; 0.1 | 10-fold CV
SVR-poly | | 0.001; 0.005; 0.01; 0.05; 0.1 | 10-fold CV
SVR-rbf | | 0.001; 0.005; 0.01; 0.05; 0.1 | 10-fold CV
K-PCR-poly | polynomial degree | 2; 4; 6; 8; 10 | 10-fold CV
K-PCR-rbf | no. of components; kernel width | components: 1:30; width: 0.1; 1; 10; 50; 100; 300; 1000 | 10-fold CV
K-PLS-poly | polynomial degree | 2; 4; 6; 8; 10 | 10-fold CV
K-PLS-rbf | no. of latent variables; kernel width | components: 1:30; width: 0.1; 1; 10; 50; 100; 300; 1000 | 10-fold CV
ANN-LM | no. of hidden layers; no. of neurons | layers: 1; neurons: 5; 10; 15 | 10-fold CV
ANN-RP | no. of hidden layers; no. of neurons | layers: 1; neurons: 5; 10; 15 | 10-fold CV
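All grids above are searched with 10-fold cross-validation. As a minimal sketch of that procedure (not the authors' code), the following pure-Python example selects the ridge penalty λ from the grid in the table on a toy single-predictor dataset:

```python
import random

# Toy single-predictor problem: y = 2*x + Gaussian noise.
random.seed(0)
x = [i / 10 for i in range(100)]
y = [2 * xi + random.gauss(0, 0.5) for xi in x]

LAMBDA_GRID = [0.001, 0.01, 0.1, 1, 10]  # grid from Appendix B
K = 10  # number of cross-validation folds

def ridge_fit(xs, ys, lam):
    # Closed-form ridge slope for a no-intercept, single-predictor model.
    return sum(a * b for a, b in zip(xs, ys)) / (sum(a * a for a in xs) + lam)

def cv_rmse(lam):
    # 10-fold cross-validated RMSE for a given penalty value.
    fold = len(x) // K
    sq_errs = []
    for k in range(K):
        lo, hi = k * fold, (k + 1) * fold
        xtr, ytr = x[:lo] + x[hi:], y[:lo] + y[hi:]
        w = ridge_fit(xtr, ytr, lam)
        sq_errs += [(yi - w * xi) ** 2 for xi, yi in zip(x[lo:hi], y[lo:hi])]
    return (sum(sq_errs) / len(sq_errs)) ** 0.5

best = min(LAMBDA_GRID, key=cv_rmse)
print("selected lambda:", best)
```

The same loop applies to any of the grids in the table; only the fitting routine changes per method.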
Appendix C. Additional Results for the Comparison Study
Method | S-SR24 | S-SR4 | S-SR3 | S-SR2 | S-SR1 |
---|---|---|---|---|---|
MLR | 0.555 | 0.705 | 0.505 | 0.366 | 0.410 |
FSR | 0.722 | 0.697 | 0.660 | 0.607 | 0.577 |
RR | 0.730 | 0.727 | 0.622 | 0.549 | 0.745 |
LASSO | 0.730 | 0.732 | 0.719 | 0.708 | 0.713 |
EN | 0.738 | 0.729 | 0.732 | 0.717 | 0.728 |
SVR-poly | 0.681 | 0.728 | 0.680 | 0.630 | 0.683 |
SVR-rbf | 0.683 | 0.729 | 0.681 | 0.641 | 0.689 |
SVR-linear | 0.622 | 0.627 | 0.649 | 0.661 | 0.606 |
PCR | 0.680 | 0.687 | 0.686 | 0.670 | 0.666 |
PCR-FS | 0.605 | 0.612 | 0.586 | 0.553 | 0.531 |
PLS | 0.729 | 0.709 | 0.717 | 0.709 | 0.738 |
Bagging | 0.672 | 0.636 | 0.663 | 0.635 | 0.583 |
RF | 0.674 | 0.637 | 0.652 | 0.644 | 0.610 |
Boosting | 0.701 | 0.676 | 0.681 | 0.674 | 0.661 |
K-PCR-poly | 0.376 | 0.392 | 0.403 | 0.353 | 0.351 |
K-PCR-rbf | 0.714 | 0.696 | 0.702 | 0.711 | 0.718 |
K-PLS-poly | 0.073 | 0.166 | 0.185 | 0.138 | 0.163 |
K-PLS-rbf | 0.713 | 0.687 | 0.681 | 0.662 | 0.695 |
ANN-LM | 0.238 | 0.287 | 0.353 | 0.436 | 0.269 |
ANN-RP | 0.488 | 0.485 | 0.486 | 0.474 | 0.413 |
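One way to summarise the table above is to average each method's entries across the five resolution scenarios and rank the methods. The snippet below does this for a few rows copied from the table; it is illustrative post-processing, not from the paper, and it assumes the reported values are accuracy-type scores where higher is better:

```python
# A few rows excerpted from the table above (S-SR24 ... S-SR1).
scores = {
    "EN":     [0.738, 0.729, 0.732, 0.717, 0.728],
    "LASSO":  [0.730, 0.732, 0.719, 0.708, 0.713],
    "PLS":    [0.729, 0.709, 0.717, 0.709, 0.738],
    "MLR":    [0.555, 0.705, 0.505, 0.366, 0.410],
    "ANN-LM": [0.238, 0.287, 0.353, 0.436, 0.269],
}

# Mean score per method across the five resolution scenarios.
mean_score = {m: sum(v) / len(v) for m, v in scores.items()}

# Methods ordered from best to worst average score.
ranking = sorted(mean_score, key=mean_score.get, reverse=True)
print(ranking)
```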
1. For i = 1:(Number of Outer Cycles), perform:
2. Apply a paired t-test to assess the statistical significance of the difference in prediction performance for all pairs of methods.
3. Using the p-values of the paired statistical tests, compute the overall performance criteria.
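Step 2 of the framework can be sketched as follows: a minimal pure-Python paired t-test on per-repetition RMSE values for two methods. The numbers are hypothetical, not results from the study:

```python
import math
import statistics

# Hypothetical per-repetition test RMSEs for two competing methods.
rmse_a = [0.49, 0.51, 0.48, 0.50, 0.52, 0.47, 0.50, 0.49, 0.51, 0.48]
rmse_b = [0.55, 0.58, 0.54, 0.57, 0.56, 0.55, 0.58, 0.54, 0.56, 0.57]

# Paired t statistic: mean of the pairwise differences divided by its
# standard error (both methods are evaluated on the same data splits).
diffs = [a - b for a, b in zip(rmse_a, rmse_b)]
n = len(diffs)
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
print("paired t statistic:", t_stat)
```

A |t| well beyond the two-sided 5% critical value (about 2.26 for 9 degrees of freedom) indicates a statistically significant difference between the two methods; the p-values from such tests feed step 3.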
Property | Number of Samples | Min. | Max. | Mean | SD
---|---|---|---|---|---
RON | 243 | 96.80 | 102.50 | 100.38 | 0.97
Resolution Scenario | Number of Samples | Number of Predictors | Missing Data (%)
---|---|---|---
Raw Data | 1,048,320 | 41 | 0.02
After Cleaning | 1,048,320 | 41 | 3.56
S-SR24 | 243 | 41 | 0.00
S-SR4 | 243 | 41 | 0.00
S-SR3 | 243 | 41 | 0.00
S-SR2 | 243 | 41 | 0.00
S-SR1 | 243 | 41 | 0.00
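The last column of the table above appears to be the percentage of missing values, which rises after cleaning because observations flagged as outliers are set to missing before imputation. A minimal sketch of that mechanic (assumed behaviour, not the authors' code, with a hypothetical threshold-based outlier rule):

```python
import math

def pct_missing(table):
    # Percentage of cells that are None or NaN in a list-of-rows table.
    cells = [v for row in table for v in row]
    n_miss = sum(1 for v in cells
                 if v is None or (isinstance(v, float) and math.isnan(v)))
    return 100 * n_miss / len(cells)

# Tiny example: one sensor dropout (None) plus one gross outlier (250.0).
raw = [[1.0, 2.0, None], [4.0, 250.0, 6.0], [7.0, 8.0, 9.0]]

# Cleaning step: values failing a (hypothetical) plausibility check
# become NaN, so the missing-data percentage increases.
cleaned = [[v if v is None or abs(v) < 100 else float("nan") for v in row]
           for row in raw]

print(pct_missing(raw), pct_missing(cleaned))
```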
Method | S-SR24 | S-SR4 | S-SR3 | S-SR2 | S-SR1 |
---|---|---|---|---|---|
MLR | 0.607 | 0.529 | 0.637 | 0.666 | 0.624 |
FSR | 0.500 | 0.538 | 0.548 | 0.571 | 0.571 |
RR | 0.494 | 0.513 | 0.560 | 0.578 | 0.464 |
LASSO | 0.493 | 0.508 | 0.502 | 0.511 | 0.493 |
EN | 0.486 | 0.510 | 0.490 | 0.500 | 0.477 |
SVR-poly | 0.533 | 0.510 | 0.531 | 0.558 | 0.506 |
SVR-rbf | 0.531 | 0.510 | 0.530 | 0.552 | 0.502 |
SVR-linear | 0.586 | 0.600 | 0.561 | 0.551 | 0.585 |
PCR | 0.537 | 0.548 | 0.530 | 0.544 | 0.533 |
PCR-FS | 0.597 | 0.612 | 0.606 | 0.627 | 0.633 |
PLS | 0.494 | 0.530 | 0.502 | 0.508 | 0.471 |
Bagging | 0.545 | 0.595 | 0.550 | 0.574 | 0.599 |
RF | 0.546 | 0.595 | 0.561 | 0.566 | 0.581 |
Boosting | 0.520 | 0.559 | 0.535 | 0.539 | 0.539 |
K-PCR-poly | 0.752 | 0.766 | 0.731 | 0.752 | 0.745 |
K-PCR-rbf | 0.509 | 0.540 | 0.517 | 0.507 | 0.489 |
K-PLS-poly | 0.918 | 0.896 | 0.856 | 0.880 | 0.851 |
K-PLS-rbf | 0.510 | 0.547 | 0.532 | 0.544 | 0.504 |
ANN-LM | 0.852 | 0.860 | 0.782 | 0.725 | 0.811 |
ANN-RP | 0.690 | 0.732 | 0.690 | 0.692 | 0.718 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Dias, T.; Oliveira, R.; Saraiva, P.M.; Reis, M.S. Linear and Non-Linear Soft Sensors for Predicting the Research Octane Number (RON) through Integrated Synchronization, Resolution Selection and Modelling. Sensors 2022, 22, 3734. https://doi.org/10.3390/s22103734