Article

Just-in-Time Correntropy Soft Sensor with Noisy Data for Industrial Silicon Content Prediction

1 Department of Electrical and Information Engineering, Shaoxing University, Shaoxing 312000, China
2 Institute of Process Equipment and Control Engineering, Zhejiang University of Technology, Hangzhou 310014, China
* Author to whom correspondence should be addressed.
Sensors 2017, 17(8), 1830; https://doi.org/10.3390/s17081830
Submission received: 28 May 2017 / Revised: 27 July 2017 / Accepted: 27 July 2017 / Published: 8 August 2017
(This article belongs to the Special Issue Soft Sensors and Intelligent Algorithms for Data Fusion)

Abstract

Development of accurate data-driven quality prediction models for industrial blast furnaces encounters several challenges, mainly because the collected data are nonlinear, non-Gaussian, and unevenly distributed. In this work, a just-in-time correntropy-based local soft sensing approach is presented to predict the silicon content. Without cumbersome efforts for outlier detection, a correntropy support vector regression (CSVR) modeling framework is proposed to handle soft sensor development and outlier detection simultaneously. Moreover, with a continuously updated database and a clustering strategy, a just-in-time CSVR (JCSVR) method is developed. Consequently, more accurate prediction and more efficient implementation of JCSVR can be achieved. The better prediction performance of JCSVR is validated on online silicon content prediction, in comparison with traditional soft sensors.

1. Introduction

The silicon content is an important index of the thermal state in industrial blast furnaces. It must be controlled appropriately to facilitate stable production. To achieve this goal, the state of the silicon content in hot metal should be known accurately online. Previously, there was extensive research on the kinetic and thermodynamic behaviors occurring in the blast furnace iron-making process [1,2,3,4,5,6]. Nevertheless, an accurate mechanistic model suitable for industrial applications is still not available. Nowadays, process data can be easily measured in industrial blast furnaces. To explore and utilize useful information hidden in the data, several empirical soft-sensing approaches have been applied to predict the silicon content online. Existing popular data-driven methods include artificial neural networks [7,8,9,10,11,12,13,14], multivariate regression [14,15], time series analysis [16,17,18,19], fuzzy systems [20], subspace identification [21], support vector regression (SVR) and least squares SVR (LSSVR) [22,23,24,25], and multi-scale and multiple models [26,27,28,29,30]. These data-driven empirical models for short-term silicon content prediction can be constructed quickly [31,32,33].
Actually, the development of a good soft sensor model is not easy. This is mainly because the modeling data are noisy and often contain unwanted outliers, which may come from instrument degradation, transmission problems, etc. Generally, a good soft sensor model depends on high-quality modeling data. Different kinds of noise should be considered when training artificial neural networks and other data-driven models [34,35,36]. Without enough attention, a soft sensor model trained with outliers and inappropriate data may tend to overfit and, thus, lead to unreliable predictions. For practical use, a reliable prediction model should be constructed by reducing the negative effect of outliers. Generally, obvious outliers can be deleted by most traditional outlier detection methods [37,38,39,40]. However, it is not easy to detect inconspicuous outliers, mainly because they may be masked by their adjacent data. In our opinion, soft sensor development and outlier detection should be integrated into a unified framework rather than separated into two tasks.
For industrial blast furnace iron-making processes, a single global/fixed model cannot describe the complex process characteristics. Additionally, it is difficult to update global models quickly when the process dynamics are changing [41]. Nurkkala et al. [30] proposed multiple autoregressive vector models to describe complex systems. To construct local models automatically, several just-in-time learning methods have been utilized for nonlinear process modeling problems [42,43,44]. Different from most traditional soft sensors, just-in-time models are built in a lazy learning manner when a query sample needs to be predicted. Consequently, the prediction for the query sample can be optimized locally, which can improve the prediction performance. For silicon content prediction, Liu and Gao [25] utilized the just-in-time LSSVR (JLSSVR) modeling method to better describe the process nonlinearity directly. Unfortunately, the data samples utilized for construction of a JLSSVR model are assigned the same weight regardless of their different effects. In such a situation, the negative effect of outliers may not be removed.
In this work, a novel online local model is developed for reliable prediction of the industrial silicon content. To handle noisy data with non-Gaussian and uneven distributions, a just-in-time correntropy SVR (JCSVR) soft sensor is proposed. Compared with traditional soft sensors, the proposed JCSVR method is more reliable and practical in two ways. First, by reducing the negative effect of outliers, more accurate prediction of the silicon content can be obtained. Second, the reliability of the database can be improved gradually. These two properties make the JCSVR method better suited for long-term utilization.
The remainder of this work is organized as follows. The correntropy SVR (CSVR) soft sensing approach is formulated in Section 2. In Section 3, the clustering-based JCSVR local modeling method is proposed, together with the database maintenance strategy. In Section 4, the JCSVR method is applied to online silicon content prediction. Finally, conclusions are drawn in Section 5.

2. CSVR-Based Soft Sensor Model

In this section, the integration of the maximum correntropy criterion [45,46] and SVR into a unified CSVR framework is formulated. The CSVR model $f$ is fitted to a dataset $S = \{X, Y\}$, where $X = \{x_i\}_{i=1}^{N}$ and $Y = \{y_i\}_{i=1}^{N}$ are the $N$ input and output data samples, respectively. The relationship is formulated as [45]:
$$y_i = f(x_i; \theta) + e_i = f(x_i; w, b) + e_i = w^T \phi(x_i) + b + e_i, \quad i = 1, \ldots, N$$
where $f(\cdot)$ is the model; $e_i$ is the noise term of the $i$th sample; and $x_i$ is an input vector composed of several online-measured variables. The model parameter vector and the bias are $w$ and $b$, respectively, and $\theta = [w^T, b]^T$. The CSVR model is obtained by solving the optimization problem below [45]:
$$\min_{w, b} \; J(w, b, \rho) = \frac{\gamma}{2} \sum_{i=1}^{N} \rho(e_i) e_i^2 + \frac{1}{2} \|w\|^2 \quad \text{s.t.} \quad y_i - w^T \phi(x_i) - b - e_i = 0, \quad i = 1, \ldots, N$$
where $\gamma$ ($\gamma > 0$) is the regularization parameter determining the trade-off between the approximation accuracy and the model complexity. Several approaches [45] are available for selecting the kernel width $\sigma$ of the correntropy terms $\rho(e_i) = \frac{1}{\sigma^3 \sqrt{2\pi}} \exp\!\left(-\frac{e_i^2}{2\sigma^2}\right)$. Here, it is simply adopted as $\sigma = \frac{\max_i |e_i|}{2\sqrt{2}}$, $i = 1, \ldots, N$ [46].
According to Liu and Chen [46], a two-step iterative algorithm is adopted to solve the above problem in Equation (2). Finally, a CSVR soft sensor model is established. For a test sample $x_t$, its prediction $\hat{y}_t$ is formulated below:
$$\hat{y}_t = f(w, b; x_t) = \sum_{i=1}^{N} \alpha_i \langle \phi(x_i), \phi(x_t) \rangle + b = \alpha^T k_t + b$$
where $k_t = [k_{t1}, \ldots, k_{tN}]^T \in \mathbb{R}^{N \times 1}$, with $k_{ti} = \langle \phi(x_t), \phi(x_i) \rangle$, $i = 1, \ldots, N$, is a kernel column vector.
For an established CSVR model, the corresponding weight of a training sample $x_i$ is $\rho(e_i) = \frac{1}{\sigma^3 \sqrt{2\pi}} \exp\!\left(-\frac{e_i^2}{2\sigma^2}\right)$. Using these weights, the uncertainty of the training data can be quantified. Generally, the outliers are only a small portion of all data, and they are automatically assigned relatively smaller weights [46]. Consequently, candidate outliers can be identified using the simple criterion in Equation (4):
$$\rho(e_i) < \bar{\rho}$$
where $\bar{\rho}$ is a cut-off value, which can be chosen as a small value less than 1 after all the weights $\rho(e_i)$, $i = 1, \ldots, N$, have been normalized into the range [0, 1].
In summary, the candidate outliers can be detected simultaneously using the weights of an established CSVR model. Interestingly, although the outliers are temporarily not removed, they cannot degrade the prediction performance of CSVR, owing to their relatively small weights [45,46]. At a glance, the CSVR method is similar to some weighted SVR methods, e.g., those in [47,48]. However, most weighted SVR methods are heuristic [47,48], and for complex industrial data it is difficult to design suitable weighting strategies. Unlike those heuristic schemes, a reliable CSVR model can be constructed more directly for noisy data.
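To make the formulation above more concrete, the following Python sketch illustrates one possible realization of the two-step iterative CSVR training under simplifying assumptions: the equality-constrained problem in Equation (2) is solved as a weighted LSSVR-type linear system with a Gaussian kernel, the correntropy weights are then recomputed from the residuals, and Equation (4) flags candidate outliers. The function names (rbf_kernel, correntropy_weights, csvr_fit, csvr_predict) and the default parameter values are illustrative assumptions, not taken from the original paper.

```python
import numpy as np

def rbf_kernel(A, B, width=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def correntropy_weights(e):
    """Correntropy-induced weights of the residuals, normalized to [0, 1]."""
    sigma = np.max(np.abs(e)) / (2.0 * np.sqrt(2.0)) + 1e-12  # kernel width heuristic
    w = np.exp(-e ** 2 / (2.0 * sigma ** 2))
    return w / w.max()

def csvr_fit(X, y, gamma=10.0, width=1.0, n_iter=10):
    """Iteratively reweighted LSSVR-style approximation of the CSVR problem."""
    N = len(y)
    K = rbf_kernel(X, X, width)
    rho = np.ones(N)                              # start with equal weights
    for _ in range(n_iter):
        # Step 1: solve the weighted least-squares dual for alpha and b
        D = np.diag(1.0 / (gamma * rho))
        A = np.block([[K + D, np.ones((N, 1))],
                      [np.ones((1, N)), np.zeros((1, 1))]])
        sol = np.linalg.solve(A, np.append(y, 0.0))
        alpha, b = sol[:N], sol[N]
        # Step 2: update the correntropy weights from the residuals
        e = y - (K @ alpha + b)
        rho = correntropy_weights(e)
    return alpha, b, rho

def csvr_predict(alpha, b, X_train, X_query, width=1.0):
    """Prediction of Equation (3): y_hat = alpha^T k_t + b."""
    return rbf_kernel(X_query, X_train, width) @ alpha + b

# Candidate outliers via Equation (4): samples whose normalized weight < rho_bar.
# X, y = ...                          # training inputs and outputs
# alpha, b, rho = csvr_fit(X, y)
# outlier_idx = np.where(rho < 0.7)[0]
```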

3. Correntropy-Based Local Modeling and Prediction Method

3.1. JCSVR-Based Local Model

In this section, the JCSVR modeling method for online prediction of a query sample $x_q$ is described. First, similar samples are searched in the database $S$ to form a similar set $S_q$ using defined similarity criteria. Second, a JCSVR model $f_{JCSVR}(x_q)$ is established with $S_q$. Third, the prediction $\hat{y}_q$ for $x_q$ is obtained online. With the same procedure, a new JCSVR model can be constructed for the next query sample.
As a common similarity, the Euclidean-distance-based similarity index (SI) is defined below [42]:
$$SI_{qi} = \exp(-d_{qi}) = \exp(-\|x_i - x_q\|), \quad i = 1, \ldots, N$$
where $d_{qi} = \|x_i - x_q\|$ denotes the Euclidean distance between $x_q$ and the sample $x_i$ in the historical set. Obviously, $0 \le SI_{qi} \le 1$. When $SI_{qi}$ approaches 1, $x_q$ and $x_i$ are almost the same. Other similarity criteria (e.g., correlation-based similarity) [41,43,44] can also be utilized to search for similar samples.
To select a suitable dataset $S_q$ with $n_q$ similar samples, the $n_{\max}$ most similar samples are first ranked using the SI criterion in Equation (5). Correspondingly, a cumulative similarity factor (CSF), $CSF_q^n$, is defined below [44]:
$$CSF_q^n = \frac{\sum_{i=1}^{n_q} SI_{qi}}{\sum_{i=1}^{n_{\max}} SI_{qi}}, \quad n_q \le n_{\max}$$
where $CSF_q^n$ denotes the cumulative similarity of the $n_q$ most similar samples among all $n_{\max}$ samples. The CSF index determines the most similar samples in a simple way. For example, $CSF_q^n = 0.85$ means that the samples accounting for 85% of the cumulative similarity are chosen [44]. Using this similarity criterion, a similar dataset $S_q$ is obtained to construct the JCSVR model.
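A minimal sketch of the similarity search in Equations (5) and (6) could look as follows; the helper name select_similar_samples, the default n_max, and the 0.85 CSF threshold (borrowed from the example above) are assumptions for illustration only.

```python
import numpy as np

def select_similar_samples(X_hist, y_hist, x_q, n_max=100, csf_threshold=0.85):
    """Select a similar set S_q for the query x_q using the SI and CSF criteria."""
    # Equation (5): Euclidean-distance-based similarity index
    d = np.linalg.norm(X_hist - x_q, axis=1)
    si = np.exp(-d)
    # Rank the n_max most similar samples
    order = np.argsort(si)[::-1][:min(n_max, len(si))]
    si_ranked = si[order]
    # Equation (6): keep the n_q samples whose cumulative similarity reaches the threshold
    csf = np.cumsum(si_ranked) / si_ranked.sum()
    n_q = min(int(np.searchsorted(csf, csf_threshold)) + 1, len(order))
    idx = order[:n_q]
    return X_hist[idx], y_hist[idx]
```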

3.2. Implementations of the Proposed Method

In this section, the JCSVR-based online modeling method is enhanced for relatively long-term utilization. Generally, there are some outliers in the initial training dataset. In the offline modeling stage, the CSVR method is first applied to the initial training dataset. After this preprocessing step, some outliers can be identified. Additionally, to make the JCSVR method more computationally efficient, the training data are clustered into several groups. This divides the whole dataset into several subsets, and the data in each subset show similar characteristics. Consequently, for online prediction of a query sample, only its similar data are searched, which improves the computational efficiency. The step-by-step procedure of the JCSVR-based modeling method for online silicon content prediction is summarized below:
Step 1.
Collect the process input and output data, i.e., $S = \{X, Y\}$, for training of the CSVR model.
Step 2.
Train a CSVR model using the common cross-validation training strategy [46]. The weights $\rho(e_i)$, $i = 1, \ldots, N$, are obtained simultaneously. Then normalize all the weights into the range [0, 1]. Use Equation (4) to identify the outliers and assign them to an outlier set $S_{outlier}$. The relatively clean dataset is denoted as $S_{normal} = \{X_{normal}, Y_{normal}\}$.
Step 3.
Applying a simple fuzzy c-means (FCM) clustering approach [49] to $S_{normal}$, the training samples are clustered into $l$ sub-classes, denoted as $\{S_{normal,1}, S_{normal,2}, \ldots, S_{normal,l}\}$. For $X_{normal}$, each sub-class has a center, denoted as $\{c_{normal,1}, c_{normal,2}, \ldots, c_{normal,l}\}$.
Step 4.
For online prediction of a new input measurement $x_q$, determine which of the sub-class centers $\{c_{normal,1}, c_{normal,2}, \ldots, c_{normal,l}\}$ is nearest to it. If $c_{normal,j}$ is the nearest to $x_q$, search the similar set $S_q$ only in $S_{normal,j}$ using the similarity criterion (Equations (5) and (6)). A JCSVR model for $x_q$ can then be constructed online and the prediction $\hat{y}_q$ is obtained.
Step 5.
If new training data $S_{new} = \{X_{new}, Y_{new}\}$ are available, combine these data into $S$ (i.e., $S = S_{new} \cup S$) and go to Step 1. Otherwise, go to Step 4 and repeat the same procedure for online prediction of another new input $x_{q+1}$.
The main steps of the JCSVR-based soft sensor modeling and prediction are summarized in Figure 1. For industrial data, candidate outliers are identified simply, without considerable effort. Additionally, Step 2 and Step 3 can be implemented offline, which improves the computational efficiency of the online JCSVR modeling method. Consequently, the proposed JCSVR-based local method is suitable for relatively long-term utilization for silicon content prediction.
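The online stage (Steps 4 and 5) can be sketched as below, assuming the FCM sub-class centers and the relatively clean subsets were prepared offline in Steps 2 and 3, and reusing the illustrative helpers csvr_fit, csvr_predict, and select_similar_samples from the earlier sketches; names and defaults are assumptions, not the authors' implementation.

```python
import numpy as np

def jcsvr_predict_online(x_q, centers, subsets, gamma=10.0, width=1.0):
    """Step 4: nearest-center lookup, local similar-set search, and JCSVR prediction.

    centers : list of FCM sub-class centers computed offline (Step 3)
    subsets : list of (X_j, y_j) pairs, the relatively clean data of each sub-class
    """
    # Find the nearest sub-class center for the query sample
    j = int(np.argmin([np.linalg.norm(x_q - c) for c in centers]))
    X_j, y_j = subsets[j]
    # Search the similar set S_q only within that sub-class (Equations (5) and (6))
    X_sim, y_sim = select_similar_samples(X_j, y_j, x_q)
    # Build the local CSVR model on S_q and predict the query
    alpha, b, _ = csvr_fit(X_sim, y_sim, gamma=gamma, width=width)
    return csvr_predict(alpha, b, X_sim, x_q[None, :], width=width)[0]

# Step 5 (database maintenance): when new samples arrive, append them to S and
# rerun the offline CSVR screening and FCM clustering before further predictions.
# S = np.vstack([S_new, S])  # conceptual update; details depend on the data store
```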

4. Industrial Silicon Content Prediction

The presented JCSVR-based local modeling method is applied to online prediction of the silicon content in an industrial blast furnace in China. The input variables correlated with the silicon content include the blast temperature, the blast volume, the gas permeability, the top pressure, the top temperature, the ore/coke ratio, and the pulverized coal injection [21,22,24]. The sampling time of most of these input variables is 1 min. Additionally, the time difference between the silicon content and the input variables is selected according to expert experience [31]. For example, the time difference between the silicon content and the top pressure is about 2 h. The silicon content is analyzed offline and infrequently. Consequently, the soft sensor is constructed using the online measured variables.
After simply removing obvious outliers using the 3-sigma criterion, a set of 440 data samples is investigated. The historical set consists of 240 data points; the remaining 200 data points are used for testing. It should be noted that the data are noisy and still contain some inconspicuous outliers. The normal probability plots of two input variables, the top pressure and the top temperature, are shown in Figure 2a,b, respectively. The distribution results indicate that the process variables violate the Gaussian distribution, denoted by the red lines in Figure 2a,b. The other process variables, not plotted here, are also non-Gaussian. Additionally, as illustrated in Figure 3, several input variables exhibit nonlinear relationships, and the data in different operating areas are distributed irregularly.
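As an illustration of the preprocessing step mentioned above, a simple 3-sigma screening of the raw samples might be implemented as follows; the function name and the variable-wise screening rule are assumptions for this sketch.

```python
import numpy as np

def three_sigma_filter(X, y):
    """Drop samples where any variable deviates more than 3 standard deviations."""
    Z = np.column_stack([X, y])
    mu, sd = Z.mean(axis=0), Z.std(axis=0)
    keep = (np.abs(Z - mu) <= 3.0 * sd).all(axis=1)
    return X[keep], y[keep]
```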
To show the advantage of JCSVR, it is compared with three SVR-based soft sensors, including JLSSVR [25], CSVR [46], and LSSVR [47]. To evaluate the prediction performance of different models, three indices of the root-mean-square error (RMSE), relative RMSE (simply noted as RE), and the hit rate (HR) [21,22,23,24,25,26,27,28,29,30,31] are utilized and defined below, respectively:
$$\text{RMSE} = \sqrt{\frac{1}{N_{\text{tst}}} \sum_{q=1}^{N_{\text{tst}}} \left( y_q - \hat{y}_q \right)^2}$$

$$\text{RE} = \sqrt{\frac{1}{N_{\text{tst}}} \sum_{q=1}^{N_{\text{tst}}} \left( \frac{y_q - \hat{y}_q}{y_q} \right)^2}$$

$$\text{HR} = \frac{\sum_{q=1}^{N_{\text{tst}}} H_q}{N_{\text{tst}}} \times 100\%, \quad \text{where} \quad H_q = \begin{cases} 1, & |\hat{y}_q - y_q| \le 0.1 \\ 0, & \text{otherwise} \end{cases}$$
where $y_q$ and $\hat{y}_q$ are the actual and predicted values, respectively, and $N_{\text{tst}}$ is the number of test data points.
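For reference, the three indices in Equations (7)-(9) can be computed as in the short sketch below; the function name prediction_metrics and the 0.1 hit-rate tolerance (taken from Equation (9)) are the only assumptions.

```python
import numpy as np

def prediction_metrics(y_true, y_pred, hr_tolerance=0.1):
    """RMSE, relative RMSE (RE, %) and hit rate (HR, %) of Equations (7)-(9)."""
    err = y_true - y_pred
    rmse = np.sqrt(np.mean(err ** 2))
    re = np.sqrt(np.mean((err / y_true) ** 2)) * 100.0
    hr = np.mean(np.abs(err) <= hr_tolerance) * 100.0
    return rmse, re, hr
```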
The effect of a CSVR model is first investigated. After training, the main results of a CSVR model, including the weights $\rho(e_i)$, are shown in Figure 4. Using the correntropy-based strategy, the outliers are assigned smaller weights than most normal samples. As a result, the bad influence of outliers can be reduced. Here, the cut-off parameter is selected as $\bar{\rho} = 0.7$. As shown in the bottom subplot of Figure 4, some candidate outliers can be identified directly. In total, 44 candidate outliers are chosen from the 240 training data points. This proportion of abnormal data (44/240 ≈ 18.3%) indicates that the training data are noisy and contain several inconspicuous outliers. If the negative effect of these outliers is not reduced, the prediction performance of the established soft sensors cannot be good.
For comparison, the performance indices of the CSVR and LSSVR methods on the training data are listed in Table 1. The fitting results of both the CSVR and LSSVR methods are not good. One main reason is that the data are noisy, non-Gaussian, and unevenly distributed. If a model fits all noisy training data, especially the outliers, over-fitting occurs. It can be noticed that the traditional LSSVR model treats all the training data equally and thus cannot provide additional information about the training data. Different from LSSVR, the CSVR model can distinguish outliers from normal data and assign the training data suitable weights.
The dynamics may change in an industrial blast furnace. In such a situation, a fixed soft sensor model may not be accurate for future data [25,30]. Here, the proposed JCSVR-based method is compared with a recent local modeling method, JLSSVR [25]. For the test data, the online prediction results and the corresponding absolute prediction errors ($|y_q - \hat{y}_q|$) of the JCSVR and JLSSVR methods are shown in Figure 5 and Figure 6, respectively. To show the results more clearly, only the first 70 testing samples are plotted. As aforementioned, clean data are needed for online construction of a good local model. With unwanted outliers, the prediction performance of a local model may be unreliable. The prediction results shown in Figure 5 and Figure 6 therefore indicate that JCSVR is superior to JLSSVR for industrial data-driven modeling problems with noisy data.
The main properties of the JCSVR, JLSSVR, CSVR, and LSSVR approaches are summarized in Table 2. Briefly, outlier identification and local modeling are integrated into the JCSVR method. Detailed comparisons of online silicon content prediction by the four methods are listed in Table 3. They show that the JCSVR method achieves the best prediction performance on the test set. Additionally, local models are generally more accurate than their global counterparts. For example, JCSVR shows better prediction performance than using only a CSVR model.
To show the relative prediction errors (i.e., $(y_q - \hat{y}_q)/y_q$) of the four methods, their corresponding box plots are shown in Figure 7. The band inside each box is the median value, and the box edges denote the first and third quartiles; a few outliers are shown individually. Among the four methods, JCSVR exhibits the narrowest distribution, and its median value is nearest to 0. These results imply that JCSVR has the best prediction performance. One main reason is that the database is maintained continually. In contrast, without maintenance of the database, JLSSVR and LSSVR become unreliable and are not suitable for long-term prediction. This is a common problem of traditional soft sensor models utilized in industrial processes. Based on all the prediction results and analysis, JCSVR is the most suitable one among all of the methods.

5. Conclusions

This work has proposed a correntropy-based local soft sensor modeling method for silicon content prediction when the collected data contain uncertainties. Its main distinguishing characteristics are summarized as follows. First, the soft sensor and outlier detection are integrated into a CSVR modeling framework. By simply removing the candidate outliers, the updated historical data become more reliable for the construction of local models. Second, by incorporating the database update into the clustering-based JCSVR method, better prediction performance can be achieved. Consequently, the proposed method can reduce the effect of outliers. Compared with several methods, better silicon content prediction results are obtained with JCSVR. There are still several interesting research directions worth investigating. First, other forms of correntropy can be adopted to adapt to the uncertainty of sensor data. Second, the development of efficient feature extraction methods for noisy data is interesting. Third, how to incorporate process knowledge to further improve the prediction accuracy is important and challenging.

Acknowledgments

The authors would like to gratefully acknowledge the National Natural Science Foundation of China (grant no. 61640312) and Foundation of Key Laboratory of Advanced Process Control for Light Industry (Jiangnan University), Ministry of Education, China (grant no. APCLI1603) for their financial support.

Author Contributions

K. Chen and Y. Liu conceived and designed the experiments; Y. Liang performed the experiments; K. Chen and Y. Liu analyzed the data; Y. Liu contributed analysis tools; K. Chen, Z. Gao and Y. Liu wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CSF: Cumulative Similarity Factor
CSVR: Correntropy Support Vector Regression
HR: Hit Rate
JCSVR: Just-in-time Correntropy Support Vector Regression
JLSSVR: Just-in-time Least Squares Support Vector Regression
LSSVR: Least Squares Support Vector Regression
RE: Relative Root-Mean-Square Error
RMSE: Root-Mean-Square Error
SI: Similarity Index
SVR: Support Vector Regression

References

  1. Wijk, O.; Torssell, K. Prediction of the blast furnace process by a mathematical model. ISIJ Int. 1992, 32, 481–488. [Google Scholar]
  2. Sugawara, K.; Morimoto, K.; Sugawara, T.; Dranoff, J.S. Dynamic behavior of iron forms in rapid reduction of carbon-coated iron ore. AIChE J. 1999, 45, 574–580. [Google Scholar] [CrossRef]
  3. Radhakrishnan, V.R.; Ram, K.M. Mathematical model for predictive control of the bell-less top charging system of a blast furnace. J. Process. Control. 2001, 11, 565–586. [Google Scholar] [CrossRef]
  4. Nogami, H.; Chu, M.; Yagi, J.I. Multi-dimensional transient mathematical simulator of blast furnace process based on multi-fluid and kinetic theories. Comput. Chem. Eng. 2005, 29, 2438–2448. [Google Scholar] [CrossRef]
  5. Nishioka, K.; Maeda, T.; Shimizu, M. A three-dimensional mathematical modelling of drainage behavior in blast furnace hearth. ISIJ Int. 2005, 45, 669–676. [Google Scholar] [CrossRef]
  6. Ueda, S.; Natsui, S.; Nogami, H.; Jun-Ichiro, Y.; Ariyama, T. Recent progress and future perspective on mathematical modeling of blast furnace. ISIJ Int. 2010, 50, 914–923. [Google Scholar] [CrossRef]
  7. Singh, H.; Sridhar, N.V.; Deo, B. Artificial neural nets for prediction of silicon content of blast furnace hot metal. Steel Res. Int. 1996, 67, 521–526. [Google Scholar] [CrossRef]
  8. Radhakrishnan, V.R.; Mohamed, A.R. Neural networks for the identification and control of blast furnace hot metal quality. J. Process. Control. 2000, 10, 509–524. [Google Scholar] [CrossRef]
  9. Chen, J. A predictive system for blast furnaces by integrating a neural network with qualitative analysis. Eng. Appl. Artif. Intell. 2001, 14, 77–85. [Google Scholar] [CrossRef]
  10. Jimenez, J.; Mochon, J.; Sainz, D.A.J.; Obeso, F. Blast furnace hot metal temperature prediction through neural networks-based models. ISIJ Int. 2007, 44, 573–580. [Google Scholar] [CrossRef]
  11. Saxen, H.; Pettersson, F. Nonlinear prediction of the hot metal silicon content in the blast furnace. ISIJ Int. 2007, 47, 1732–1737. [Google Scholar] [CrossRef]
  12. Pettersson, F.; Chakraborti, N.; Saxén, H. A genetic algorithms based multi-objective neural net applied to noisy blast furnace data. Appl. Soft Comput. 2007, 7, 387–397. [Google Scholar] [CrossRef]
  13. Nurkkala, A.; Pettersson, F.; Saxén, H. Nonlinear modeling method applied to prediction of hot metal silicon in the ironmaking blast furnace. Ind. Eng. Chem. Res. 2011, 50, 9236–9248. [Google Scholar] [CrossRef]
  14. Hao, X.; Shen, F.; Du, G.; Shen, Y.; Xie, Z. A blast furnace prediction model combining neural network with partial least square regression. Steel Res. Int. 2005, 76, 694–699. [Google Scholar] [CrossRef]
  15. Bhattacharya, T. Prediction of silicon content in blast furnace hot metal using partial least squares (PLS). ISIJ Int. 2005, 45, 1943–1945. [Google Scholar] [CrossRef]
  16. Waller, M.; Saxén, H. On the development of predictive models with applications to a metallurgical process. Ind. Eng. Chem. Res. 2000, 39, 982–988. [Google Scholar]
  17. Gao, C.; Zhou, Z.; Chen, J. Assessing the predictability for blast furnace system through nonlinear time series analysis. Ind. Eng. Chem. Res. 2008, 47, 3037–3045. [Google Scholar] [CrossRef]
  18. Waller, M.; Saxen, H. Application of nonlinear time series analysis to the prediction of silicon content of pig iron. ISIJ Int. 2002, 42, 316–318. [Google Scholar] [CrossRef]
  19. Miyano, T.; Kimoto, S.; Shibuta, H.; Nakashima, K.; Ikenaga, Y.; Aihara, K. Time series analysis and prediction on complex dynamical behavior observed in a blast furnace. Phys. D: Nonlinear Phenom. 2000, 135, 305–330. [Google Scholar] [CrossRef]
  20. Martin, R.D.; Obeso, F.; Mochon, J.; Barea, R.; Jimenez, J. Hot metal temperature prediction in blast furnace using advanced model based on fuzzy logic tools. Ironmak. Steelmak. 2007, 34, 241–247. [Google Scholar] [CrossRef]
  21. Zeng, J.S.; Gao, C.H. Improvement of identification of blast furnace ironmaking process by outlier detection and missing value imputation. J. Process. Control. 2009, 19, 1519–1528. [Google Scholar] [CrossRef]
  22. Jian, L.; Gao, C.; Xia, Z. A sliding-window smooth support vector regression model for nonlinear blast furnace system. Steel Res. Int. 2011, 82, 169–179. [Google Scholar] [CrossRef]
  23. Gao, C.; Jian, L.; Luo, S. Modeling of the thermal state change of blast furnace hearth with support vector machines. IEEE Trans. Ind. Electron. 2012, 59, 1134–1145. [Google Scholar] [CrossRef]
  24. Jian, L.; Shen, S.; Song, Y. Improving the solution of least squares support vector machines with application to a blast furnace system. J. Appl. Math. 2012, 2012, 12. [Google Scholar] [CrossRef]
  25. Liu, Y.; Gao, Z. Enhanced just-in-time modelling for online quality prediction in bf ironmaking. Ironmak. Steelmak. 2015, 42, 321–330. [Google Scholar] [CrossRef]
  26. Gao, C.; Chen, J.; Zeng, J.; Liu, X.; Sun, Y. A chaos-based iterated multistep predictor for blast furnace ironmaking process. AIChE J. 2009, 55, 947–962. [Google Scholar] [CrossRef]
  27. Luo, S.; Gao, C.; Zeng, J.; Huang, J. Blast furnace system modeling by multivariate phase space reconstruction and neural networks. Asian J. Control. 2013, 15, 553–561. [Google Scholar] [CrossRef]
  28. Gao, C.; Zeng, J.; Zhou, Z. Identification of multiscale nature and multiple dynamics of the blast furnace system from operating data. AIChE J. 2011, 57, 3448–3458. [Google Scholar] [CrossRef]
  29. Chu, Y.; Gao, C. Data-based multiscale modeling for blast furnace system. AIChE J. 2014, 60, 2197–2210. [Google Scholar] [CrossRef]
  30. Nurkkala, A.; Pettersson, F.; Saxén, H. A study of blast furnace dynamics using multiple autoregressive vector models. ISIJ Int. 2012, 52, 1763–1770. [Google Scholar] [CrossRef]
  31. Saxén, H.; Gao, C.; Gao, Z. Data-driven time discrete models for dynamic prediction of the hot metal silicon content in the blast furnace—A review. IEEE Trans. Ind. Inform. 2013, 9, 2213–2225. [Google Scholar] [CrossRef]
  32. Kano, M.; Nakagawa, Y. Data-based process monitoring, process control, and quality improvement: Recent developments and applications in steel industry. Comput. Chem. Eng. 2008, 32, 12–24. [Google Scholar] [CrossRef] [Green Version]
  33. Kadlec, P.; Gabrys, B.; Strandt, S. Data-driven soft sensors in the process industry. Comput. Chem. Eng. 2009, 33, 795–814. [Google Scholar] [CrossRef]
  34. Zur, R.M.; Jiang, Y.; Pesce, L.L.; Drukker, K. Noise injection for training artificial neural networks: A comparison with weight decay and early stopping. Med. Phys. 2009, 36, 4810–4818. [Google Scholar] [CrossRef] [PubMed]
  35. Arbabi, V.; Pouran, B.; Campoli, G.; Weinans, H.; Zadpoor, A.A. Determination of the mechanical and physical properties of cartilage by coupling poroelastic-based finite element models of indentation with artificial neural networks. J. Biomech. 2016, 49, 631–637. [Google Scholar] [CrossRef] [PubMed]
  36. Arbabi, V.; Pouran, B.; Weinans, H.; Zadpoor, A.A. Combined inverse-forward artificial neural networks for fast and accurate estimation of the diffusion coefficients of cartilage based on multi-physics models. J. Biomech. 2016, 49, 2799–2805. [Google Scholar] [CrossRef] [PubMed]
  37. Liu, H.; Shah, S.; Jiang, W. Online outlier detection and data cleaning. Comput. Chem. Eng. 2004, 28, 1635–1647. [Google Scholar] [CrossRef]
  38. Pearson, R.K. Outliers in process modeling and identification. IEEE Trans. Control. Syst. Technol. 2002, 10, 55–63. [Google Scholar] [CrossRef]
  39. Chiang, L.H.; Pell, R.J.; Seasholtz, M.B. Exploring process data with the use of robust outlier detection algorithms. J. Process. Control. 2003, 13, 437–449. [Google Scholar] [CrossRef]
  40. Hodge, V.J.; Austin, J. A survey of outlier detection methodologies. Artif. Intell. Rev. 2004, 22, 85–126. [Google Scholar] [CrossRef]
  41. Fujiwara, K.; Kano, M.; Hasebe, S.; Takinami, A. Soft-sensor development using correlation-based just-in-time modeling. AIChE J. 2009, 55, 1754–1765. [Google Scholar] [CrossRef]
  42. Bontempi, G.; Birattari, M.; Bersini, H. Lazy learning for local modelling and control design. Int. J. Control. 1999, 72, 643–658. [Google Scholar] [CrossRef]
  43. Cheng, C.; Chiu, M.S. A new data-based methodology for nonlinear process modeling. Chem. Eng. Sci. 2004, 59, 2801–2810. [Google Scholar] [CrossRef]
  44. Liu, Y.; Chen, J. Integrated soft sensor using just-in-time support vector regression and probabilistic analysis for quality prediction of multi-grade processes. J. Process. Control. 2013, 23, 793–804. [Google Scholar] [CrossRef]
  45. Liu, W.; Pokharel, P.P.; Principe, J.C. Correntropy: Properties and applications in non-gaussian signal processing. IEEE Trans. Signal. Process. 2007, 55, 5286–5298. [Google Scholar] [CrossRef]
  46. Liu, Y.; Chen, J. Correntropy kernel learning for nonlinear system identification with outliers. Ind. Eng. Chem. Res. 2014, 53, 5248–5260. [Google Scholar] [CrossRef]
  47. Suykens, J.A.K.; Van Gestel, T.; De Brabanter, J.; De Moor, B.; Vandewalle, J. Least Squares Support Vector Machines; World Scientific: Singapore, 2002. [Google Scholar]
  48. Wen, W.; Hao, Z.; Yang, X. A heuristic weight-setting strategy and iteratively updating algorithm for weighted least-squares support vector regression. Neurocomputing 2008, 71, 3096–3103. [Google Scholar] [CrossRef]
  49. Miyamoto, S.; Ichihashi, H.; Honda, K. Algorithms for Fuzzy Clustering: Methods in c-Means Clustering with Applications; Springer: New York, NY, USA, 2008. [Google Scholar]
Figure 1. Main steps of the proposed soft sensor modeling method, integrating both the offline implementation and the online JCSVR method.
Figure 2. (a) The normal probability plot of the top pressure variable in the training set; and (b) the normal probability plot of the top temperature variable in the training set.
Figure 3. The spatial relationship of several process input variables in the training set.
Figure 4. The trained CSVR model fitting the data, with the normalized weights $\rho(e_i)$. As a by-product, those data with $\rho(e_i) < \bar{\rho} = 0.7$ can be simply identified as candidate outliers.
Figure 5. Comparison results of online silicon content prediction using JCSVR and JLSSVR models (test set).
Figure 6. Comparison results of online silicon content prediction error using JCSVR and JLSSVR models (test set).
Figure 7. Comparison results of silicon content prediction relative error distribution with JCSVR, JLSSVR, CSVR, and LSSVR models (test set).
Table 1. Fitting results comparison of CSVR and LSSVR soft sensor models for the training set.
Soft Sensor Model | RMSE | RE (%) | HR (%)
CSVR [46] | 0.116 | 21.49 | 73.33
LSSVR [24,47] | 0.122 | 23.04 | 66.25
Table 2. Brief description of four different prediction models.
Prediction Model | Local and Unfixed | Outlier Detection | Main Pros | Main Cons
JCSVR | Yes | Yes | More suitable for noisy and unevenly distributed data | Model selection is relatively difficult
JLSSVR [25] | Yes | No | Suitable for online modeling of nonlinear processes | Not robust to outliers
CSVR [46] | No | Yes | Suitable for constructing a global model with noisy data | Prediction accuracy in some local regions may be insufficient
LSSVR [24,47] | No | No | A simple nonlinear modeling method | Not robust to outliers and relatively inaccurate in some local regions
Table 3. Comparison results of online silicon content prediction for the test set using four different models.
Prediction Model | RMSE | RE (%) | HR (%)
JCSVR | 0.079 | 17.70 | 80.5
JLSSVR [25] | 0.105 | 22.51 | 64.5
CSVR [46] | 0.123 | 28.16 | 61.5
LSSVR [24,47] | 0.141 | 31.29 | 52.5
