Article

Improved Bootstrap Method Based on RBF Neural Network for Reliability Assessment

1 Department of Basic Courses, Naval University of Engineering, Wuhan 430033, China
2 College of Naval Architecture and Ocean Engineering, Naval University of Engineering, Wuhan 430033, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(7), 2901; https://doi.org/10.3390/app14072901
Submission received: 26 February 2024 / Revised: 19 March 2024 / Accepted: 28 March 2024 / Published: 29 March 2024
(This article belongs to the Special Issue Advances and Challenges in Reliability and Maintenance Engineering)

Abstract:
The investigation of the reliability of long-life equipment is typically hindered by the lack of experimental data, which makes accurate assessments challenging. To address this problem, an improved bootstrap method based on the RBF (radial basis function) neural network is proposed. This method utilizes the exponential function to modify the conventional empirical distribution function and fit the right-tailed data. In addition, it employs the RBF neural network to capture the distribution characteristics of the original samples and constructs a neighborhood function to generate the network input set. The expanded sample is used to estimate the scale and shape parameters of the Weibull distribution and obtain a point estimate of the MTBF (mean time between failures). The bias correction method is then used to obtain the interval estimate of the MTBF. Subsequently, a simulation experiment is conducted based on the failure data of a CNC (computer numerical control) machine tool to verify the effectiveness of this method. The results show that the accuracy of the MTBF point and interval estimates obtained using the proposed method is superior to that of the original and conventional bootstrap methods, which is of major significance for engineering applications.

1. Introduction

Reliability refers to the ability of a product to fulfill a particular function within a predetermined period and under specified conditions. It is an important index that is used to determine the performance of a product. With the advancements in manufacturing technology, the reliability of computer numerical control (CNC) machine tools and other equipment is continuously improving, and failure data are scarce. Therefore, conventional approaches based on large sample data have limited applicability in current reliability research. The accurate estimation of the reliability index of a product using limited sample data remains a key challenge in reliability research.
Currently, two main methods exist to address the problem of an insufficient sample size in reliability assessment. The first approach uses information fusion methods to fuse multiple sources of a priori information to increase the information available for assessment, thereby achieving a higher parameter accuracy. The Bayes method is an example and has been widely used in recent years for parameter estimation with the Weibull distribution [1,2,3]. Although it yields accurate estimation results using fewer samples, it requires a greater amount of a priori information and is affected by subjective factors. The second approach expands small data samples into large data samples. The bootstrap method is a typical example of this approach, and it is favored by scholars due to its simplicity and convenience. However, it relies entirely on the original samples, and, if these samples are not sufficiently informative, the estimation results will have a large error. Therefore, several researchers have studied and improved the bootstrap method, which was originally proposed by Efron [4]. This body of work can be categorized into advancements in methodological accuracy, applications to limited datasets, and refinements for enhanced sample representation.
Firstly, concerning methodological accuracy, Picheny et al. [5] leveraged the bootstrap method for reliability estimation and analyzed the relationship between estimation accuracy and confidence level. They established that the bootstrap method attains a higher accuracy at the 95% confidence level. Building on the bootstrap’s foundational use, Amalnerkar et al. [6] integrated the bootstrap information criterion with bootstrap resampling to estimate reliability from limited subsample data, demonstrating the method’s effectiveness even with small subsamples.
Secondly, addressing the challenges of using the bootstrap method with limited data, Zhang et al. [7] proposed an improved bootstrap approach that ensures that expanded samples remain within the mean error of the original samples, thereby maintaining result reliability without altering the probability distribution. Similarly, Sun et al. [8] developed an enhanced Bayes bootstrap method that applies an interpolation method to construct a neighborhood function, facilitating the expansion of the original sample size.
Lastly, on the front of sample representation refinement, Zhao et al. [9] tackled the issue of large deviations between the empirical distribution function of original samples and the actual distribution. They employed a B-spline function to derive an empirical distribution more suited for sampling, which proved to meet the accuracy requirements of engineering applications. Additionally, Tang et al. [10] introduced a bootstrap data expansion technique using the radial basis neural network for assessing small-sample reliability data, validating that the sample distribution characteristics closely mirror the actual distribution.
The Weibull distribution has been demonstrated to effectively model lifetime distributions in several practical engineering problems based on failure data for mechanical components, electronic components, and biological tissues. It can also describe different types of failure rate distributions, ranging from exponential to Rayleigh distributions. Due to the well-characterized nature of the Weibull distribution, it is widely used and among the most successful life models [11]. There is an extensive literature on the applications and analytical methods of Weibull models, such as the recent studies by Thanh Thach et al. [12], Piña Monarrez et al. [13], and Almarashi et al. [14].
This paper proposes an improved bootstrap method using the radial basis function (RBF) neural network, building on the research by Tang et al. [10]. Specifically, the exponential function is used to modify the empirical distribution function, and a neighborhood function is introduced to widen the value range of the expanded samples. Moreover, the confidence interval of the parameters of the Weibull life distribution is estimated using the bias correction method. Finally, the proposed method is validated using failure data obtained from CNC machine tools.

2. Weibull Distribution

The Weibull distribution is extensively used in the field of reliability engineering. It applies to several types of atypical electronic products [15] and can adequately describe the different cases of bathtub curves. Moreover, it can simplify the calculation steps using a transformed functional form.
Several studies on the Weibull distribution have shown that, if a localized failure results in the malfunctioning of the entire system, the life of such a system generally obeys the Weibull distribution.
The probability density function (PDF) of the two-parameter Weibull distribution is the following:
f(t) = \frac{m}{\eta}\left(\frac{t}{\eta}\right)^{m-1}\exp\left[-\left(\frac{t}{\eta}\right)^{m}\right],  (1)
where m is the shape parameter, η is the scale parameter, and t is the time.
The cumulative distribution function is as follows:
F(t) = 1 - \exp\left[-\left(\frac{t}{\eta}\right)^{m}\right].  (2)
The reliability function is the following:
R(t) = 1 - F(t) = \exp\left[-\left(\frac{t}{\eta}\right)^{m}\right].  (3)
The failure rate function is as follows:
\lambda(t) = \frac{f(t)}{R(t)} = \frac{m}{\eta}\left(\frac{t}{\eta}\right)^{m-1}.  (4)
The shape parameter m has a strong influence on the Weibull distribution.
Specifically, when m < 1, the density function f(t) and the failure rate function λ(t) are both decreasing functions, indicating early failures.
When m = 1, the Weibull distribution reduces to the exponential distribution.
Finally, when m > 1, the density function curve has a single peak, and, when m ≈ 3, this peak is approximately symmetrical, resembling a normal distribution. The failure rate λ(t) is an increasing function, which suggests wear-out failure of the product. The density function curves for different shape parameters m (with η fixed) are shown in Figure 1.
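As a quick numerical illustration of Equations (1)–(4), the following Python sketch evaluates the density, reliability, and failure-rate functions for shape parameters below, equal to, and above one. It is purely illustrative: the NumPy dependency, the time grid, and the parameter values are assumptions introduced here, not part of the original study.

```python
import numpy as np

def weibull_pdf(t, m, eta):
    """Two-parameter Weibull density, Equation (1)."""
    return (m / eta) * (t / eta) ** (m - 1) * np.exp(-(t / eta) ** m)

def weibull_reliability(t, m, eta):
    """Reliability function R(t), Equation (3)."""
    return np.exp(-(t / eta) ** m)

def weibull_failure_rate(t, m, eta):
    """Failure-rate function lambda(t), Equation (4)."""
    return (m / eta) * (t / eta) ** (m - 1)

t = np.linspace(0.1, 3.0, 4)              # illustrative time grid with eta = 1
for m in (0.5, 1.0, 3.0):                 # decreasing, constant, and increasing failure rate
    print(m, weibull_failure_rate(t, m, 1.0), weibull_reliability(t, m, 1.0))
```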

3. Bootstrap Methodology and Its Improvement

3.1. Bootstrap Approach

Let X = {x_1, x_2, …, x_n} be a set of random variables with a joint distribution F_n. To estimate the overall parameter θ, it is generally possible to obtain a sample-based estimate θ_n. The basic concept underlying the bootstrap method is that, given X, one can construct an estimate F̂_n of F_n and then regenerate a set of random variables X* = {x*_1, x*_2, …, x*_n} from the distribution F̂_n. If F̂_n is the best estimate of F_n, then the relationship between X and F_n is adequately represented by the relationship between X* and F̂_n, where F̂_n is called the empirical distribution function of the bootstrap method. This step can be repeated several times to obtain multiple estimates from the reconstructed data according to an estimation equation, such as that for θ_n. The metrics for measuring the accuracy of the estimator can then be obtained (e.g., using the Bayes method). The principle of the bootstrap method is illustrated in Figure 2, and the procedure is as follows.
The order statistics of the sample can be obtained by arranging the original samples X = {x_1, x_2, …, x_n} in ascending order, i.e., x_(1) ≤ x_(2) ≤ … ≤ x_(n), where x_(1) = min_{1≤i≤n} x_i and x_(n) = max_{1≤i≤n} x_i.
When the parameters of the estimated distribution F̂_n are unknown and the value of the cumulative distribution function at x_(i) is taken as F_i = i/n, the empirical distribution function of the original sample, assuming equal-probability sampling, is
F_n(x) = \begin{cases} 0, & x < x_{(1)} \\ k/n, & x_{(k)} \le x < x_{(k+1)}, \quad k = 1, 2, \ldots, n-1 \\ 1, & x \ge x_{(n)}. \end{cases}  (5)
The simulation-based method for generating random samples that obey the empirical distribution function F_n(x) is as follows (a code sketch follows these steps):
  • Uniformly distributed pseudo-random numbers η in the interval [0, 1] are generated;
  • Let β = (n − 1)η and i = ⌊β⌋ + 1, where ⌊·⌋ denotes rounding down;
  • Let x_F = x_(i) + (β − i + 1)(x_(i+1) − x_(i)), where x_F is the desired random sample.
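The three steps above can be sketched in a few lines of Python. This is a minimal illustration rather than the authors' code: the function name sample_empirical, the NumPy dependency, and the illustrative exponential sample are assumptions introduced here.

```python
import numpy as np

def sample_empirical(x_sorted, n_samples, rng):
    """Inverse-transform sampling from the piecewise-linear empirical
    distribution F_n(x) of Equation (5), following the three steps above."""
    x = np.asarray(x_sorted, dtype=float)
    n = len(x)
    eta = rng.uniform(0.0, 1.0, size=n_samples)   # step 1: eta ~ U[0, 1]
    beta = (n - 1) * eta                          # step 2: beta = (n - 1) * eta
    i = np.floor(beta).astype(int)                # 0-based index of x_(i)
    frac = beta - i                               # equals beta - i + 1 in 1-based notation
    top = i >= n - 1                              # fold the eta = 1 endpoint into the last interval
    i[top], frac[top] = n - 2, 1.0
    return x[i] + frac * (x[i + 1] - x[i])        # step 3: linear interpolation

rng = np.random.default_rng(0)
original = np.sort(rng.exponential(100.0, size=30))        # illustrative original sample
expanded = sample_empirical(original, 1000, rng)
print(expanded.min() >= original.min(), expanded.max() <= original.max())
```

The final print statement highlights the limitation discussed next: the generated values never leave the range of the original sample.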
A review of existing studies and experimental simulations revealed that the resampling of the data in the bootstrap method relies on the original samples. Therefore, the random samples generated are typically not representative of the whole population, and the estimates obtained may be biased. Moreover, the bootstrap method may not be robust enough in terms of the margins (i.e., extremes) of the data distribution, because the extremes may be over-represented or under-represented in the generated random samples. The bootstrap method is, therefore, not reasonable when processing small subsamples of data. There are two main reasons for this. First, when F_n(x) is used to generate random samples, the sample values are extracted from the original sample with equal probability to form an expanded sample, and the resulting empirical function F_n(x) fits the head and tail samples inadequately. Consequently, the samples generated according to F_n(x) are not satisfactorily random. Second, because the generated random samples are limited by the minimum and maximum values x_(1) and x_(n) [16], and the values of the random samples can only be extracted from within the head and tail samples when using limited subsample data, the bootstrap method is ineffective.
Moreover, as the values of the random samples can only be obtained from the range of the original samples, the samples are not adequately random [17,18]. Therefore, this paper aims to address these two problems. For the first problem, the exponential distribution function is used to perform the correction of the sample’s empirical function, as the life of electronic products essentially obeys an exponential distribution [19]. For the second problem, based on the corrected empirical distribution function, the radial basis function (RBF) neural network is used to fit the original empirical distribution and obtain the continuous distribution characteristics of the original sample. The input set of the RBF neural network is then obtained using the neighborhood sampling method to ensure that the expanded sample is not limited by the original data. Thus, the expanded sample resembles the actual distribution of the original sample.

3.2. Modified Exponential Sample Empirical Function

For long-life devices such as electronic products, the failure rate rarely increases due to fatigue or wear and tear. Therefore, the tail of the cumulative distribution function of failure can be approximated by the exponential function, with a mean equal to the sample mean [19,20,21]. In this study, the exponential distribution function is utilized to fit the samples and correct the empirical distribution function F n x . The exponential distribution function generally has a good fitting property, which can better estimate the unobserved data points and reduce the influence of random errors on the results. The steps involved are as follows.
  • A piecewise-linear empirical distribution function is used for the first n − u samples, where n is the total number of samples and u is the number of tail samples.
  • The samples beyond the (n − u)-th order statistic are fitted using an exponential distribution with the same mean as the original sample. Taking u as an integer no larger than five results in a smaller variance in the right-tail fit [22]. The modified empirical distribution function of the sample is
    F_n(x) = \begin{cases} 0, & x < x_{(1)} \\ \dfrac{i}{n} + \dfrac{x - x_{(i)}}{n\,(x_{(i+1)} - x_{(i)})}, & x_{(i)} \le x < x_{(i+1)}, \quad i = 1, \ldots, n-u-1 \\ 1 - \dfrac{u}{n}\exp\!\left(-\dfrac{x - x_{(n-u)}}{\hat{v}}\right), & x \ge x_{(n-u)}, \end{cases}  (6)
    where \hat{v} = \dfrac{1}{u}\left[\dfrac{x_{(n-u)}}{2} + \sum_{i=n-u+1}^{n}\left(x_{(i)} - x_{(n-u)}\right)\right]. The simulation-based method for generating random samples that obey the modified empirical distribution function F_n(x) involves the following steps (a code sketch follows the list):
    • Uniformly distributed pseudo-random numbers γ in the interval [0, 1] are generated;
    • If γ > 1 − u/n, then x_F = x_(n−u) − v̂ ln[(1 − γ)n/u] is the desired random number; otherwise, go to step (3);
    • Let β = (n − 1)γ and i = ⌊β⌋ + 1; then,
      x_F = x_{(i)} + (\beta - i + 1)(x_{(i+1)} - x_{(i)})  (7)
      is the desired random number.
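A minimal Python sketch of this modified sampling procedure is given below. It follows the form of Equation (6) and the tail-mean expression v̂ stated above, whose exact form should be treated as an assumption of this sketch; the helper names tail_mean and sample_modified_empirical are likewise introduced here for illustration only.

```python
import numpy as np

def tail_mean(x_sorted, u):
    """Tail-mean estimate v-hat used in the exponential branch of Equation (6)."""
    x = np.asarray(x_sorted, dtype=float)
    n = len(x)
    x_nu = x[n - u - 1]                      # x_(n-u) in 1-based notation
    return (x_nu / 2.0 + np.sum(x[n - u:] - x_nu)) / u

def sample_modified_empirical(x_sorted, u, n_samples, rng):
    """Sampling from the exponentially corrected empirical distribution."""
    x = np.asarray(x_sorted, dtype=float)
    n = len(x)
    v = tail_mean(x, u)
    gamma = rng.uniform(0.0, 1.0, size=n_samples)    # step 1: gamma ~ U[0, 1]
    out = np.empty(n_samples)
    tail = gamma > 1.0 - u / n                       # step 2: exponential right tail
    out[tail] = x[n - u - 1] - v * np.log((1.0 - gamma[tail]) * n / u)
    beta = (n - 1) * gamma[~tail]                    # step 3: piecewise-linear body
    i = np.floor(beta).astype(int)
    out[~tail] = x[i] + (beta - i) * (x[i + 1] - x[i])
    return out

rng = np.random.default_rng(1)
original = np.sort(rng.exponential(100.0, size=30))
expanded = sample_modified_empirical(original, u=5, n_samples=1000, rng=rng)
print(expanded.max() > original.max())   # tail draws can now exceed the original maximum
```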

3.3. Simulation Verification

Let the original sample dataset X = {x_1, x_2, …, x_30} be generated using the exponential distribution with a mean of 100 and a sample size of n = 30. The sample dataset is summarized in Table 1, and its distribution is shown in Figure 3. Sampling is performed N = 1000 times, and X is expanded into X*, with a sample capacity of 1000 × 30. The conventional bootstrap method is used to obtain the expanded sample X*_1, and the improved bootstrap method is used to obtain the expanded sample X*_2. Their distributions are shown in Figure 4. The distribution characteristics of X* are analyzed, and the results are as follows.
The original samples obey the exponential distribution, and the expanded samples X* are obtained using the bootstrap method with the corrected empirical distribution function. Evidently, compared with the conventional bootstrap method, correcting the tail with the exponential function yields an overall distribution that is more in line with the characteristics of the original distribution. The range of the expanded samples generated using the modified bootstrap method increases, which improves the randomness of the expanded samples (Figure 4).
The distribution of the expanded samples in Figure 4, the parameter distribution of the expanded samples in Figure 5, and the estimation of the parameter λ in Table 2 show that the improved bootstrap method slightly overestimates the sample parameter λ, with an accuracy that differs insignificantly from that of the conventional method. However, the confidence interval generated by the improved method is markedly narrower than that produced by the conventional method, demonstrating a clear superiority in interval estimation. Therefore, this study uses the improved bootstrap method in conjunction with the RBF neural network for sample expansion.

4. Improved Bootstrap Data Expansion Methodology Based on RBF Neural Network and Reliability Assessment

4.1. RBF Neural Network

The RBF neural network is a three-layer feed-forward neural network in which the links from the input layer to the hidden layer are typically fixed and not trained. However, the links from the hidden layer to the output layer are trained. This is a simpler training process than that of standard neural network models [23].
The structure of the RBF neural network is shown in Figure 6.
In the RBF network structure, the input vector of the network is X = [x_1, x_2, …, x_n]^T, the radial basis vector of the RBF network is H = [h_1, h_2, …, h_j, …, h_m]^T, and the basis width vector of the hidden nodes of the network is B = [b_1, b_2, …, b_j, …, b_m]^T. Then, the Gaussian basis function h_j is
h_j = \exp\left(-\frac{\lVert X - C_j \rVert^{2}}{2 b_j^{2}}\right), \quad j = 1, 2, \ldots, m,  (8)
where C j is the center vector of the j th hidden node of the network and is determined using the k-means training algorithm [24], and b j is the base width parameter of node j .
The output of the RBF neural network y is
y = \omega_1 h_1 + \omega_2 h_2 + \cdots + \omega_m h_m,  (9)
where W = [ω_1, ω_2, …, ω_j, …, ω_m]^T is the weight vector of the network and is determined via least-squares learning.
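To make the structure concrete, the sketch below implements a minimal Gaussian RBF network in Python with NumPy. It is a simplified stand-in rather than the configuration used in the paper: centers are placed at quantiles of the training inputs instead of being trained with k-means [24], a single base width is shared by all hidden nodes, and the class name GaussianRBF is introduced here for illustration.

```python
import numpy as np

class GaussianRBF:
    """Minimal RBF network: Gaussian hidden units as in Equation (8) and
    least-squares output weights as in Equation (9)."""

    def __init__(self, n_hidden=10, width=0.1):
        self.n_hidden = n_hidden
        self.width = width                       # base width b_j, shared by all nodes here

    def _hidden(self, x):
        d = np.asarray(x, float)[:, None] - self.centers[None, :]   # input-center distances
        return np.exp(-d ** 2 / (2.0 * self.width ** 2))

    def fit(self, x, y):
        # simple stand-in for the k-means center selection used in the paper
        self.centers = np.quantile(np.asarray(x, float), np.linspace(0.0, 1.0, self.n_hidden))
        H = self._hidden(x)
        self.weights, *_ = np.linalg.lstsq(H, np.asarray(y, float), rcond=None)
        return self

    def predict(self, x):
        return self._hidden(x) @ self.weights

# illustrative use: learn the inverse empirical CDF mapping F_n(x_(i)) -> x_(i)
x = np.sort(np.random.default_rng(2).exponential(100.0, size=30))
F = np.arange(1, 31) / 30.0
net = GaussianRBF(n_hidden=10, width=0.1).fit(F, x)
print(net.predict(np.array([0.25, 0.50, 0.75])))
```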

4.2. Improved Bootstrap Data Expansion Method Based on RBF Neural Network

The methodology of the bootstrap data expansion method based on the RBF neural network is depicted in Figure 7.
First, the original sample dataset X = {x_1, x_2, …, x_n} is substituted into the empirical distribution function in Equation (5) to obtain the set of empirical distribution values F, and the RBF neural network is trained on X and F; the effectiveness of this step has been demonstrated in [10]. Setting u = 5, the modified empirical distribution function in Equation (6) is then used to generate the set of corrected empirical distribution values F = {F_n(x_(1)), F_n(x_(2)), …, F_n(x_(n))} from the original sample dataset. As the RBF neural network produces more reliable outputs for inputs that are close to the training samples [25], a neighborhood function R_i is introduced based on this set. The expanded sample dataset X* is then obtained by feeding the input set S, drawn according to R_i, into the network. The specific implementation steps are as follows (a code sketch follows the list).
  • The original samples X = {x_1, x_2, …, x_n} are sorted in ascending order to obtain the order statistics of the sample, x_(1) ≤ x_(2) ≤ … ≤ x_(n), where x_(1) = min_{1≤i≤n} x_i and x_(n) = max_{1≤i≤n} x_i. Substituting these into Equation (5) yields the set of empirical distribution values of X, F = {F_n(x_(1)), F_n(x_(2)), …, F_n(x_(n))}.
  • RBF neural network training: The RBF neural network is trained by considering F_n(x_(i)) (i = 1, 2, …, n) as the input and x_(i) as the network output o_j. The Gaussian radial basis function of Equation (8) is used in the network.
  • The neighborhood function R_i of the corrected empirical distribution values is introduced, and the input set S of the RBF neural network is obtained. Substituting X into Equation (6) yields the set of corrected empirical distribution values F = {F_n(x_(1)), F_n(x_(2)), …, F_n(x_(n))}. Let f_i = F_n(x_(i)); the neighborhood function R_i is then
    \begin{cases} R_1 = \left[ f_1 - \dfrac{f_2 - f_1}{r},\; f_1 + \dfrac{f_2 - f_1}{r} \right] \\ R_i = \left[ f_i - \dfrac{f_i - f_{i-1}}{r},\; f_i + \dfrac{f_{i+1} - f_i}{r} \right], \quad i = 2, \ldots, n-1 \\ R_n = \left[ f_n - \dfrac{f_n - f_{n-1}}{r},\; f_n + \dfrac{f_n - f_{n-1}}{r} \right], \end{cases}  (10)
    where r is the neighborhood parameter (r ≥ 2). The input set S = {s_1, s_2, …, s_n} of the RBF neural network is generated sequentially from the uniform distribution over each neighborhood, s_j ~ U(R_j).
  • The input set S is fed into the RBF neural network to obtain the expanded sample X * . The elements of S are input into the RBF neural network sequentially. When the input is s j , the output is
    o_j = w_1 h_1(s_j) + w_2 h_2(s_j) + \cdots + w_m h_m(s_j).  (11)
    The set X * consisting of the RBF neural network outputs o j is the augmented sample of X .
  • Steps (3) and (4) are repeated N times to obtain the expanded samples X*_k = {x*_k1, x*_k2, …, x*_kn}, k = 1, 2, …, N, of X.
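The neighborhood construction of Equation (10) and steps (3) and (4) above can be sketched as follows. This is an illustrative outline rather than the authors' implementation: the helper names neighborhood_bounds and expand_once are introduced here, and a simple interpolator stands in for the trained RBF network and the corrected empirical distribution values so that the snippet runs on its own.

```python
import numpy as np

def neighborhood_bounds(f, r=2.0):
    """Neighborhood intervals R_i of Equation (10) for f_1 <= ... <= f_n."""
    f = np.asarray(f, dtype=float)
    n = len(f)
    lo, hi = np.empty(n), np.empty(n)
    lo[0], hi[0] = f[0] - (f[1] - f[0]) / r, f[0] + (f[1] - f[0]) / r
    lo[1:-1] = f[1:-1] - (f[1:-1] - f[:-2]) / r
    hi[1:-1] = f[1:-1] + (f[2:] - f[1:-1]) / r
    lo[-1], hi[-1] = f[-1] - (f[-1] - f[-2]) / r, f[-1] + (f[-1] - f[-2]) / r
    return lo, hi

def expand_once(predict, f_corrected, rng, r=2.0):
    """One replication: draw s_j ~ U(R_j) and map it through the trained network."""
    lo, hi = neighborhood_bounds(f_corrected, r)
    s = rng.uniform(lo, hi)          # input set S = {s_1, ..., s_n}, step (3)
    return predict(s)                # expanded sample X*_k, step (4)

rng = np.random.default_rng(3)
x = np.sort(rng.exponential(100.0, size=30))
f = np.arange(1, 31) / 30.0                      # stand-in for the corrected F values
predict = lambda s: np.interp(s, f, x)           # stand-in for the trained RBF output
expanded = np.array([expand_once(predict, f, rng) for _ in range(1000)])
print(expanded.shape)                            # (1000, 30), i.e., N expanded samples of size n
```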

4.3. Assessment of Reliability Indicators

After the expanded samples X*_k are obtained, the two parameters of the Weibull distribution are estimated via maximum likelihood. The likelihood function L_k for the shape parameter m and the scale parameter η is constructed according to the PDF in Equation (1), as follows:
L_k(\hat{\eta}_k^{*}, \hat{m}_k^{*}) = \prod_{i=1}^{n} f(x_{ki}^{*}; \hat{\eta}_k^{*}, \hat{m}_k^{*}),  (12)
where m̂*_k and η̂*_k are the shape and scale parameter estimates for the k-th expanded sample, respectively.
Applying the logarithmic function to Equation (12) yields the log-likelihood function.
l(\hat{\eta}_k^{*}, \hat{m}_k^{*}) = \ln L_k(\hat{\eta}_k^{*}, \hat{m}_k^{*}) = \sum_{i=1}^{n} \ln f(x_{ki}^{*}; \hat{\eta}_k^{*}, \hat{m}_k^{*}) = \sum_{i=1}^{n} \left[ \ln \hat{m}_k^{*} - \ln \hat{\eta}_k^{*} + (\hat{m}_k^{*} - 1)\ln x_{ki}^{*} - (\hat{m}_k^{*} - 1)\ln \hat{\eta}_k^{*} - \left(\frac{x_{ki}^{*}}{\hat{\eta}_k^{*}}\right)^{\hat{m}_k^{*}} \right]  (13)
Setting the partial derivatives of Equation (13) with respect to m̂*_k and η̂*_k equal to zero results in the following system of equations.
\begin{cases} \dfrac{\partial l}{\partial \hat{\eta}_k^{*}} = -\dfrac{n \hat{m}_k^{*}}{\hat{\eta}_k^{*}} + \hat{m}_k^{*} \displaystyle\sum_{i=1}^{n} \dfrac{(x_{ki}^{*})^{\hat{m}_k^{*}}}{(\hat{\eta}_k^{*})^{\hat{m}_k^{*}+1}} = 0 \\ \dfrac{\partial l}{\partial \hat{m}_k^{*}} = \dfrac{n}{\hat{m}_k^{*}} + \displaystyle\sum_{i=1}^{n} \ln x_{ki}^{*} - n \ln \hat{\eta}_k^{*} - \displaystyle\sum_{i=1}^{n} \left(\dfrac{x_{ki}^{*}}{\hat{\eta}_k^{*}}\right)^{\hat{m}_k^{*}} \ln\dfrac{x_{ki}^{*}}{\hat{\eta}_k^{*}} = 0 \end{cases}  (14)
Solving this system of equations yields m ^ k * and η ^ k * . Then, the mean time between failures (MTBF) is
MTBF_k = \int_{0}^{\infty} t \, f(t; \hat{\eta}_k^{*}, \hat{m}_k^{*}) \, dt.  (15)
Substituting Equation (1) into Equation (15) yields
MTBF_k = \int_{0}^{\infty} t \, \frac{\hat{m}_k^{*}}{\hat{\eta}_k^{*}} \left(\frac{t}{\hat{\eta}_k^{*}}\right)^{\hat{m}_k^{*}-1} e^{-(t/\hat{\eta}_k^{*})^{\hat{m}_k^{*}}} \, dt.  (16)
Using variable substitution and the properties of the Gamma function, this integral can be simplified. The Gamma function is defined as follows:
\Gamma(n) = \int_{0}^{\infty} x^{n-1} e^{-x} \, dx.  (17)
Ultimately, the MTBF is
MTBF_k = \hat{\eta}_k^{*} \, \Gamma\left(1 + \frac{1}{\hat{m}_k^{*}}\right).  (18)
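As an illustration of the route from Equation (14) to Equation (18), the following sketch fits the two-parameter Weibull distribution by maximum likelihood with SciPy (weibull_min.fit with the location fixed at zero) and then computes the MTBF through the Gamma function. The synthetic sample is an assumption used only so that the snippet runs; it is not the data of Table 3.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.special import gamma

def weibull_mle_mtbf(sample):
    """Maximum-likelihood estimates of the shape m and scale eta (location fixed
    at zero), followed by the MTBF of Equation (18)."""
    m_hat, _, eta_hat = weibull_min.fit(sample, floc=0)
    return m_hat, eta_hat, eta_hat * gamma(1.0 + 1.0 / m_hat)

# illustrative use on a synthetic set of 61 failure times
rng = np.random.default_rng(4)
sample = weibull_min.rvs(1.3, scale=1200.0, size=61, random_state=rng)
m_hat, eta_hat, mtbf = weibull_mle_mtbf(sample)
print(round(m_hat, 3), round(eta_hat, 1), round(mtbf, 1))
```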
To ensure the accuracy of interval estimation, the method of bias correction is employed to estimate the confidence intervals. The center point of the confidence interval is modified by calculating the deviation between the original and expanded samples.
The normal quantile corresponding to the position of the original sample in the cumulative distribution function of the expanded sample distribution, z 0 , is calculated as
z_0 = \Phi^{-1}\left(\frac{1}{N}\sum_{k=1}^{N} I\left(\hat{\theta}_k^{*} < \hat{\theta}\right)\right),  (19)
where Φ^{-1} denotes the inverse of the cumulative distribution function of the standard normal distribution,
\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-t^{2}/2} \, dt;  (20)
I(·) denotes the indicator function; N denotes the number of expanded samples; θ̂ = (η̂, m̂, MTBF) denotes the parameter estimates of the original sample; and θ̂*_k = (η̂*_k, m̂*_k, MTBF_k) denotes the parameter estimates of the k-th expanded sample.
The parameter distribution of the expanded samples may be not only biased but also asymmetric, meaning that the confidence interval may need to be asymmetric as well. The acceleration value a is used to modify the shape of the confidence interval and ensure that it adequately covers the true parameter values. In this study, the jackknife resampling method [26] is used to estimate the value of a, as follows:
a = \frac{\sum_{i=1}^{n}\left(\hat{\theta}_i - \bar{\theta}\right)^{3}}{6\left[\sum_{i=1}^{n}\left(\hat{\theta}_i - \bar{\theta}\right)^{2}\right]^{3/2}},  (21)
where n is the sample size of the original sample, θ ^ i is the parameter estimate of the jackknife sample after excluding the i th observation, and θ ¯ is the average of the parameter estimates of all the jackknife samples, i.e.,
\bar{\theta} = \frac{1}{n}\sum_{i=1}^{n}\hat{\theta}_i.  (22)
Using the bias correction z_0 and the acceleration value a, the corrected lower and upper quantiles of the confidence interval, α_1 and α_2, are calculated as
\begin{cases} \alpha_1 = \Phi\left(z_0 + \dfrac{z_0 + \Phi^{-1}(\alpha/2)}{1 - a\left(z_0 + \Phi^{-1}(\alpha/2)\right)}\right) \\ \alpha_2 = \Phi\left(z_0 + \dfrac{z_0 + \Phi^{-1}(1-\alpha/2)}{1 - a\left(z_0 + \Phi^{-1}(1-\alpha/2)\right)}\right), \end{cases}  (23)
where α is the level of significance and is assumed to be 0.05.
Then, the confidence interval is
CI = \left[\operatorname{percentile}\left(\hat{\theta}^{*}, 100\,\alpha_1\right),\ \operatorname{percentile}\left(\hat{\theta}^{*}, 100\,\alpha_2\right)\right].  (24)
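The bias-corrected and accelerated interval of Equations (19)–(24) can be sketched as follows. The function name bca_interval and the synthetic bootstrap and jackknife estimates are assumptions introduced here for illustration; in the paper, the inputs would be the MTBF (or m, η) estimates of the N expanded samples and of the n leave-one-out jackknife samples.

```python
import numpy as np
from scipy.stats import norm

def bca_interval(theta_hat, theta_boot, theta_jack, alpha=0.05):
    """Bias-corrected and accelerated interval: z_0 from Equation (19),
    a from Equation (21), corrected quantiles from Equation (23)."""
    theta_boot = np.asarray(theta_boot, float)
    theta_jack = np.asarray(theta_jack, float)
    z0 = norm.ppf(np.mean(theta_boot < theta_hat))        # Equation (19)
    d = theta_jack - theta_jack.mean()
    a = np.sum(d ** 3) / (6.0 * np.sum(d ** 2) ** 1.5)    # Equation (21)

    def corrected(q):
        z = z0 + norm.ppf(q)
        return norm.cdf(z0 + z / (1.0 - a * z))           # Equation (23)

    a1, a2 = corrected(alpha / 2.0), corrected(1.0 - alpha / 2.0)
    return (np.percentile(theta_boot, 100.0 * a1),        # Equation (24)
            np.percentile(theta_boot, 100.0 * a2))

# illustrative use with synthetic bootstrap (N = 1000) and jackknife (n = 61) estimates
rng = np.random.default_rng(5)
boot = rng.normal(1100.0, 10.0, size=1000)
jack = rng.normal(1100.0, 1.0, size=61)
print(bca_interval(1100.0, boot, jack))
```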

5. Example Analysis

The maintenance records of seven CNC machines of the same model (designated K1, K2, …, K7), operating under similar conditions in a single factory, were collected over three years, yielding 61 failure records (Table 3).
Point and interval estimation of the shape and scale parameters are performed using the maximum likelihood estimation method, bootstrap method, RBF + bootstrap method, and modified RBF + bootstrap method.
  • Using maximum likelihood estimation to estimate the parameters of the Weibull distribution for the original data yields m̂ = 1.2694 and η̂ = 1204.7, and the reliability function is as follows:
    R(t) = \exp\left[-\left(\frac{t}{1204.7}\right)^{1.2694}\right].  (25)
    Then, the MTBF is
    MTBF = \int_{0}^{\infty} R(t)\,dt = \int_{0}^{\infty} \exp\left[-\left(\frac{t}{1204.7}\right)^{1.2694}\right] dt = 1118.20\ \text{h}.  (26)
  • The conventional bootstrap method is used to expand the original data, and sampling is performed 1000 times, resulting in the expanded samples X*_k = {x*_k1, x*_k2, …, x*_k61}, k = 1, 2, …, 1000. The overall distribution of X* is shown in Figure 8a.
Solving Equation (14) yields m̂*_k and η̂*_k, k = 1, 2, …, 1000, and the average estimates are the following:
\bar{m}^{*} = \frac{1}{1000}\sum_{k=1}^{1000}\hat{m}_k^{*} = 1.3199, \qquad \bar{\eta}^{*} = \frac{1}{1000}\sum_{k=1}^{1000}\hat{\eta}_k^{*} = 1208.4.  (27)
The parameter distribution obtained by solving Equation (18) for M T B F k is shown in Figure 8b.
A mean value of M T B F m e a n = 1114.97   h is obtained, and the 95% confidence interval of M T B F is (1099.51, 1130.47), using the bias correction method.
  • The original data are expanded using both the conventional bootstrap method combined with the RBF neural network and the improved bootstrap method combined with the RBF neural network. The "newrb" function in MATLAB (v2018b, MathWorks, Inc., Natick, MA, USA) is used to construct the RBF neural network, with the network performance goal, spread constant, and maximum number of neurons set as (goal, spread, maxNeuron) = (0, 1, 25). The calculation process is shown in Figure 7, and the results converge to yield the expanded samples of X, X*_k = {x*_k1, x*_k2, …, x*_k61}, k = 1, 2, …, 1000. The average estimates of the Weibull parameters obtained using the conventional bootstrap method + RBF neural network are (m̄*_TBR, η̄*_TBR) = (1.2773, 1206.5), with MTBF_TBR = 1118.32 h; the corresponding 95% confidence interval of MTBF_TBR, obtained using the bias correction method, is (1114.37, 1121.86). The average estimates of the Weibull parameters obtained using the improved bootstrap method + RBF neural network are (m̄*_BR, η̄*_BR) = (1.2742, 1168.2), with MTBF_BR = 1083.41 h; the corresponding 95% confidence interval of MTBF_BR, obtained using the bias correction method, is (1080.13, 1089.15). The overall distribution of X*_TBR is shown in Figure 9a, and the parameter distribution of MTBF_TBR is shown in Figure 9b. The overall distribution of X*_BR is shown in Figure 10a, and the parameter distribution of MTBF_BR is shown in Figure 10b.
The cumulative distribution function (CDF) and probability density function (PDF) are obtained using Equations (1) and (2), as shown in Figure 11.
As illustrated in Figure 11, the probability density function (PDF) of the Weibull life distribution obtained using the RBF + improved bootstrap method exhibits a higher peak value than those obtained using the other methods. This observation can be interpreted as follows:
  • A higher peak value indicates that the life data are more concentrated around a specific time period. This suggests that the majority of components or systems are likely to fail around this point in time, demonstrating a lower variability in life spans. In other words, the lifespans of most components are expected to be relatively similar, leading to reduced uncertainty in life expectancy predictions.
  • Additionally, a higher peak value implies more accurate reliability predictions at this specific time point. Since failure events are more likely to occur near the peak, this facilitates more precise planning for maintenance, replacement cycles, and inventory management.
A comparison of the MTBF estimates obtained from the aforementioned methods with the manufacturer-rated MTBF = 1000 h and the corresponding errors are presented in Table 4.
As shown in Table 4, the estimated value of the MTBF obtained using the maximum likelihood estimation method is 1118.20 h, compared with the nominal value of 1000 h, resulting in a relative error of 11.82%. The relative error of the RBF + conventional bootstrap method is 11.83%, which is almost equal to that of the maximum likelihood estimation method, indicating that the expanded samples obtained using the RBF + conventional bootstrap method are overfitted and not sufficiently random for the bootstrap method. The analysis results presented in Figure 9a and Figure 10a reveal that correcting the tail of the empirical distribution function using an exponential distribution attenuates the proportion of large values and makes the distribution more dispersed, which is consistent with the actual life distribution of the equipment. In addition, the relative error for MTBF is reduced to 8.34% from 11.50%, indicating that the proposed data expansion method improves the conventional bootstrap method.
In this study, the confidence interval estimates for the different methods are obtained by combining bias correction methods. As shown in Table 4, the bootstrap method combined with the RBF neural network significantly reduces the length of the confidence intervals and improves the accuracy of the estimates. This demonstrates the effectiveness of combining RBF neural networks with the bootstrap method.

6. Conclusions

The estimation of equipment MTBF is crucial for reliability assessment and analysis. However, when the number of samples is limited, relying on traditional parameter estimation methods alone is inadequate. Moreover, conventional parameter estimation methods such as maximum likelihood estimation typically fail to provide confidence intervals for the parameters.
This paper proposes the use of the bootstrap method for data expansion and reliability assessment. An exponential distribution is utilized to fit right-tailed data and modify the empirical distribution function. The simulation results indicate that the range of the expanded samples generated via the modified bootstrap method increases, the randomness of the expanded samples increases, and the accuracy of interval estimation improves. In addition, a novel data expansion method is proposed by combining the modified bootstrap method with the RBF neural network, and the bias correction method is then used to estimate confidence intervals for the expanded data and improve the estimation accuracy. The analysis of the results suggests that correcting the tail data with an exponential function effectively enhances the original failure data by moderately incorporating additional information on the product's reliability characteristics, based on its failure properties, thereby optimizing the raw data. Furthermore, employing the radial basis function (RBF) neural network achieves a better fit of the failure data and thereby improves the accuracy of parameter estimation.
Finally, the proposed method is employed for the reliability assessment of a CNC machine tool. The shape and scale parameters of the corresponding Weibull distribution are estimated to determine the MTBF of the equipment. Simulation experiments show that the proposed method offers a greater improvement in the accuracy of point estimation and interval estimation than the original bootstrap and conventional parameter estimation methods. Therefore, this method has excellent applicability in engineering practice. Despite these contributions, our work is not without limitations. The collection of CNC failure data presents significant challenges, notably due to the scarcity of available data. We compiled data from seven machines operating under ostensibly similar conditions, on the assumption that these conditions were identical; in reality, variances in operating conditions do exist. Addressing how to effectively integrate data across varying conditions represents a key area for our future investigations.

Author Contributions

Conceptualization, H.L. and H.W.; methodology, S.S.; software, H.W.; validation, H.L. and H.W.; formal analysis, H.W.; investigation, S.S.; resources, S.S.; data curation, H.W.; writing—original draft preparation, H.W.; writing—review and editing, H.L. and S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon request from the corresponding author. The data are not publicly available due to privacy.

Acknowledgments

We extend our deepest appreciation to Zhang Zhihua for his invaluable guidance, unwavering support, and mentorship throughout this research.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wilson, A.G.; Fronczyk, K.M. Bayesian reliability: Combining information. Qual. Eng. 2017, 29, 119–129. [Google Scholar] [CrossRef]
  2. Wang, L.; Pan, R.; Wang, X.; Fan, W.; Xuan, J. A Bayesian reliability evaluation method with different types of data from multiple sources. Reliab. Eng. Syst. Saf. 2017, 167, 128–135. [Google Scholar] [CrossRef]
  3. BahooToroody, A.; Abaei, M.M.; Banda, O.V.; Montewka, J.; Kujala, P. On reliability assessment of ship machinery system in different autonomy degree; A Bayesian-based approach. Ocean Eng. 2022, 254, 111252. [Google Scholar] [CrossRef]
  4. Efron, B. Bootstrap methods: Another look at the jackknife. In Breakthroughs in Statistics: Methodology and Distribution; Kotz, S., Johnson, N.L., Eds.; Springer: New York, NY, USA, 1992; pp. 569–593. [Google Scholar]
  5. Picheny, V.; Kim, N.H.; Haftka, R.T. Application of bootstrap method in conservative estimation of reliability with limited samples. Struct. Multidisc. Optim. 2010, 41, 205–217. [Google Scholar] [CrossRef]
  6. Amalnerkar, E.; Lee, T.H.; Lim, W. Reliability analysis using bootstrap information criterion for small sample size response functions. Struct. Multidisc. Optim. 2020, 62, 2901–2913. [Google Scholar] [CrossRef]
  7. Zhang, M.; Liu, X.; Wang, Y.; Wang, X. Parameter distribution characteristics of material fatigue life using improved bootstrap method. Int. J. Damage Mech. 2019, 28, 772–793. [Google Scholar] [CrossRef]
  8. Sun, H.; Hu, W.; Liu, H. Improvement of bayes bootstrap method based on interpolation method. Stat. Decis. 2017, 9, 74–77. [Google Scholar] [CrossRef]
  9. Zhao, Y.; Yang, L. Lifetime evaluation model of small sample based on Bootstrap theory. J. Beijing Univ. Aeronaut. Astronaut. 2022, 48, 106–112. [Google Scholar] [CrossRef]
  10. Tang, S.; Liu, G.; Li, X. A Bootstrap data expansion method based on RBF neural network and its application on IRSS reliability evaluation. China Meas. Test 2022, 48, 22–26. [Google Scholar]
  11. Zhang, C.W. Weibull parameter estimation and reliability analysis with zero-failure data from high-quality products. Reliab. Eng. Syst. Saf. 2021, 207, 107321. [Google Scholar] [CrossRef]
  12. Thanh Thach, T.; Briš, R. An additive Chen-Weibull distribution and its applications in reliability modeling. Qual. Reliab. Eng. Int. 2021, 37, 352–373. [Google Scholar] [CrossRef]
  13. Piña Monarrez, M.R.; Barraza-Contreras, J.M.; Villa-Señor, R.C. Vibration fatigue life reliability cable trough assessment by using Weibull distribution. Appl. Sci. 2023, 13, 4403. [Google Scholar] [CrossRef]
  14. Almarashi, A.M.; Algarni, A.; Nassar, M. On estimation procedures of stress-strength reliability for Weibull distribution with application. PLoS ONE 2020, 15, e0237997. [Google Scholar] [CrossRef]
  15. Poletto, J.P. An alternative to the exponential and Weibull reliability models. IEEE Access 2022, 10, 118759–118778. [Google Scholar] [CrossRef]
  16. Bai, H. A New Resampling Method to Improve Quality Research with Small Samples. Doctoral Dissertation, University of Cincinnati, Cincinnati, OH, USA, 2007. [Google Scholar]
  17. Larsen, J.E.P.; Lund, O.; Nielsen, M. Improved method for predicting linear B-cell epitopes. Immunome Res. 2006, 2, 1–7. [Google Scholar] [CrossRef]
  18. Linton, O.; Song, K.; Whang, Y.-J. An improved bootstrap test of stochastic dominance. J. Econom. 2010, 154, 186–202. [Google Scholar] [CrossRef]
  19. Ali, S.; Ali, S.; Shah, I.; Siddiqui, G.F.; Saba, T.; Rehman, A. Reliability analysis for electronic devices using generalized exponential distribution. IEEE Access 2020, 8, 108629–108644. [Google Scholar] [CrossRef]
  20. Collins, D.H.; Warr, R.L. Failure time distributions for complex equipment. Qual. Reliab. Eng. Int. 2019, 35, 146–154. [Google Scholar] [CrossRef]
  21. Chahkandi, M.; Ganjali, M. On some lifetime distributions with decreasing failure rate. Comput. Stat. Data Anal. 2009, 53, 4433–4440. [Google Scholar] [CrossRef]
  22. Xiao, G.; Li, T. Monte Carlo Method in System Reliability Analysis; Science Press: Beijing, China, 2003. [Google Scholar]
  23. Qian, J.; Chen, L.; Sun, J.-Q. Random vibration analysis of vibro-impact systems: RBF neural network method. Int. J. Non-Linear Mech. 2023, 148, 104261. [Google Scholar] [CrossRef]
  24. Sing, J.; Basu, D.; Nasipuri, M.; Kundu, M. Improved k-means algorithm in the design of RBF neural networks. In Proceedings of the TENCON 2003. Conference on Convergent Technologies for Asia-Pacific Region, Bangalore, India, 15–17 October 2003; pp. 841–845. [Google Scholar]
  25. Nabney, I.T. Efficient training of RBF networks for classification. Int. J. Neural Syst. 2004, 14, 201–208. [Google Scholar] [CrossRef] [PubMed]
  26. Hansen, B.E.; Racine, J.S. Jackknife model averaging. J. Econom. 2012, 167, 38–46. [Google Scholar] [CrossRef]
Figure 1. Weibull distribution probability density function (PDF).
Figure 2. Schematic of the bootstrap method.
Figure 3. Distribution of original sample values.
Figure 4. Distribution of sample values for expansion. (a) Conventional bootstrap; (b) improved bootstrap.
Figure 5. Parameter distribution of expanded samples. (a) Conventional bootstrap; (b) improved bootstrap.
Figure 6. Schematic of the RBF neural network.
Figure 7. Flowchart of the RBF + bootstrap approach.
Figure 8. Results of the conventional bootstrap approach. (a) Overall distribution; (b) parametric distribution.
Figure 9. Results of the RBF + conventional bootstrap method. (a) Overall distribution; (b) parametric distribution.
Figure 10. Results of the RBF + improved bootstrap method. (a) Overall distribution; (b) parametric distribution.
Figure 11. Plots of the cumulative distribution function (CDF) and probability density function (PDF).
Table 1. Generated sample dataset.
Exponential distribution sample dataset (E(x) = 100, n = 30)
51.67   64.43   85.08    102.42   151.11
52.27   68.97   88.70    113.02   152.14
56.96   74.02   88.88    115.54   159.35
57.38   74.20   91.13    120.89   160.67
61.56   76.24   94.91    131.71   164.48
63.95   77.65   100.55   147.34   166.26
Table 2. Parameter (λ) estimation table.
Method                   Expected Value   Point Estimate   Error    Interval Estimate        Interval Length
Conventional bootstrap   100.4493         99.7526          0.6967   [99.1357, 100.6059]      1.4702
Improved bootstrap       100.4493         101.1103         0.6610   [100.8831, 101.3135]     0.4304
Table 3. Equipment failure data.
Number   Time between failures (h)
K1   63.5    215.5    302      639.5     945.5     1264.25   2332.5    2591.5    2894
K2   178     318.08   374.5    645.5     1240.42   1246.58   1337      1419.5    2154
K3   215.3   230.17   837.33   838.67    1017.27   1486      2491.17   2842.33
K4   537.25  862.38   953.67   1027.67   1045.5    1274      1584      2449.25   3062.08
K5   194     271.5    399      913       1040      1873.5    2304.5    3062.5
K6   141.5   239.5    241.83   397.67    454.5     1382.5    2027.5    2312      2591.83
K7   153.5   184      186      409       639       655.5     686       1037      1375
Table 4. Comparison of reliability assessment results based on MTBF obtained using the different methods.
Method                         Point Estimate (h)   Rated Value (h)   Absolute Error (h)   Relative Error (%)   Confidence Interval    Interval Length
Maximum likelihood method      1118.20              1000              118.20               11.82                \                      \
Bootstrap                      1114.97              1000              114.97               11.50                (1099.51, 1130.47)     30.96
RBF + conventional bootstrap   1118.32              1000              118.32               11.83                (1114.37, 1121.86)     7.49
RBF + improved bootstrap       1083.41              1000              83.41                8.34                 (1080.13, 1089.15)     9.02