Article

Alternative Measures of Dependence for Cyclic Behaviour Identification in the Signal with Impulsive Noise—Application to the Local Damage Detection

by Justyna Hebda-Sobkowicz 1,*, Jakub Nowicki 1, Radosław Zimroz 1 and Agnieszka Wyłomańska 2

1 Faculty of Geoengineering, Mining and Geology, Wroclaw University of Science and Technology, Na Grobli 15, 50-421 Wroclaw, Poland
2 Faculty of Pure and Applied Mathematics, Wroclaw University of Science and Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wroclaw, Poland
* Author to whom correspondence should be addressed.
Electronics 2021, 10(15), 1863; https://doi.org/10.3390/electronics10151863
Submission received: 31 May 2021 / Revised: 27 July 2021 / Accepted: 30 July 2021 / Published: 3 August 2021
(This article belongs to the Section Circuit and Signal Processing)

Abstract

The local damage detection procedures in rotating machinery are based on the analysis of the impulsiveness and/or the periodicity of disturbances corresponding to the failure. Recent findings related to non-Gaussian vibration signals showed some drawbacks of the classical methods. If the signal is noisy and strongly non-Gaussian (heavy-tailed), searching for impulsive behavior is pointless, as both informative and non-informative components are transients. The classical dependence measure (autocorrelation) is not suitable for non-Gaussian signals. Thus, there is a need for new methods for hidden periodicity detection. In this paper, an attempt is made to use alternative measures of dependence used in time series analysis that are less known in the condition monitoring (CM) community. They are proposed as alternatives to the classical autocovariance function used in cyclostationary analysis. The methodology of the auto-similarity map calculation is presented, and a procedure for assessing the “quality” or “informativeness” of the map is proposed. In the most complex case, the proposed techniques based on the Kendall, Spearman and Quadrant autocorrelations turned out to be the most resistant to heavy-tailed noise, whereas in the case of a local fault disturbed by Gaussian noise, the most efficient proved to be the commonly known approach based on the Pearson autocorrelation. The ideas proposed in the paper are supported by simulated signals and real vibrations from heavy-duty machines.

1. Introduction

Many phenomena observed in the real world reveal cyclic behavior, for example, meteorological data [1,2], hydrological data [3,4], air quality data [5,6] and financial data [7,8]. If the length of the cycle is approximately constant, or the domain of observation can be rescaled so that the phenomenon becomes periodic, one can take advantage of well-established mathematical theory to model such processes. Vibration data from rotating machines will be used to illustrate the problem. It should be said that periodically correlated processes, in other words second-order cyclostationary processes, provide probably the most intuitive and powerful framework for condition monitoring applications, especially for local damage detection in rotating machinery such as bearings or gearboxes. The identification of cyclostationary properties in the signal is related to the detection of periodicity in the signal, often hidden, that is, masked by non-informative components. It should be clearly said that periodicity detection does not mean the simple usage of spectral analysis. Cyclostationary signals are a special class of nonstationary signals whose statistics change periodically in time [9].
The most common approach is based on the autocovariance function (or Pearson autocorrelation measure) calculated for frequency bands of the spectrally decomposed signal, obtained via a filter bank or the short-time Fourier transform (STFT). Unfortunately, this methodology can be applied only under a strong assumption of a finite-variance distribution of the noise. The classical example is the Gaussian distribution. In many research works, such an assumption is reasonable (the character of the noise is strongly Gaussian) or it is just a way to simplify a complex problem. For processes in which some unexpected disturbances may appear (due to various reasons, e.g., disturbances in the measurement system or disturbances related to the natural process of the machine operation), the assumption of the Gaussian character of the signal may be an oversimplification. From the theoretical point of view, the classical approaches for periodic behavior identification cannot be applied to non-Gaussian signals. Clearly, the value of the estimator (i.e., the value of the statistic) can be calculated for a real dataset; however, the obtained value might be non-informative, as the estimator does not approximate (estimate) any theoretical quantity. Most of the classical methods for periodic behavior detection are very sensitive to large observations, and thus the crucial information obtained by these methods can be hidden when they are applied to signals with impulsive behavior. This is the case described in the applications section of this paper. Although cyclostationary analysis is a powerful approach for damage detection in various situations, it should be said that many machines generate vibration signals that can be described as non-Gaussian processes [10,11,12]. In such a case—where classical cyclostationary approaches fail—one needs alternative techniques for periodicity measurement (dependence measurement) that are better suited to non-Gaussian processes [13,14,15,16,17,18].
This work is a continuation of the authors’ previous research focused on cyclostationary analysis for heavy-tailed signals with an application to local damage detection. Here, we extend the formulation proposed by Żak [19], Moshrefzadeh [20] and recently by Kruczek [18] by providing a set of alternative measures that could be useful in local damage detection. Moreover, it is demonstrated that the proposed technique based on alternative measures of dependence can provide clearer diagnostic information, especially if localized damage is immersed in heavy background noise.

1.1. Local Damage Detection—A Brief Overview

The local damage detection methods for gearboxes and bearings attract the attention of many researchers; thus, there are many diagnostic signal processing methods designed to diagnose the damage. Mostly, they are based on a physical understanding of the damage signature (impulsive, cyclic, modulated, nonlinear, etc. phenomena) [21]. Recently, Kumar et al. [22] discussed the latest and most widely used diagnosis methods for gearbox condition monitoring.
Bearings are among the most used elements in rotating machinery, and their failure is the most important reason for machinery breakdown. As the vibration signal from a damaged machine is expected to be nonstationary, time–frequency methods have been favoured by many researchers [23,24]. Moreover, the contact of a damaged rolling element with the inner race, outer race, or their damaged surfaces produces an impulsive (wide-band) disturbance; hence, the usage of a time–frequency representation allows understanding of the nature of the informative signal and its location in frequency—often called the informative frequency band (IFB). The second issue is critical because often the final stage of the diagnostic process is envelope spectrum analysis [25,26] to identify fault frequencies. Wavelet analysis is used both for time–frequency representation and for signal prefiltering [27,28,29,30]. Wavelets combined with a hidden Markov model (HMM) have been used in [31] to detect abnormal behavior. A discussion of the usage of HMMs for bearing diagnostics and prognostics can be found in [32,33]. Empirical Mode Decomposition (EMD) has also been proposed in many applications as a data-driven approach to signal decomposition [34]. Although wavelet analysis is very powerful, its use raises many questions related to the base selection, depth of decomposition, stopping criteria, and so forth. As the nature of the signal is generally known, many authors proposed various statistical methods for filter design such as the kurtogram [35], spectral kurtosis [36] and many similar approaches [37,38,39]. Optimal and adaptive filters have also been used for signal enhancement [27,40,41,42]. The last two decades delivered (mostly thanks to the massive pioneering work of Jerome Antoni) a solid theoretical background of cyclostationary analysis with engineering applications to gearboxes and rolling element bearings [9,43]. First-order cyclostationarity, related to the deterministic component associated with rotating elements such as shafts or gear wheels, is in general not interesting from a local damage detection perspective. Second-order cyclostationarity perfectly describes the cyclic impulsive disturbance related to local damage [44,45,46].
The key issue is to “unravel” or, more precisely, to decompose the signal into a family of narrow-band subsignals and measure the periodicity in each of the subsignals. The commonly used way of decomposing the signal into the time–frequency domain is to use the STFT [47] or wavelets [48,49]. The typical example of a measure of periodicity is the autocovariance function. However, such a measure has some limitations that will be discussed in the next sections.

1.2. Periodically Correlated Processes and Measures of Dependence

As mentioned, for the identification of hidden cyclostationary components in noisy signals, one may use the autocovariance function as a measure of dependence. Unfortunately, the autocovariance function is sensitive to the outliers present in a non-Gaussian signal, and one may say that for highly impulsive disturbances the autocovariance is not able to deliver appropriate information about periodicity [50]. To be more precise, different cases of a signal with a local fault are presented in Figure 1. The vibration signal of Gaussian noise with a local fault with a carrier frequency of 2.5 kHz is visible in Figure 1a and its spectrogram is presented in Figure 1b. The kurtosis of this signal is equal to 3.29. Figure 1c presents the same vibration data but with additional non-cyclic impulses with a carrier frequency of 6 kHz, which make the background noise heavy-tailed—its spectrogram is shown in Figure 1d. The kurtosis of this signal is equal to 200.86 (a large value of the kurtosis is typical for heavy-tailed data). The last signal, presented in Figure 1e, is a non-Gaussian signal with a local failure where the cyclic fault impulses have a carrier frequency equal to 5 kHz—its time–frequency representation is visible in Figure 1f. The kurtosis of this signal is equal to 202.16. Based on the vibration signals in the time domain and time–frequency domain, see Figure 1, one can clearly see the heavy-tailed behaviour of the data. The corresponding Pearson autocorrelation maps for each of the signals with a local fault are presented in Figure 2. The Pearson autocorrelation map uses a normalized autocovariance function (called autocorrelation) to investigate the existence of cyclic subsignals in the time–frequency domain of the tested signal. Figure 2a shows that the Pearson autocorrelation works properly in the case of background Gaussian noise, and there are strongly visible cyclic impulses in the frequency band 2–3 kHz. In Figure 2b, one can see that the IFB is in a different frequency band than the wide-band disturbances in the range of 4–10 kHz; however, the diagnostic information is still noticeable (2–3 kHz).
In Figure 2c, a similar example of the Pearson autocorrelation map for the signal with a local fault and non-Gaussian noise is presented; however, this time, the informative frequency band lies inside the range of the wide-band disturbances. In such a case, the extraction of diagnostic information is not possible. The autocorrelation response in the informative band is misleading—it does not respond to the occurring dependencies [17].
Such a case is the most critical one. The purpose of this study is to investigate a simpler case, in which the informative and non-informative frequency bands do not overlap at all, or overlap only partially [51]. In such a case, the detection of the IFB is possible, but more difficult than in the Gaussian case. This formulation of the problem provides a chance to define criteria for evaluating the quality of the map.
Regardless of the size of the spectral overlap, more robust measures are needed. Żak proposed several solutions for practical applications, such as the autocovariation, the autocodifference, and FLOC (fractional lower order covariance) [19,52,53].
Recently, Kruczek presented a deeper theoretical study of alternative measures of dependence with application to rolling element bearings [17,18]. Nowicki also analysed novel measures for IFB detection [54]. As this direction of research is very promising, a deep study of the literature allowed the collection of a few more possible solutions.
The discussion in this article provides an overview of several dependence measures: the Pearson autocorrelation, autocovariation, autocodifference, Spearman autocorrelation, Kendall autocorrelation and Quadrant autocorrelation. These measures are used to design—by analogy with the dependence map in [19]—an auto-similarity map that allows the identification of a frequency band with cyclostationary content. Such information is the basis for further filter design procedures. Moreover, it may be used in the local fault detection procedure. In this paper, the focus is on the comparison and evaluation of the quality of the information provided by each method.
It should be stated that damage detection in the presence of non-Gaussian noise is a dynamically developing area of research. Borghesani [55] studied second-order cyclostationary signals in the presence of non-Gaussian noise and proposed several envelope indicators. Wylomanska et al. [10] proposed an impulsive noise cancellation procedure based on regime-switching models. Wodecki et al. introduced nonnegative matrix factorisation for impulsive source separation [56]. In the comparative study provided by Hebda-Sobkowicz et al. [57], several methods were recently investigated in the context of local damage detection in the presence of impulsive noise. The study included spectral kurtosis [36], the kurtogram [35], the stability index (Alpha selector) [58], the spectral Gini selector [39], the spectral smoothness index [59], the infogram [38] and the conditional variance-based selector [60]. There are also other known approaches, such as the protrugram and its modified version [61], which aim to overcome the disadvantages of the kurtogram. In 2018, Moshrefzadeh [20] proposed the autogram, which is based on the idea of the kurtogram and protrugram but utilizes the autocorrelation and a modified definition of kurtosis. In a very recent solution provided by Schmidt [62], the synchronous averaging procedure was updated by replacing the mean with a median; it is less sensitive to high-amplitude random impulses and the final “averaged” signal is significantly “de-noised”. Interesting solutions are provided by Luan [63] and Zhao [64]—both related to cyclic correntropy. In [65], the efficiency of cyclostationary analysis was studied for various levels of non-Gaussian contribution. The conclusion from this analysis is that for a small level of non-Gaussian noise the classical analysis can still be used; even if the results are a bit noisy, the diagnostic information can be extracted. When the level of non-Gaussian noise becomes stronger, interpreting the bi-frequency map becomes harder and finally impossible. The papers mentioned above are a motivation for searching for more robust techniques. The main novelty of the paper and the authors’ contribution are related to two topics. First, we propose new methods for local damage detection in signals with non-Gaussian noise. The second issue is much more general and corresponds to the problem of cyclic behavior identification for processes with heavy-tailed distributions.
The rest of the paper is organized as follows: In Section 2, the problem is formulated in detail. Next, in Section 3, the measures of dependence used are recalled, together with the step-by-step methodology of the autocorrelation map calculation. The criteria for assessing the quality of a map are formulated. In Section 4, the effectiveness of the proposed approach is presented for a simulated signal as well as for a real signal from a crushing machine operating in a real environment. Section 5 concludes the paper.

2. Problem Formulation

The main aim of this paper is to develop a robust method for cyclic behavior identification in a signal with impulsive noise. As mentioned, the classical measure of dependence used in cyclostationary analysis, namely the Pearson autocorrelation, does not provide clear diagnostic information, as it is sensitive to outliers (impulsive noise). Thus, alternative measures of dependence will be recalled (usually used for testing cross-dependence in data), reformulated as auto-similarity measures, and used for the design of the auto-similarity map. The definition of the auto-similarity map proposed by the authors, which consists of an automatic and universal procedure, makes it possible to apply different measures of dependence, including those not considered in this paper. An example of the application of this universal procedure is the Spearman and Kendall correlation measures, which so far have not been used as self-dependence measures, and the Quadrant measure—so far unknown in machine diagnostics. Finally, one will be able to apply the auto-similarity map to the vibration signal from a machine with local damage. There are several issues to address:
  • A procedure for the general auto-similarity map calculation is needed;
  • A procedure for the auto-similarity measures to be used in the auto-similarity map is needed;
  • A procedure for evaluating the quality of the map is needed.
The novelty of the paper is related to: a novel measure of dependence (the Quadrant correlation); a new, automatic procedure for auto-similarity map calculation; a detailed analysis of the performance of the considered measures of dependence; and the proposed criteria for their comparison. The main assumption regarding the quality of the maps is very intuitive: one should expect a higher dependence for the informative frequency bands and a much smaller dependence for the background noise elsewhere. The presence of impulsive noise should not affect the dependence related to the informative frequency band and should not introduce any dependence outside that band. Finally, we will define criteria that one may use for the selection of the optimal dependence measure for the considered problem.

3. Methodology

In this section, the proposed methodology is described. Firstly, the input signal is transformed into the time–frequency domain. Using the STFT [66], a spectrogram is obtained as the time–frequency map:
S(t, f) = |STFT(t, f)|^2,
where the formula of the STFT is defined as follows:
STFT(t, f) = \sum_{k=1}^{N} X(k)\, w(t - k)\, e^{-2 j \pi f k / N},
where w(t − k) is the shifted window, X = X(1), …, X(N) is the input signal of length N, t ∈ T is the time point, f ∈ F is a frequency, and j is the imaginary unit; see [66] for more details.
Each sub-signal Y_i = S(·, f_i) of the time–frequency map S(t, f) contains the energy flow of the signal X associated with the given frequency bin f_i and time interval t, and it is investigated for auto-similarity in time. Consequently, the auto-similarity map map(i, k) = automeasure(Y_i(1:n), k − 1) for k = 1, …, n is created, see Algorithm 1, where automeasure(·,·) is substituted by one of the six tested measures of dependence. The measures of dependence used for the investigation of auto-similarity are presented in the next subsection. The proposed measures are tested for performance, but using the given methodology one can apply any other measure of dependence.
Algorithm 1: Auto-similarity maps calculation
  Data:
  Input signal X(t) for t = 1, 2, …, N
  Calculate spectrogram: S(t, f) = |STFT(X, window, nfft, noverlap, fs)|^2,
  where window is a Hann window, nfft is the number of frequency points, noverlap is the size of the overlap and fs is the sample rate,
  Calculate automeasure:
  m—number of rows in matrix S
  n—number of columns in matrix S
  automeasure(Y_i, k)—function calculating the value of the measure for sub-signal Y_i and time lag k
  for i = 1, …, m do
    for k = 1, …, n do
      map(i, k) = automeasure(Y_i(1:n), k − 1)
    end
  end
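A minimal Python sketch of Algorithm 1 is given below, assuming a SciPy-style spectrogram and any of the dependence estimators from Section 3.1 passed in as automeasure(y, k); the helper names and default parameters are illustrative, not the authors' implementation.

import numpy as np
from scipy.signal import spectrogram

def auto_similarity_map(x, fs, automeasure, nperseg=256, noverlap=217, nfft=512):
    # Time-frequency decomposition: S(t, f) = |STFT(t, f)|^2 (up to window scaling)
    f, t, S = spectrogram(x, fs=fs, window='hann', nperseg=nperseg,
                          noverlap=noverlap, nfft=nfft, mode='magnitude')
    S = S ** 2
    m, n = S.shape                      # m frequency bins, n time points
    amap = np.zeros((m, n))
    for i in range(m):                  # each sub-signal Y_i = S(., f_i)
        for k in range(n):              # time lags 0, ..., n - 1
            amap[i, k] = automeasure(S[i, :], k)
    return f, amap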

3.1. Definition of Dependence Measures

In this section, we recall the definitions of six different measures of dependence, that is, the Pearson autocorrelation, the autocodifference, the autocovariation, and the Spearman, Kendall and Quadrant autocorrelation measures.
Let {Z(t)}, t ∈ ℤ, be a time series with variance σ_Z² and mean μ_Z.

3.2. Pearson Autocorrelation

The Pearson autocorrelation, often called simply “autocorrelation”, is defined as follows [67]:
r(k) = \frac{\mathrm{cov}(Z(t), Z(t+k))}{\sigma_Z^2},
where cov(·,·) is the covariance function. The estimator of the Pearson autocorrelation for the data vector z = z(1), …, z(n) is defined as follows [68]:
\hat{r}(k) = \frac{\sum_{t=1}^{n-k} (z(t) - \bar{z})(z(t+k) - \bar{z})}{\sum_{t=1}^{n} (z(t) - \bar{z})^2},
where z̄ is the sample mean. The Pearson autocorrelation, as a function of the time lag k, measures the correlation of a signal with a delayed copy of itself. In other words, it quantifies the similarity between observations as a function of the time lag between them.
The analysis of the Pearson autocorrelation enables finding repeating patterns, such as the presence of a periodic signal disturbed by noise, as the autocorrelation of a periodic function is itself a periodic function with the same period. The values of the Pearson autocorrelation are in the range [−1, 1]. Values close to 1 or −1 indicate a strong linear relationship (positive and negative, respectively). If the process {Z(t)} constitutes a sequence of independent random variables, then the autocorrelation takes the value 0. However, zero autocorrelation does not imply independence [69]; some (nonlinear) dependence can still exist.
Thanks to its simple definition and interpretation, the Pearson autocorrelation is a standard tool for testing linear dependence in statistical data [70]. However, for stochastic processes with heavy-tailed distributions, this tool can be insufficient [54]. Then, more general measures of dependence are recommended.
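For concreteness, a minimal Python sketch of the estimator r̂(k) above (the function name and vectorized form are ours, not the authors' code):

import numpy as np

def pearson_autocorr(z, k):
    z = np.asarray(z, dtype=float)
    d = z - z.mean()
    num = np.sum(d[:len(z) - k] * d[k:])   # sum_{t=1}^{n-k} (z(t) - mean)(z(t+k) - mean)
    den = np.sum(d ** 2)                   # sum_{t=1}^{n} (z(t) - mean)^2
    return num / den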

3.3. Autocodifference

The autocodifference is a measure that enables searching for cyclic behavior in heavy-tailed data. It is well-defined, which means that it exists for a wide class of distributions. The properties of the autocodifference are thoroughly described in [71,72]. The autocodifference at time lag k for the time series {Z(t)} is given by [71]:
CD(k) = \log \mathbb{E}\, e^{j(Z(t) - Z(t-k))} - \log \mathbb{E}\, e^{jZ(t)} - \log \mathbb{E}\, e^{-jZ(t-k)}.
The estimator of the autocodifference can be defined as follows [53]:
\hat{CD}(k) = -\frac{n-k}{n} \left[ \log \phi(0,-1;k) + \log \phi(1,0;k) - \log \phi(1,-1;k) \right],
where ϕ is the empirical characteristic function for the considered sample, with the following definition [53]:

\phi(p, q; k) = \frac{1}{n-k} \sum_{t=1}^{n-k} \exp\left( j\, (p\, z(t+k) + q\, z(t)) \right), \quad k \geq 0,

for points p and q from {−1, 0, 1}, and the imaginary unit j, where z = z(1), …, z(n) is the set of observations corresponding to {Z(t)}.
The autocodifference is an appropriate tool for measuring the relationship between random variables from a more general (infinitely divisible) class of distributions. For Gaussian processes, the autocodifference reduces to the autocovariance with a negative sign.
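A sketch of the autocodifference estimator, assuming the sign conventions reconstructed above (they should be verified against [53]):

import numpy as np

def ecf(z, p, q, k):
    # phi(p, q; k) = mean over t of exp(j * (p * z(t + k) + q * z(t)))
    z = np.asarray(z, dtype=float)
    n = len(z)
    return np.mean(np.exp(1j * (p * z[k:n] + q * z[:n - k])))

def autocodifference(z, k):
    n = len(z)
    cd = (np.log(ecf(z, 0, -1, k)) + np.log(ecf(z, 1, 0, k))
          - np.log(ecf(z, 1, -1, k)))
    # CD is complex in general; its real part is typically used for the map
    return -(n - k) / n * cd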

3.4. Autocovariation

The autocovariation as a function of the time lag k is defined for the time series {Z(t)} as follows [58]:
CV(k) = \frac{\mathbb{E}\left[ Z(t)\, \mathrm{sign}(Z(t-k)) \right]}{\mathbb{E}\, |Z(t-k)|}.
The estimator of the autocovariation for z = z(1), …, z(n) is given by [53]:

\hat{CV}(k) = \frac{\sum_{t=l}^{r} z(t)\, \mathrm{sign}(z(t-k))}{\sum_{t=1}^{n} |z(t)|},

where l = max(1, 1 + k) and r = min(n, n + k).
Covariation was originally defined for α-stable distributions, which belong to the class of heavy-tailed distributions [71]. The presented definition is a special case of the FLOC (fractional lower order covariance) measure [73] and can be used for any heavy-tailed distribution for which the theoretical mean exists. More properties of covariation can be found in [71].
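A minimal sketch of the autocovariation estimator for a non-negative lag k (so that l = 1 + k and r = n):

import numpy as np

def autocovariation(z, k):
    z = np.asarray(z, dtype=float)
    # sum_{t=l}^{r} z(t) * sign(z(t - k)), divided by sum_{t=1}^{n} |z(t)|
    num = np.sum(z[k:] * np.sign(z[:len(z) - k]))
    den = np.sum(np.abs(z))
    return num / den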

3.5. Spearman Autocorrelation

The usage of the Spearman correlation for cyclic behavior detection in the diagnostic procedure has already been presented in [51,54]. Nowicki et al. [54] showed that the well-known Spearman correlation coefficient allows distinguishing between cyclic (informative) and noncyclic (non-informative) impulses.
The Spearman autocorrelation for the time series {Z(t)} and time lag k is given by:

\rho(k) = \frac{\mathrm{cov}(R(t), R(t+k))}{\sigma_R^2},

where {R(t)} is the rank sequence corresponding to the time series {Z(t)} and σ_R is the standard deviation of R.
The estimator of the Spearman autocorrelation is given by [74]:
\hat{\rho}(k) = \frac{\sum_{t=1}^{n-k} (r(t) - \bar{r})(r(t+k) - \bar{r})}{\sum_{t=1}^{n} (r(t) - \bar{r})^2},
where r(1), …, r(n) are the ranks of the observations of the sample z(1), z(2), …, z(n) and r̄ is the sample mean of the rank vector. The Spearman autocorrelation takes values in [−1, 1] and tests for a monotonic relationship in the data vector as the time lag k changes.
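A sketch of the Spearman estimator: ranks are computed once and the Pearson formula is applied to the rank sequence (rankdata resolves ties by averaging, an assumption not stated in the text):

import numpy as np
from scipy.stats import rankdata

def spearman_autocorr(z, k):
    r = rankdata(z)                 # ranks r(1), ..., r(n)
    d = r - r.mean()
    num = np.sum(d[:len(r) - k] * d[k:])
    den = np.sum(d ** 2)
    return num / den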

3.6. Kendall Autocorrelation

In Reference [54], the authors present the usage of the Kendall correlation as an IFB selector in a diagnostic procedure of local fault detection. As they show, the Kendall correlation-based selector has similar efficiency to the Spearman correlation-based selector. The Kendall autocorrelation for the time series {Z(t)} and time lag k is defined as follows [75]:
\tau(k) = P\left[ (Z(t) - Z'(t))(Z(t+k) - Z'(t+k)) > 0 \right] - P\left[ (Z(t) - Z'(t))(Z(t+k) - Z'(t+k)) < 0 \right],

where (Z'(t), Z'(t+k)) is an independent copy of (Z(t), Z(t+k)).
The above definition can also be written as:
\tau(k) = \mathrm{cov}\left[ \mathrm{sgn}(Z(t) - Z'(t)),\ \mathrm{sgn}(Z(t+k) - Z'(t+k)) \right],
where sgn(·) is the signum function. The Kendall autocorrelation uses pairs of observations and determines the strength of the relationship based on the pattern of concordance and discordance between the pairs. The calculation of the Kendall correlation requires that the process can be ordered.
The estimator of the Kendall autocorrelation is as follows [69]:
\hat{\tau}(k) = \frac{2}{n(n-1)} \sum_{t > j} \mathrm{sgn}\left( (z(t) - z(j))(z(t+k) - z(j+k)) \right),
where z(1), …, z(n) are the observations corresponding to the time series {Z(t)}.
The Kendall autocorrelation usually takes smaller values than the Spearman autocorrelation [69]. The Kendall and Spearman autocorrelations measure the relation between rankings; in consequence, they are invariant under any monotonically increasing transformation of Z(t) and Z(t+k), whereas the Pearson autocorrelation is invariant only under linear transformations [69].
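A naive O(n²) sketch of the Kendall estimator τ̂(k); for long signals scipy.stats.kendalltau offers a faster alternative (it applies tie corrections, so results may differ slightly when ties occur):

import numpy as np

def kendall_autocorr(z, k):
    z = np.asarray(z, dtype=float)
    x, y = z[:len(z) - k], z[k:]            # pairs (z(t), z(t+k)); n below is their count
    n = len(x)
    sx = np.sign(x[:, None] - x[None, :])   # sign(x_t - x_j) for all index pairs
    sy = np.sign(y[:, None] - y[None, :])
    iu = np.triu_indices(n, 1)              # each unordered pair counted once
    return 2.0 / (n * (n - 1)) * np.sum(sx[iu] * sy[iu])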

3.7. Quadrant Autocorrelation

Another measure of dependence discussed in this paper is the Quadrant autocorrelation (known also as the sign autocorrelation coefficient or the Blomqvist statistic [76]).
The definition of the Quadrant autocorrelation for the time series {Z(t)} and time lag k can be written as:

q(k) = \mathbb{E}\left[ \mathrm{sign}(Z(t) - \mu_Z)\, \mathrm{sign}(Z(t+k) - \mu_Z) \right].
The estimator of the Quadrant autocorrelation is [76]:

\hat{q}(k) = \frac{1}{n-k} \sum_{t=1}^{n-k} \mathrm{sign}(z(t) - \mathrm{med}(z))\, \mathrm{sign}(z(t+k) - \mathrm{med}(z)),

where med(z) is the median of the sample z(1), z(2), …, z(n). The Quadrant correlation is equal to 1 for a strong positive correlation, −1 for a strong negative correlation and 0 for no correlation [77]. However, the Quadrant correlation may take 1 or −1 without the data exhibiting a perfect linear relationship.
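A minimal sketch of the Quadrant estimator q̂(k):

import numpy as np

def quadrant_autocorr(z, k):
    z = np.asarray(z, dtype=float)
    s = np.sign(z - np.median(z))          # signs of the median-centered series
    return np.mean(s[:len(z) - k] * s[k:]) # average of sign products at lag k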
In our analysis, the above-mentioned measures of dependence are applied to the subsignals Y_i from the time–frequency map S(t, f).

3.8. The Quality of Maps Criterion

It can be assumed that an input signal consists of three main components: Gaussian noise, non-Gaussian noise, and the Signal of Interest (SOI). Each of them behaves in a specific way in certain frequency bands. Therefore, the auto-similarity map is divided into three (or more) parts so that each part corresponds to a different signal component: noncyclic noise or the cyclic signal of the fault. In this way, it is possible to investigate how the considered measures of dependence react to the specific signal components. The result is satisfactory when the auto-similarity is low for subsignals in the Gaussian and non-Gaussian noise parts and high for subsignals corresponding to the IFB (cyclic components of the local fault).
To facilitate the comparative analysis, the parts of the auto-similarity maps corresponding to the previously highlighted components are extracted, creating separate matrices, and each matrix is integrated into a vector by calculating the median for each time lag. Presenting all medians on one plot allows comparing how a given measure of dependence reacts to each signal component and what pattern of behavior it creates.
A special focus is on the median vectors for the parts related to the non-Gaussian noise and the IFB. For a deeper analysis of the first one, a boxplot of the vector values is used, while the analysis of the informative part is described in detail below.
To reduce the complexity of the analysis, a simple aggregated indicator is introduced, called the Impulsiveness Indicator (IMPI). In this case, the normalization determines the percentage share of the sum of the Amplitudes of the Information Signal (AIS), that is, the M components found by the local-maximum function, in the overall sum of the autocorrelation (ACORR) function. The considered measure is defined as:
IMPI = \frac{\sum_{i=1}^{M} AIS_i}{\sum_{k=1}^{m} ACORR_k},
where M is the number of components to analyse and m is the number of lags used to calculate the total energy, which corresponds to the last of the M components. In our analysis, the number of components M has been set arbitrarily to ten—this should be good enough to detect the cyclic behavior [60]. Thus, the total sum of the autocorrelation function is calculated from the given measure up to the lag corresponding to the last (10th) peak. The IMPI takes higher values for signals in which cyclic impulses appear. The lack of an impulsive component implies that the IMPI converges to zero. To calculate the IMPI, one can follow Algorithm 2:
The above procedure, described in Section 3, is supported by Monte Carlo simulations—100 iterations of the signal with different signal properties—and real vibrations from heavy-duty machines. More details are provided in the next section.
Algorithm 2: IMPI—Impulsiveness Indicator (the quality of map criterion)
  Data: auto-similarity map map(i, k), F—frequency vector,
  range of the informative frequency band—[f_min, f_max],
  Calculate:
  p_1—sample index of f_min in vector F
  p_2—sample index of f_max in vector F
  n—number of columns in map(i, k)
  Calculate median:
  for k = 1, …, n do
    ACORR(k) = median(map(p_1:p_2, k))
  end
  Calculate: [AIS_1, …, AIS_10]—the maxima of the first 10 peaks of the median vector ACORR
  Calculate ratio:
  IMPI = sum([AIS_1, …, AIS_10]) / sum(ACORR)
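A Python sketch of Algorithm 2 under stated assumptions (peak picking via a generic local-maximum search; the original peak-selection details are not specified beyond the first 10 peaks):

import numpy as np
from scipy.signal import find_peaks

def impi(amap, freqs, f_min, f_max, n_peaks=10):
    rows = (freqs >= f_min) & (freqs <= f_max)   # rows of the informative frequency band
    acorr = np.median(amap[rows, :], axis=0)     # median over the band for each lag
    peaks, _ = find_peaks(acorr)                 # local maxima -> AIS candidates
    peaks = peaks[:n_peaks]                      # keep the first M = 10 peaks
    ais = acorr[peaks]
    m = peaks[-1]                                # lag index of the last used peak
    return np.sum(ais) / np.sum(acorr[:m + 1])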

4. Results

In this section, the effectiveness of the proposed methodology is presented for the simulated signal as well as for the real signal from the crushing machine operating in a real environment.

4.1. Analysis of the Simulated Data

The duration of the simulated signal is 1 s and the sampling frequency is 25,000 Hz (25,000 samples). It consists of three elements: Gaussian noise N(0, 0.2); non-cyclic impulses uniformly distributed over the whole time interval with amplitudes generated from the uniform distribution U(0, 8.5); and cyclic impulses generated with a frequency of 30 Hz with fixed impulse amplitudes equal to 0.2. The assumed signal length is sufficient to perform a reliable analysis (it includes 30 cycles of the signal of interest and several noncyclic impulses) while not extending the calculation time.
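A hedged sketch of how a comparable test signal can be generated; the exponentially decaying tone-burst impulse shape and the count of 25 noncyclic impulses are our assumptions (the exact count is not stated above), while the distributions and carrier frequencies follow the description:

import numpy as np

rng = np.random.default_rng(0)
fs, dur = 25_000, 1.0
t = np.arange(int(fs * dur)) / fs
x = rng.normal(0, 0.2, t.size)                  # background Gaussian noise N(0, 0.2)

def burst(t0, amp, fc, tau=1e-3):
    # decaying tone burst starting at t0 with carrier frequency fc (assumed shape)
    w = np.zeros_like(t)
    idx = t >= t0
    w[idx] = amp * np.exp(-(t[idx] - t0) / tau) * np.sin(2 * np.pi * fc * (t[idx] - t0))
    return w

for t0 in np.arange(0, dur, 1 / 30):            # cyclic impulses, 30 Hz, fc = 2.5 kHz
    x += burst(t0, 0.2, 2500)
for t0 in rng.uniform(0, dur, 25):              # noncyclic impulses, U(0, 8.5), fc = 6 kHz
    x += burst(t0, rng.uniform(0, 8.5), 6000)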
The simulated signal is presented in Figure 3a, and its time–frequency decomposition using the STFT is shown in Figure 3b (spectrogram). The following parameters are used to perform the STFT: a Hann window [66] of length 256, number of frequency points (nfft) equal to 512, number of overlapping samples (noverlap) set to 217 (≈85% of 256) and the sample rate (fs) equal to 25 kHz. As one can see in Figure 3b, the cyclic impulses are located in the frequency band 2–3 kHz (the carrier frequency is equal to 2.5 kHz), while the non-cyclic impulses appear in the range of 3–10 kHz (the carrier frequency is equal to 6 kHz).
The dependence measures described in Section 3.1 are applied in a further step to create the autocorrelation maps, see Figure 4. The common feature of all maps is a proper indication of the cyclic impulses—more or less precisely. However, the considered measures of dependence react differently to the noncyclic impulses. The Pearson autocorrelation map takes high values in the frequency range of 3–10 kHz (light blue bandwidth), similar to the autocovariation map, see Figure 4b. This is a consequence of the sensitivity of these measures to impulsive noise—high values in the data distort the results. In the autocodifference map (Figure 4c) one may see increased values in the frequency range of 3–10 kHz, but only for small time lags (0–0.06 s). On the remaining maps, that is, the Spearman (Figure 4d), Kendall (Figure 4e) and Quadrant (Figure 4f) auto-similarity maps, the high-amplitude noncyclic impulses are not visible and the values in the range of 3–10 kHz are similar to the values for the Gaussian noise occurring at 0–2 kHz and 10–12 kHz.
To analyze the behavior of each tested measure of dependence more deeply, attention is focused on four parts of the frequency band, see Figure 5: A: 0–2 kHz, B: 2–3 kHz, C: 3–10 kHz, and D: 10–12 kHz. For these highlighted frequency bands, that is, A, B, C and D, the medians for each time lag are calculated for each dependence map (see Figure 4); the results are shown in Figure 6. The results for the Pearson autocorrelation (Figure 6a) and the autocovariation (Figure 6b) are very similar, that is, the median trajectories for part B (2–3 kHz), associated with the cyclic impulses (see Figure 6, red lines), show cyclic peaks with values between 0.4 and 0.6. In the band associated with the impulsive noise (part C, 3–10 kHz) these measures behave similarly (see Figure 6, green line), that is, they react to the non-cyclic impulses (around 0.1, 0.2, 0.25 and 0.3 s) and have values in the range of 0.2–0.35. Considering the autocodifference map, see Figure 6c, it is worth mentioning that it indicates cyclic and noncyclic impulses, but only for small time lags (0–0.2 s). For the other lags, the median values for each part are approximately constant and their value is 0.2. The Spearman (Figure 6d), Kendall (Figure 6e) and Quadrant (Figure 6f) autocorrelation measures do not react to the impulsive noise in the frequency band of part C (3–10 kHz); therefore, the trajectories of the medians for this frequency band (Figure 6, green line) are similar to the trajectories of the medians corresponding to the Gaussian noise at 0–2 kHz (part A) and 10–12 kHz (part D) (see Figure 6, blue and magenta lines). The mentioned measures of dependence use ranks (Spearman, Kendall correlation) or the median (Quadrant correlation), which makes them more resistant to impulsive noise and superior to the other considered measures of dependence. However, each of these three measures (especially the Kendall and Quadrant autocorrelations) has peaks with smaller values than the Pearson autocorrelation and the autocovariation for the frequency band corresponding to part B.
As mentioned, the considered measures of dependence have different sensitivity to outliers. This property can also be seen in the graphs of the main statistics, that is, the boxplots, for the frequency bands of part C associated with the noncyclic impulses, see Figure 7. A boxplot is a graphical representation of the distribution of a statistical feature, which in this case is created from the trajectories associated with the heavy-tailed noise (green line in Figure 6a–f). The largest IQRs are for the Pearson autocorrelation and the autocovariation—as they respond to the impulsive noise. The other measures have a much lower IQR. The autocodifference has the smallest IQR, but, as mentioned, its values decrease rapidly (including the cyclic peaks).
Each of the considered measures of dependence indicates the cyclic impulses, more or less precisely, see Figure 8. To compare the results not only visually but also numerically, the IMPI is calculated (according to Equation (15)) for the median trajectories of part B (2–3 kHz). For the considered measures, the results are as follows (see Figure 8): Pearson autocorrelation—0.10951, Spearman autocorrelation—0.091236, autocovariation—0.09085, Kendall autocorrelation—0.084551, autocodifference—0.073151, and Quadrant autocorrelation—0.070546. These results mean that the best selection of the cyclic impulses is made by the Pearson autocorrelation. The Spearman autocorrelation also indicates the cyclic impulses with similar precision. The autocodifference and the Quadrant autocorrelation have the lowest IMPI values.
However, taking into account the responses of the measures to the heavy-tailed noise (see Figure 9), the most robust measure of dependence is the Kendall autocorrelation (it takes the smallest value of autocorrelation in the area of the impulsive disturbance—the same as for the area of Gaussian noise—and its values are regular over the whole time interval). Right behind the Kendall autocorrelation are the Spearman and Quadrant autocorrelations—they take higher values of autocorrelation in the area of the impulsive disturbance, but their values are also regular over the whole time interval.
The worst results for the heavy-tailed noise are obtained by the autocovariation, the Pearson autocorrelation, and the autocodifference.
Based on the results presented above, one may conclude that the most reasonable choice among the considered auto-dependency maps is the Spearman and Kendall maps, as they properly indicate the cyclic component and suppress the information about the impulsive noise.

4.2. Monte Carlo Simulations

To validate the results obtained in Section 4.1, Monte Carlo simulations are carried out. The basic boxplot statistics, that is, the median and the IQR, are calculated for the median vectors of the frequency band associated with the noncyclic impulses (to obtain the reference information presented in Figure 7) for different numbers of noncyclic impulses and their amplitudes. Additionally, the quality of the selection of the cyclic impulses by each measure is analyzed by computing the ratio given by the IMPI indicator, defined in Equation (15).
As mentioned above, the given statistics are calculated for different values of two parameters: the number of noncyclic impulses and their amplitudes. The first parameter belongs to the set [15:6:45], the second one is taken from the set [3:0.5:8.5]. For each case, 100 iterations are performed. To be more precise, a number of noncyclic impulses equal to 15 means that there are 15 noncyclic impulses per second in the signal, two times fewer than the number of cyclic impulses, which is fixed at 30 per second (30 Hz). An amplitude of the noncyclic impulses equal to 3 means that they are around 15 times larger than the background Gaussian noise generated from the distribution N(0, 0.2).
Based on the Monte Carlo simulations, one can conclude that the medians are constant and the change of the parameters does not affect them, see Figure 10. The smallest values are obtained for the Kendall autocorrelation (approximately 0.13), while the largest values are for the autocovariation (approximately 0.26). For the other measures, the median values are very similar and lie between 0.18 and 0.2.
As for the IQR, the results differ depending on the measure of dependence. For the Pearson autocorrelation and the autocovariation, the IQR increases significantly with the number of noncyclic impulses and with their amplitude, see Figure 11a,b. This is an intuitive result, as these measures are sensitive to impulsive noise; therefore, increasing the energy of the noncyclic impulses and their quantity makes the result worse. A similar trend is noticeable for the Spearman autocorrelation, but the increase is very weak (a difference of 0.01), see Figure 11d. The IQRs of the other measures (autocodifference, Kendall and Quadrant autocorrelations) remain approximately constant (see Figure 11c,e,f) and do not significantly change with the number of noncyclic impulses or their amplitudes. Thus, one can say that they are the most resistant to noncyclic impulses.
As mentioned, the IMPI (see Equation (15)) is also calculated during the Monte Carlo simulations. The best result is achieved by the Pearson autocorrelation, see Figure 12a. The IMPI values are between 0.11 and 0.13. The value slightly decreases with an increasing number of noncyclic impulses and their amplitude. The IMPI value for the autocovariation follows a similar trend, but its values are in the range of 0.09–0.1 (Figure 12b). The IMPI values for the other measures of dependence are approximately as follows (in descending order): Spearman (Figure 12d)—0.095, Kendall (Figure 12e)—0.085, Quadrant (Figure 12f)—0.075, and autocodifference (Figure 12c)—0.07.
As can be seen, considering the quality indicator (the higher the indicator, the more unambiguous the diagnostic information) in the band where the cyclic impulses occur, the Pearson autocorrelation is the best measure of dependence. However, if the influence of the noncyclic, high-energy impulses is taken into account, then the other measures, such as the Spearman, Kendall and Quadrant autocorrelations and the autocodifference, perform much better than the Pearson autocorrelation.

4.3. Analysis of the Real Vibration Data

In this section, the real vibration signal coming from a copper ore crusher is analyzed. The hammer crusher is presented in Figure 13, with the bearing and sensor locations marked in red and green, respectively. The signal duration is 10 s and the number of samples is 250,000, so the sampling frequency is 25,000 Hz. The vibration signals were collected using Endevco accelerometers, while the shaft speed was measured with a Brüel & Kjær laser probe. The collected data were recorded on an NI DAQ card using LabVIEW SignalExpress software.
The signal is presented in Figure 14a, and its representation in the time–frequency domain is shown in Figure 14b; the spectrogram parameters are the same as in Section 4.1. The informative frequency band (cyclic impulses) is located between 3 and 4 kHz, while the impulsive noise is in the range of 4–12 kHz.
Using the considered measures of dependence, the auto-similarity maps are created, see Figure 15. As one can see, the cyclic impulses are visible on each map. The main difference is in the identification of the noncyclic impulses. As for the simulated signal, the auto-similarity maps for the Pearson autocorrelation (Figure 15a) and the autocovariation (Figure 15b) show the impulsive noise. The auto-similarity map for the autocodifference (Figure 15c) also indicates these noncyclic impulses, but only for small time lags. The auto-similarity maps for the Spearman, Kendall, and Quadrant autocorrelations (see Figure 15d–f) are more homogeneous in the frequency band of 4–12 kHz than the other maps.
One can distinguish three specific bands on the auto-similarity map, each representing a different source of vibration: Gaussian noise at 0–3 kHz (part A1), the signal of interest at 3–4 kHz (part B1), and non-Gaussian (impulsive) noise at 4–12 kHz (part C1). For all these sets, the medians of the values of the auto-similarity map for each time lag are calculated, see Figure 16. The Pearson autocorrelation and autocovariation medians (see Figure 16a,b) have the highest peaks associated with the cyclic impulses bandwidth, but they also have distorted median values for the bandwidth corresponding to the impulsive noise (part C1). The other measures indicate much less disturbance in the noncyclic impulses band (see Figure 16c–f) and their median values are similar to the values for the bandwidth corresponding to the Gaussian noise. Thus, one can expect that these measures will be more appropriate for impulsive signals, especially when the carrier frequencies of the noncyclic and cyclic impulses overlap.
Regarding the impulsive noise, the difference in the results for each measure is also noticeable in the boxplots, which are applied to the vectors of medians of the frequency band corresponding to the carrier frequency of the heavy-tailed noise, see Figure 17. The largest IQRs are for the autocovariation (greater dispersion) and the Pearson autocorrelation. The Spearman, Kendall, and Quadrant autocorrelations have an IQR around twice as low. The autocodifference has the smallest IQR but, as mentioned above, its values rapidly decrease. The medians for the Pearson, Spearman, and Quadrant autocorrelations and the autocodifference are close to each other, at around 0.18. The median for the Kendall autocorrelation is approximately 0.13 and for the autocovariation it is around 0.25.
Figure 18 presents the medians of the values of the auto-similarity map for each time lag calculated from the bandwidth corresponding to the informative frequency band, with peaks marked in red. Based on that, the IMPI defined in Equation (15) is calculated for each measure of dependence. The values are as follows: Pearson autocorrelation—0.085806, autocovariation—0.083281, autocodifference—0.082679, Spearman autocorrelation—0.075579, Kendall autocorrelation—0.074124 and Quadrant autocorrelation—0.070759. The best selection of the cyclic impulses is made by the first three listed measures of dependence, which have the highest IMPI values. The cyclic impulses are also indicated by the other measures, but with slightly less precision.
However, taking into account the responses of the measures to the heavy-tailed noise (see Figure 19), the most robust measures of dependence are the Kendall, Quadrant and Spearman autocorrelations (they take the smallest values of autocorrelation in the area of the impulsive disturbances and their values are regular over the whole time interval).

5. Conclusions

In this paper, the efficiency of the presented measures of dependence, that is, the Pearson, Spearman, Kendall and Quadrant autocorrelations, the autocodifference, and the autocovariation, in the presence of non-Gaussian noise has been investigated. The comparison has been made to show that there are many dependence measures and, depending on the needs, one can choose the appropriate one. In the literature, one can find applications of the Pearson, Kendall, and Spearman correlations considered in this paper, as well as the autocovariation and autocodifference measures, in bearing diagnosis procedures, but the influence of the amplitude and number of impulses on the results has not been investigated. What is more, the proposed Quadrant autocorrelation is novel in this field. While the Spearman and Kendall correlations have commonly been used as cross-dependency measures, their reformulation into auto-similarity measures is innovative. The general procedure for the auto-similarity map calculation was presented in depth. This universal procedure allows one to apply different measures of dependence, not considered in this paper, and to create auto-similarity maps based on a measure adapted to the needs of the problem under consideration.
This paper has focused on comparing the results of different autocorrelation measures used in the auto-similarity map calculation. The auto-similarity map enables the observation of cyclic behavior in separate frequency bands, depending on the measure's responsiveness to the occurring noncyclic impulses. A detailed analysis of the performance of the auto-similarity maps and the criteria for their comparison were proposed.
The performance of the maps was checked in three different areas of behavior observed in the time–frequency representation of the heavy-tailed data with a local fault. The analysis was supported by Monte Carlo simulations—100 iterations of the signal with different signal properties—and a real vibration signal from a heavy-duty machine. The results of each of the auto-similarity maps were checked in the area of Gaussian noise, Gaussian noise with cyclic impulses of local damage, and heavy-tailed noise. In the case of only Gaussian noise, each of the maps works properly and takes small values without any artificial increments. However, if the noise is heavy-tailed, false increases in the value of the measures appear, especially in the case of the Pearson autocorrelation as well as the autocovariation and autocodifference. The most resistant to the heavy-tailed noise turned out to be the Kendall, Spearman, and Quadrant autocorrelations. In the case of the signal with only Gaussian noise and cyclic impulses of a local fault, without any high-energy disturbances, the Pearson autocorrelation detects the damage most efficiently (it takes the highest value of the IMPI indicator).
The proposed dependence maps based on different measures of auto-dependence show a great improvement in the visibility of the IFB against the noisy background. They can be an intermediate step for many known local fault detection methods. The final automatic procedure for the selection of the informative frequency band is not considered in this paper.
It is worth emphasizing that the considered simulated and real vibration signals contain cyclic and noncyclic impulses that do not overlap each other. This makes the analysis possible. Otherwise, if the carrier frequencies of the local fault and the noncyclic impulses are the same or largely overlap, the comparison of the considered measures of dependence for different sources of noise may be impossible. It is precisely this case that we draw attention to: we aim to highlight the resistance of the highlighted measures of dependence to non-cyclic impulses, which will be a crucial property in cases with overlapping carrier frequencies of the cyclic/noncyclic impulses.

Author Contributions

Conceptualization, J.H.-S., R.Z. and A.W.; methodology, J.H.-S., R.Z. and A.W.; software, J.N.; validation, J.N.; formal analysis, J.H.-S., J.N.; investigation, J.H.-S., R.Z. and A.W.; resources, J.H.-S., R.Z. and A.W.; data curation, R.Z.; writing—original draft preparation, J.H.-S., J.N., R.Z. and A.W.; writing—review and editing, R.Z., A.W.; visualization, J.N.; supervision, R.Z., A.W. All authors have read and agreed to the published version of the manuscript.

Funding

This activity has received funding from EIT RawMaterials GmbH under Framework Partnership Agreement No. 18253 (OPMO—Operational Monitoring of Mineral Crushing Machinery).

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Bloomfield, P.; Hurd, H.L.; Lund, R.B. Periodic correlation in stratospheric ozone data. J. Time Ser. Anal. 1994, 15, 127–150.
  2. Dargaville, R.J.; Doney, S.C.; Fung, I.Y. Inter-annual variability in the interhemispheric atmospheric CO2 gradient: Contributions from transport and the seasonal rectifier. Tellus B Chem. Phys. Meteorol. 2003, 55, 711–722.
  3. Jones, R.H.; Brelsford, W.M. Time series with periodic structure. Biometrika 1967, 54, 403–408.
  4. Bukofzer, D. Optimum and suboptimum detector performance for signals in cyclostationary noise. IEEE J. Ocean. Eng. 1987, 12, 97–115.
  5. Hebda-Sobkowicz, J.; Gola, S.; Zimroz, R.; Wyłomańska, A. Identification and statistical analysis of impulse-like patterns of carbon monoxide variation in deep underground mines associated with the blasting procedure. Sensors 2019, 19, 2757.
  6. Hebda-Sobkowicz, J.; Gola, S.; Zimroz, R.; Wyłomańska, A. Pattern of H2S concentration in a deep copper mine and its correlation with ventilation schedule. Measurement 2019, 140, 373–381.
  7. Broszkiewicz-Suwaj, E.; Makagon, A.; Weron, R.; Wyłomańska, A. On detecting and modeling periodic correlation in financial data. Phys. A Stat. Mech. Appl. 2004, 336, 196–205.
  8. Franses, P.H. Periodicity and Stochastic Trends in Economic Time Series; OUP Catalogue; Oxford University Press: Oxford, UK, 1996.
  9. Antoni, J.; Bonnardot, F.; Raad, A.; El Badaoui, M. Cyclostationary modelling of rotating machine vibration signals. Mech. Syst. Signal Process. 2004, 18, 1285–1314.
  10. Wyłomańska, A.; Zimroz, R.; Janczura, J.; Obuchowski, J. Impulsive noise cancellation method for copper ore crusher vibration signals enhancement. IEEE Trans. Ind. Electron. 2016, 63, 5612–5621.
  11. Wyłomańska, A.; Żak, G.; Kruczek, P.; Zimroz, R. Application of tempered stable distribution for selection of optimal frequency band in gearbox local damage detection. Appl. Acoust. 2017, 128, 14–22.
  12. Rachev, S.T.; Mittnik, S. Stable Paretian Models in Finance; Wiley: Hoboken, NJ, USA, 2000; Volume 7.
  13. Li, Q.; Liang, S.Y. Bearing incipient fault diagnosis based upon maximal spectral kurtosis TQWT and group sparsity total variation denoising approach. J. Vibroeng. 2018, 20, 1409–1425.
  14. Li, Q.; Liang, S.Y. Incipient fault diagnosis of rolling bearings based on impulse-step impact dictionary and re-weighted minimizing nonconvex penalty Lq regular technique. Entropy 2017, 19, 421.
  15. Yu, G.; Li, C.; Zhang, J. A new statistical modeling and detection method for rolling element bearing faults based on alpha–stable distribution. Mech. Syst. Signal Process. 2013, 41, 155–175.
  16. Chen, Z.; Ding, S.X.; Peng, T.; Yang, C.; Gui, W. Fault detection for non-Gaussian processes using generalized canonical correlation analysis and randomized algorithms. IEEE Trans. Ind. Electron. 2017, 65, 1559–1567.
  17. Kruczek, P.; Zimroz, R.; Wyłomańska, A. How to detect the cyclostationarity in heavy-tailed distributed signals. Signal Process. 2020, 172, 107514.
  18. Kruczek, P.; Zimroz, R.; Antoni, J.; Wyłomańska, A. Generalized spectral coherence for cyclostationary signals with alpha-stable distribution. Mech. Syst. Signal Process. 2021, 159, 107737.
  19. Żak, G.; Teuerle, M.; Wyłomańska, A.; Zimroz, R. Measures of dependence for α-stable distributed processes and its application to diagnostics of local damage in presence of impulsive noise. Shock Vib. 2017, 2017, 1963769.
  20. Moshrefzadeh, A.; Fasana, A. The Autogram: An effective approach for selecting the optimal demodulation band in rolling element bearings diagnosis. Mech. Syst. Signal Process. 2018, 105, 294–318.
  21. Randall, R.B. Vibration-Based Condition Monitoring: Industrial, Aerospace and Automotive Applications; John Wiley & Sons: Hoboken, NJ, USA, 2011.
  22. Kumar, A.; Gandhi, C.; Zhou, Y.; Kumar, R.; Xiang, J. Latest developments in gear defect diagnosis and prognosis: A review. Measurement 2020, 158, 107735.
  23. Gao, M.; Yu, G.; Wang, T. Impulsive Gear Fault Diagnosis Using Adaptive Morlet Wavelet Filter Based on Alpha-Stable Distribution and Kurtogram. IEEE Access 2019, 7, 72283–72296.
  24. Wodecki, J.; Stefaniak, P.; Obuchowski, J.; Wyłomańska, A.; Zimroz, R. Combination of principal component analysis and time–frequency representations of multichannel vibration data for gearbox fault detection. J. Vibroeng. 2016, 18, 2167–2175.
  25. Randall, R.B.; Antoni, J.; Chobsaard, S. The relationship between spectral correlation and envelope analysis in the diagnostics of bearing faults and other cyclostationary machine signals. Mech. Syst. Signal Process. 2001, 15, 945–962.
  26. Abboud, D.; Antoni, J.; Sieg-Zieba, S.; Eltabach, M. Envelope analysis of rotating machine vibrations in variable speed conditions: A comprehensive treatment. Mech. Syst. Signal Process. 2017, 84, 200–226.
  27. Lin, J.; Zuo, M. Gearbox fault diagnosis using adaptive wavelet filter. Mech. Syst. Signal Process. 2003, 17, 1259–1269.
  28. Peng, Z.; Chu, F. Application of the wavelet transform in machine condition monitoring and fault diagnostics: A review with bibliography. Mech. Syst. Signal Process. 2004, 18, 199–221.
  29. Qiu, H.; Lee, J.; Lin, J.; Yu, G. Wavelet filter-based weak signature detection method and its application on rolling element bearing prognostics. J. Sound Vib. 2006, 289, 1066–1090.
  30. Yan, R.; Gao, R.X.; Chen, X. Wavelets for fault diagnosis of rotary machines: A review with applications. Signal Process. 2014, 96, 1–15.
  31. Bakhtazad, A.; Palazoglu, A.; Romagnoli, J.A. Detection and classification of abnormal process situations using multidimensional wavelet domain hidden Markov trees. Comput. Chem. Eng. 2000, 24, 769–775.
  32. Kwan, C.; Zhang, X.; Xu, R.; Haynes, L. A novel approach to fault diagnostics and prognostics. In Proceedings of the 2003 IEEE International Conference on Robotics and Automation (Cat. No. 03CH37422), Taipei, Taiwan, 14–19 September 2003; Volume 1, pp. 604–609.
  33. Zhang, X.; Xu, R.; Kwan, C.; Liang, S.Y.; Xie, Q.; Haynes, L. An integrated approach to bearing fault diagnostics and prognostics. In Proceedings of the 2005 American Control Conference, Portland, OR, USA, 8–10 June 2005; pp. 2750–2755.
  34. Lei, Y.; Lin, J.; He, Z.; Zuo, M.J. A review on empirical mode decomposition in fault diagnosis of rotating machinery. Mech. Syst. Signal Process. 2013, 35, 108–126.
  35. Antoni, J. Fast computation of the kurtogram for the detection of transient faults. Mech. Syst. Signal Process. 2007, 21, 108–124.
  36. Antoni, J. The spectral kurtosis: A useful tool for characterising non-stationary signals. Mech. Syst. Signal Process. 2006, 20, 282–307.
  37. Barszcz, T.; Jabłoński, A. A novel method for the optimal band selection for vibration signal demodulation and comparison with the Kurtogram. Mech. Syst. Signal Process. 2011, 25, 431–451.
  38. Antoni, J. The infogram: Entropic evidence of the signature of repetitive transients. Mech. Syst. Signal Process. 2016, 74, 73–94.
  39. Miao, Y.; Zhao, M.; Lin, J. Improvement of kurtosis-guided-grams via Gini index for bearing fault feature identification. Meas. Sci. Technol. 2017, 28, 125001.
  40. Combet, F.; Gelman, L. Optimal filtering of gear signals for early damage detection based on the spectral kurtosis. Mech. Syst. Signal Process. 2009, 23, 652–668.
  41. Wodecki, J.; Michalak, A.; Zimroz, R. Optimal filter design with progressive genetic algorithm for local damage detection in rolling bearings. Mech. Syst. Signal Process. 2018, 102, 102–116. [Google Scholar] [CrossRef]
  42. Abboud, D.; Marnissi, Y.; Elbadaoui, M. Optimal filtering of angle-time cyclostationary signals: Application to vibrations recorded under nonstationary regimes. Mech. Syst. Signal Process. 2020, 145, 106919. [Google Scholar] [CrossRef]
  43. Antoni, J. Cyclostationarity by examples. Mech. Syst. Signal Process. 2009, 23, 987–1036. [Google Scholar] [CrossRef]
  44. Antoni, J. Cyclic spectral analysis of rolling-element bearing signals: Facts and fictions. J. Sound Vib. 2007, 304, 497–529. [Google Scholar] [CrossRef]
  45. Wang, D.; Zhao, X.; Kou, L.L.; Qin, Y.; Zhao, Y.; Tsui, K.L. A simple and fast guideline for generating enhanced/squared envelope spectra from spectral coherence for bearing fault diagnosis. Mech. Syst. Signal Process. 2019, 122, 754–768. [Google Scholar] [CrossRef]
  46. Chen, Z.; Mauricio, A.; Li, W.; Gryllias, K. A deep learning method for bearing fault diagnosis based on cyclic spectral coherence and convolutional neural networks. Mech. Syst. Signal Process. 2020, 140, 106683. [Google Scholar] [CrossRef]
  47. Gröchenig, K. Foundations of Time-Frequency Analysis; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2001. [Google Scholar]
  48. Daubechies, I. Ten Lectures on Wavelets; SIAM: Philadelphia, PA, USA, 1992. [Google Scholar]
  49. Mallat, S. A Wavelet Tour of Signal Processing; Elsevier: Amsterdam, The Netherlands, 1999. [Google Scholar]
  50. Grzesiek, A.; Michalak, A.; Wylomanska, A. How to describe the linear dependence for heavy-tailed distributed data. In Applied Condition Monitoring; A Book Series; Springer: Berlin/Heidelberg, Germany, 2021. [Google Scholar]
  51. Nowicki, J.; Hebda-Sobkowicz, J.; Zimroz, R.; Wylomanska, A. Local Defect Detection in Bearings in the Presence of Heavy-Tailed Noise and Spectral Overlapping of Informative and Non-Informative Impulses. Sensors 2020, 20, 6444. [Google Scholar] [CrossRef] [PubMed]
  52. Żak, G.; Obuchowski, J.; Wyłomańska, A.; Zimroz, R. Novel 2D representation of vibration for local damage detection. Min. Sci. 2014, 21, 105–113. [Google Scholar]
  53. Żak, G.; Wyłomańska, A.; Zimroz, R. Periodically impulsive behavior detection in noisy observation based on generalized fractional order dependency map. Appl. Acoust. 2019, 144, 31–39. [Google Scholar] [CrossRef]
  54. Nowicki, J.; Hebda-Sobkowicz, J.; Zimroz, R.; Wyłomańska, A. Dependency measures for the diagnosis of local faults in application to the heavy-tailed vibration signal. Appl. Acoust. 2021, 178, 107974. [Google Scholar] [CrossRef]
  55. Borghesani, P.; Antoni, J. CS2 analysis in presence of non-Gaussian background noise–Effect on traditional estimators and resilience of log-envelope indicators. Mech. Syst. Signal Process. 2017, 90, 378–398. [Google Scholar] [CrossRef]
  56. Wodecki, J.; Michalak, A.; Zimroz, R.; Barszcz, T.; Wyłomańska, A. Impulsive source separation using combination of Nonnegative Matrix Factorization of bi-frequency map, spatial denoising and Monte Carlo simulation. Mech. Syst. Signal Process. 2019, 127, 89–101. [Google Scholar] [CrossRef]
  57. Hebda-Sobkowicz, J.; Zimroz, R.; Wyłomańska, A. Selection of the Informative Frequency Band in a Bearing Fault Diagnosis in the Presence of Non-Gaussian Noise—Comparison of Recently Developed Methods. Appl. Sci. 2020, 10, 2657. [Google Scholar] [CrossRef] [Green Version]
  58. Żak, G.; Wyłomańska, A.; Zimroz, R. Application of alpha-stable distribution approach for local damage detection in rotating machines. J. Vibroeng. 2015, 17, 2987–3002. [Google Scholar]
  59. Wang, D. Some further thoughts about spectral kurtosis, spectral L2/L1 norm, spectral smoothness index and spectral Gini index for characterizing repetitive transients. Mech. Syst. Signal Process. 2018, 108, 360–368. [Google Scholar] [CrossRef]
  60. Hebda-Sobkowicz, J.; Zimroz, R.; Pitera, M.; Wylomanska, A. Informative frequency band selection in the presence of non-Gaussian noise—A novel approach based on the conditional variance statistic with application to bearing fault diagnosis. Mech. Syst. Signal Process. 2020, 145, 106971. [Google Scholar] [CrossRef]
  61. Kruczek, P.; Obuchowski, J. Modified Protrugram Method for Damage Detection in Bearing Operating Under Impulsive Load. In Cyclostationarity: Theory and Methods III; Springer International Publishing: Cham, Switzerland, 2017; pp. 229–240. [Google Scholar]
  62. Schmidt, S.; Zimroz, R.; Chaari, F.; Heyns, P.S.; Haddar, M. A simple condition monitoring method for gearboxes operating in impulsive environments. Sensors 2020, 20, 2115. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  63. Luan, S.; Qiu, T.; Zhu, Y.; Yu, L. Cyclic correntropy and its spectrum in frequency estimation in the presence of impulsive noise. Signal Process. 2016, 120, 503–508. [Google Scholar] [CrossRef]
  64. Zhao, X.; Qin, Y.; He, C.; Jia, L.; Kou, L. Rolling element bearing fault diagnosis under impulsive noise environment based on cyclic correntropy spectrum. Entropy 2019, 21, 50. [Google Scholar] [CrossRef] [Green Version]
  65. Wodecki, J.; Michalak, A.; Zimroz, R. Local damage detection based on vibration data analysis in the presence of Gaussian and heavy-tailed impulsive noise. Measurement 2021, 169, 108400. [Google Scholar] [CrossRef]
  66. Boashash, B. Time-Frequency Signal Analysis and Processing: A Comprehensive Reference; Academic Press: Cambridge, MA, USA, 2015. [Google Scholar]
  67. Dunn, O.J.; Clark, V.A. Basic Statistics: A Primer for the Biomedical Sciences; John Wiley & Sons: Hoboken, NJ, USA, 2009. [Google Scholar]
  68. Box, G.E.; Jenkins, G.M.; Reinsel, G.C.; Ljung, G.M. Time Series Analysis: Forecasting and Control; John Wiley & Sons: Hoboken, NJ, USA, 2015. [Google Scholar]
  69. Balakrishnan, N.; Lai, C.D. Continuous Bivariate Distributions; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  70. Che, Y.; Jia, Y.; Tang, Z.; LAN, F. Application of Pearson correlation coefficient in wind power combination prediction. Guangxi Electr. Power 2016, 3, 50–53. [Google Scholar]
  71. Samorodnitsky, G.; Taqqu, M. Stable Non-Gaussian Random Processes: Stochastic Models with Infinite Variance; Chapman and Hall: London, UK, 1994. [Google Scholar]
  72. Wyłomańska, A.; Chechkin, A.; Gajda, J.; Sokolov, I.M. Codifference as a practical tool to measure interdependence. Phys. A Stat. Mech. Appl. 2015, 421, 412–429. [Google Scholar] [CrossRef] [Green Version]
  73. Ma, X.; Nikias, C.L. Joint estimation of time delay and frequency delay in impulsive noise using fractional lower order statistics. IEEE Trans. Signal Process. 1996, 44, 2669–2687. [Google Scholar]
  74. Kendall, M.; Gibbons, J.D. Rank Correlation Methods, 5th ed.; Charles Griffin Book Series; Oxford University Press: Oxford, UK, 1990. [Google Scholar]
  75. Trivedi, P.K.; Zimmer, D.M. Copula Modeling: An Introduction for Practitioners; Now Publishers Inc.: Boston, MA, USA, 2007. [Google Scholar]
  76. Dürre, A.; Fried, R.; Liboschik, T. Robust estimation of (partial) autocorrelation. Wiley Interdiscip. Rev. Comput. Stat. 2015, 7, 205–222. [Google Scholar] [CrossRef]
  77. Croux, C.; Dehon, C. Influence functions of the Spearman and Kendall correlation measures. Stat. Methods Appl. 2010, 19, 497–515. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Vibration signal and its spectrogram for a Gaussian signal with a local fault (a,b) and a heavy-tailed signal with a local fault, with different fault carrier frequencies: 2.5 kHz (c,d) and 5 kHz (e,f); carrier frequency of the non-cyclic impulses: 6 kHz.
Figure 2. Pearson correlation map for a Gaussian signal with a local fault (a) and a heavy-tailed signal with a local fault, with different fault carrier frequencies: 2.5 kHz (b) and 5 kHz (c); carrier frequency of the non-cyclic impulses: 6 kHz.
Figure 3. (a) Simulated signal s1 with the amplitude of the non-cyclic impulses set to 8.5. (b) Spectrogram of simulated signal s1.
Figure 4. Auto-similarity maps for signal s1 for different measures of dependence, i.e., (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation, (f) Quadrant autocorrelation.
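For readers who wish to reproduce maps of this kind, the following minimal Python sketch (an illustration under our assumptions, not the implementation used in this paper) computes an auto-similarity map: a spectrogram is computed first and then, for each frequency bin, the dependence between the sub-signal and its lagged copy is evaluated over a range of lags with a chosen measure. The Pearson, Spearman, Kendall and Quadrant variants are included; the window parameters, the maximum lag and the function names are illustrative.

    import numpy as np
    from scipy import signal, stats

    def quadrant_corr(x, y):
        # Quadrant (sign) correlation: mean sign of the product of the
        # median-centred samples; it ignores outlier amplitudes entirely.
        return np.mean(np.sign((x - np.median(x)) * (y - np.median(y))))

    MEASURES = {
        "pearson": lambda x, y: np.corrcoef(x, y)[0, 1],
        "spearman": lambda x, y: stats.spearmanr(x, y)[0],
        "kendall": lambda x, y: stats.kendalltau(x, y)[0],
        "quadrant": quadrant_corr,
    }

    def auto_similarity_map(x, fs, measure="kendall", max_lag=50,
                            nperseg=256, noverlap=200):
        # Each row of S is the magnitude envelope of one frequency bin.
        f, _, S = signal.spectrogram(x, fs, nperseg=nperseg,
                                     noverlap=noverlap, mode="magnitude")
        dep = MEASURES[measure]
        amap = np.zeros((len(f), max_lag))
        for i, sub in enumerate(S):
            for lag in range(1, max_lag + 1):
                amap[i, lag - 1] = dep(sub[:-lag], sub[lag:])
        return f, amap

The rank-based measures (Spearman, Kendall) and the sign-based Quadrant correlation depend only on the ordering or the signs of the samples, which is why the corresponding maps remain readable when the noise is heavy-tailed.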
Figure 5. Pearson autocorrelation map with the four distinguishable frequency bands highlighted.
Figure 6. Comparison of medians for different frequency bands: 0–2 kHz (Gaussian white noise), 2–3 kHz (informative frequency band), 3–8.5 kHz (heavy-tailed noise) and 8.5–12 kHz (Gaussian white noise), in the case of different measures of dependence: (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation, (f) Quadrant autocorrelation.
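A band-wise summary like the one above can be obtained by taking the median of the map values inside each frequency band, per lag. A short sketch, assuming the f and amap returned by auto_similarity_map in the previous sketch; the band edges follow the caption and are inputs of this illustration, not constants of the method.

    bands = {
        "A, Gaussian noise": (0, 2000),
        "B, informative band": (2000, 3000),
        "C, heavy-tailed noise": (3000, 8500),
        "D, Gaussian noise": (8500, 12000),
    }

    def band_medians(f, amap, bands):
        # Median over all frequency bins falling in the band, one value per lag.
        return {name: np.median(amap[(f >= lo) & (f < hi)], axis=0)
                for name, (lo, hi) in bands.items()}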
Figure 7. Boxplots of medians associated with heavy-tailed noise (part C of the frequency bands) of the considered auto-dependence maps based on the following measures of dependence: Pearson, Spearman and Kendall autocorrelations, autocovariation and autocodifference, for the simulated signal s1.
Figure 8. The results of the IMPI calculated from the auto-similarity maps only for the IFB (2.5 kHz), in the case of different measures of dependence: (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation, (f) Quadrant autocorrelation; simulated signal.
Figure 9. Medians for the frequency band at 3–8.5 kHz.
Figure 10. Medians calculated for part C of the auto-dependence maps from the Monte Carlo simulation, based on the following measures of dependence: (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation and (f) Quadrant autocorrelation.
Figure 11. IQRs calculated for part C of the auto-dependence maps from the Monte Carlo simulation, based on the following measures of dependence: (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation and (f) Quadrant autocorrelation.
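The Monte Carlo summaries of Figures 10 and 11 can be reproduced along the following lines. This sketch reuses auto_similarity_map and the imports from the first sketch; simulate_signal is a hypothetical generator of the test signal s1, and part C (3–8.5 kHz) is the heavy-tailed band defined above.

    def mc_part_c_summary(simulate_signal, n_runs=100, measure="kendall"):
        # Recompute the map for many independent realizations and collect
        # the median and interquartile range of the part-C values.
        meds, iqrs = [], []
        for _ in range(n_runs):
            x, fs = simulate_signal()            # hypothetical generator
            f, amap = auto_similarity_map(x, fs, measure=measure)
            part_c = amap[(f >= 3000) & (f < 8500)]
            meds.append(np.median(part_c))
            q75, q25 = np.percentile(part_c, [75, 25])
            iqrs.append(q75 - q25)
        return np.median(meds), np.median(iqrs)

A small spread of the part-C medians and IQRs across runs indicates that a measure suppresses the non-cyclic impulses consistently, not only for a single realization.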
Figure 12. IMPI (impulsiveness indicator) calculated for the tested IFB at 2–3 kHz in the case of different measures of dependence: (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation, (f) Quadrant autocorrelation.
Figure 13. Hammer crusher with bearing and sensor location marked.
Figure 14. (a) Real signal r1, (b) spectrogram of real signal r1.
Figure 15. Auto-similarity maps for real vibration signal for different measures of dependence, i.e., (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation and (f) Quadrant autocorrelation.
Figure 16. Comparison of medians for different frequency bands: 0–3 kHz (Gaussian noise), 3–4 kHz (informative frequency band) and 4–12 kHz (heavy-tailed noise), in the case of different measures of dependence: (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation, (f) Quadrant autocorrelation.
Figure 17. Boxplots of medians associated with heavy-tailed noise (part C of the frequency bands) of the considered auto-similarity maps based on the following measures of dependence: Pearson, Spearman and Kendall autocorrelations, autocovariation and autocodifference, for the real signal r1.
Figure 18. The results of the IMPI calculated from the auto-similarity maps only for the IFB (3–4 kHz), in the case of different measures of dependence: (a) Pearson autocorrelation, (b) autocovariation, (c) autocodifference, (d) Spearman autocorrelation, (e) Kendall autocorrelation, (f) Quadrant autocorrelation; real vibration signal.
Figure 19. Medians for the frequency band at 4–12 kHz.