Institute of Fiber-Optic Communication and Information Engineering, College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China
* Authors to whom correspondence should be addressed.
A Brillouin Optical Time-Domain Analyzer assisted by the AdaBoost Algorithm for Brillouin frequency shift (BFS) extraction is proposed and experimentally demonstrated. The classification of Brillouin gain spectra with different BFS is realized by iteratively updating decision-tree weak classifiers, forming several base classifiers and combining them into a strong classifier. Based on a pseudo-Voigt training set with noise, the performance of the AdaBoost Algorithm is studied, including the influence of the signal-to-noise ratio (SNR), the frequency range, and the frequency step. Results show that the BFS extraction performance degrades with decreasing SNR, decreasing frequency range, and increasing frequency step.
The advantages of distributed optical fiber sensors include distributed sensing, high spatial resolution, large dynamic range, real-time monitoring, etc. The technique of distributed fiber sensing can be widely applied in various areas, such as industrial infrastructure health monitoring, long-haul vibration detection, and quick fault location [1,2,3]. The Brillouin Optical Time-Domain Analyzer (BOTDA), as one of many distributed optical fiber sensors, can be used for monitoring both temperature and strain in ultra-long sensing ranges [4]. In order to obtain temperature and strain information from BOTDA, Brillouin frequency shift (BFS) needs to be extracted from the measured Brillouin Gain Spectrum (BGS).
One method for BFS extraction is curve fitting [5,6,7,8]. The measured spectra can be fitted with a Lorentzian curve, a parabolic curve, a pseudo-Voigt curve, etc. [5,6]. However, the curve fitting method is highly dependent on the initial conditions. For a BGS with low signal-to-noise ratio (SNR), the initial parameters need to be adjusted carefully, and the fitting often fails [7]. In addition, the curve fitting method requires a small frequency step, because a large frequency step reduces the number of fitting points and degrades the fitting performance [8].
In order to avoid the problem of initial parameter adjustment in the curve fitting method, the Cross-Correlation Method (XCM) is proposed to extract BFS [8,9]. XCM does not have the problem of initial parameter setting, but it requires data interpolation processing to up-sample the measured BGS, which could be time consuming. As a result, there is a trade-off between accuracy and processing speed [9].
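As a rough sketch of the idea behind XCM (under the stated assumptions, not the exact implementation of [8,9]), the measured BGS is up-sampled by interpolation and cross-correlated with a reference line shape; the candidate frequency with the largest correlation is taken as the BFS. The exhaustive scan over the fine grid also illustrates the accuracy/processing-speed trade-off mentioned above. All names and parameter values are illustrative.

```python
import numpy as np

def xcm_bfs(v, bgs, ref_shape, upsample=100):
    """Estimate the BFS by cross-correlating the interpolated BGS with a reference
    line shape centered at each candidate frequency of the up-sampled grid."""
    step = (v[1] - v[0]) / upsample
    v_fine = np.arange(v[0], v[-1] + step / 2, step)      # up-sampled frequency grid
    bgs_fine = np.interp(v_fine, v, bgs) - np.mean(bgs)   # interpolated, mean-removed BGS
    scores = [np.dot(bgs_fine, ref_shape(v_fine - v0)) for v0 in v_fine]
    return v_fine[int(np.argmax(scores))]

# Example reference line shape: a Lorentzian with an assumed 50 MHz linewidth.
lorentz_ref = lambda dv: 1.0 / (1.0 + (dv / 25.0) ** 2)
```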
The Artificial Neural Network (ANN) method has also been introduced into the BOTDA system [10,11,12]. ANN has been proven to be an effective method for BFS extraction; however, the number of intermediate layers and the number of neurons per layer need to be designed carefully [10]. Moreover, the training process is complicated and time-consuming, and the training can converge to a local optimum, which limits its applications [11,12].
In recent years, Support Vector Machine (SVM) has also been proposed to extract temperature from the whole BGS [13,14,15]. However, because SVM is essentially a binary classifier, it is often necessary to construct hundreds of support vectors to extract temperature information. The error of support vectors could also affect the accuracy of extraction.
Here, we propose a novel method for BFS extraction based on the AdaBoost Algorithm. The basic idea of AdaBoost is to train different weak classifiers on the same training set and combine them into a strong classifier. The advantages of the AdaBoost Algorithm include freedom from extraction failure and simplicity of parameter adjustment [16,17]. Generally, a decision tree or a neural network can be chosen as the weak classifier [18,19]. In this study, BFS extraction is treated as a supervised classification problem, and the AdaBoost Algorithm is applied to extract the BFS from the BGS.
The article is organized as follows: In Section 2, the principle of the AdaBoost Algorithm is introduced, and the AdaBoost Algorithm for extracting the BFS is illustrated. In Section 3, the performance of BFS extraction by AdaBoost is numerically studied under different SNR, frequency range, and frequency step. In Section 4, BFS extraction by the AdaBoost Algorithm is studied experimentally, and the experimental results are compared with the simulation. Finally, the study is briefly concluded in Section 5.
2. Theoretical Model
2.1. Principle of AdaBoost
AdaBoost is a supervised learning algorithm for solving binary classification problems. The principle of AdaBoost is detailed below.
The decision tree is chosen as the weak classifier, and the Classification And Regression Tree (CART) algorithm is used to train it. Let $X_{n\times m}$ denote the feature matrix of the training set with $n$ samples and $m$ features [20]:
$$X_{n\times m} = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1m} \\ x_{21} & x_{22} & \cdots & x_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{nm} \end{bmatrix}.$$
$Y_{n\times 1} = [y_1, y_2, \ldots, y_n]^T$ represents the labels of the training samples, where $y_i = 0$ or $1$. After the training process, a tree structure is constructed, which can be denoted as a model $g(\cdot)$; the decision tree can then be expressed as [21]
$$Y' = g(X_{n\times m}),$$
where $Y'$ represents the output of the decision tree. The main process of AdaBoost is shown in Figure 1.
In the AdaBoost Algorithm, a weight factor $w_i$ is introduced for each sample in the training set, where $i = 1, 2, \ldots, n$ is the index of the sample, and $W_{n\times 1} = [w_1, w_2, \ldots, w_n]^T$ is the sample weight vector [20]. During each iteration, the weak classifier is trained on the weighted training samples, and the weight of each sample is then updated according to the error rate of the classifier and the classification result of that particular sample. The initial sample weight $W_1$ is set to [22]
$$w_{1i} = \frac{1}{n}, \quad i = 1, 2, \ldots, n.$$
The weighted training set $X_1$ used for the first weak classifier is the original training set $X_{n\times m}$ weighted by $W_1$, and the output of the first weak classifier is $g_1(x)$. The error rate $e_1$ can be obtained by
$$e_1 = \sum_{i=1}^{n} w_{1i}\,\mathbb{1}\left[g_1(x_i) \neq y_i\right].$$
According to the calculated error rate, the weight of the weak classifier $\alpha_1$ is obtained by
$$\alpha_1 = \frac{1}{2}\ln\frac{1-e_1}{e_1}.$$
After that, the sample weights are updated by
$$w_{2i} = \frac{w_{1i}}{Z_1}\times\begin{cases}\exp(-\alpha_1), & g_1(x_i) = y_i,\\ \exp(+\alpha_1), & g_1(x_i) \neq y_i,\end{cases}$$
where $Z_1$ is a normalization factor that keeps the updated weights summing to one.
The classification error of the weak classifiers generated by AdaBoost gradually decreases over the T iterations. AdaBoost then combines all the weak classifiers into a strong classifier, in which weak classifiers with a lower classification error receive a larger weight. It is worth noting that the overall training error drops exponentially toward zero as long as the classification error of each weak classifier stays a little below 50%; therefore, each weak classifier only needs to be slightly better than random guessing.
The whole algorithm of AdaBoost is shown in detail in Algorithm 1, as follows [22].
Algorithm 1: AdaBoost for supervised classification.
Input: the labeled training sample set F and the number of iterations T; initialize the sample weights $w_{1i} = 1/n$ and set k ← 1.
repeat
Train the weak classifier $g_k(x)$ on the weighted training samples.
Calculate the error rate $e_k = \sum_i w_{ki}\,\mathbb{1}[g_k(x_i) \neq y_i]$.
Compute the coefficient $\alpha_k = \frac{1}{2}\ln\frac{1-e_k}{e_k}$.
Update the sample weights: multiply $w_{ki}$ by $\exp(+\alpha_k)$ for misclassified samples and by $\exp(-\alpha_k)$ for correctly classified samples, then normalize.
Set k ← k + 1.
until k > T
Output: the strong classifier formed by the $\alpha_k$-weighted vote of the weak classifiers $g_1(x), \ldots, g_T(x)$.
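To make Algorithm 1 concrete, the following is a minimal Python sketch, assuming scikit-learn's DecisionTreeClassifier as the CART weak learner; the function and variable names are illustrative and do not come from the authors' implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, T=50, max_depth=1):
    """Train T CART weak classifiers on weighted samples (labels y in {0, 1})."""
    n = len(y)
    w = np.full(n, 1.0 / n)                        # initial weights w_1i = 1/n
    learners, alphas = [], []
    for _ in range(T):
        g = DecisionTreeClassifier(max_depth=max_depth)
        g.fit(X, y, sample_weight=w)               # weak classifier on weighted samples
        miss = g.predict(X) != y
        e = np.clip(np.dot(w, miss), 1e-10, 1 - 1e-10)  # weighted error rate e_k
        alpha = 0.5 * np.log((1.0 - e) / e)        # classifier coefficient alpha_k
        w = w * np.exp(np.where(miss, alpha, -alpha))
        w /= w.sum()                               # normalization (Z_k)
        learners.append(g)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(learners, alphas, X):
    """Strong classifier: alpha-weighted vote over the weak classifiers."""
    votes = np.zeros((len(X), 2))
    for g, a in zip(learners, alphas):
        votes[np.arange(len(X)), g.predict(X).astype(int)] += a
    return votes.argmax(axis=1)
```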
2.2. Principle of BFS Extraction by AdaBoost
In order to solve the multi-class classification problem of BFS extraction, two different labels are first chosen from the training set, and the AdaBoost Algorithm is applied to this binary classification problem. This routine is then repeated until all label pairs in the training set have been covered, and the final label for a given input sample is decided by voting.
In this study, the pseudo-Voigt curve is selected so that the method is applicable to different pump pulse widths; each curve is defined by the Lorentz ratio $\alpha$, the curve gain parameter $p$, and the curve line width $c$. Assume that the value range of the center frequency $v_0$ is $v_0 = [v_{01}, v_{02}, \ldots, v_{0n}]$ and the value range of $c$ is $c = [c_1, c_2, \ldots, c_m]$. The scanning frequency range of the curve is $v = [v_1, v_2, \ldots, v_k]$. The Brillouin gain set $F_{mn\times k}$ then collects the sampled gain curves $S(v_{0i}, c_j, v_l)$, with one row for every combination of center frequency and line width.
Here, the Brillouin gain subsets $F_i$ and $F_j$, corresponding to the center-frequency labels $v_{0i}$ and $v_{0j}$, are chosen from $F$ to form the binary label sample set $F_{i,j} = F_i \cup F_j$.
The weight set $W$ is initialized and the weighted sample set is constructed for the binary classification training of the AdaBoost Algorithm [23].
For each binary sample set $F_{i,j}$, the corresponding binary classifier $f_{i,j}(x)$ is obtained by AdaBoost training, as shown in Figure 2; since every pair of center-frequency labels is used, $n(n-1)/2$ such binary classifiers are constructed.
For a given BGS $S_p = S(f_p, c_p, f_{vk})$, the whole classifier set is applied, the center frequencies output by the binary classifiers are counted, and the center frequency that receives the most votes, $f_{max}$, is taken as the extracted BFS corresponding to the given BGS.
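As a rough illustration of this one-versus-one voting scheme (not the authors' exact implementation), the sketch below trains one binary AdaBoost classifier per pair of center-frequency labels, reusing the adaboost_train and adaboost_predict helpers from the previous sketch, and extracts the BFS by majority vote; all names are illustrative.

```python
from itertools import combinations
import numpy as np

def train_pairwise(curves_by_bfs, T=50):
    """curves_by_bfs maps each candidate BFS label (MHz) to an array of training
    BGS curves (one curve per row). One binary classifier is trained per label pair."""
    classifiers = {}
    for bfs_i, bfs_j in combinations(sorted(curves_by_bfs), 2):
        Xi, Xj = curves_by_bfs[bfs_i], curves_by_bfs[bfs_j]
        X = np.vstack([Xi, Xj])
        y = np.r_[np.zeros(len(Xi), int), np.ones(len(Xj), int)]
        classifiers[(bfs_i, bfs_j)] = adaboost_train(X, y, T=T)
    return classifiers

def extract_bfs(classifiers, bgs):
    """Each pairwise classifier votes for one of its two labels; the label with
    the most votes is taken as the extracted BFS (f_max)."""
    votes = {}
    for (bfs_i, bfs_j), (learners, alphas) in classifiers.items():
        winner = (bfs_i, bfs_j)[adaboost_predict(learners, alphas, bgs[None, :])[0]]
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)
```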
3. Simulation
The performance of AdaBoost extraction is studied by simulation. The AdaBoost Algorithm is trained separately with ideal training samples and with training samples containing noise.
Here, the ideal training sample is a pseudo-Voigt curve [14], i.e., a weighted combination of a Lorentzian and a Gaussian component sharing the same line width $c$ and center frequency $v_0$, where the Lorentz ratio $\alpha$ weights the Lorentzian component and the curve gain parameter $p$ scales the peak amplitude.
When the ideal samples are used for training, the Lorentz ratio α varies from 0 to 1 with an interval of 0.2. The width of the curves ranges from 10 MHz to 50 MHz with an interval of 5 MHz. The curve gain parameter p is fixed at 1. The center frequency v0 ranges from 10,800 MHz to 10,980 MHz, which is the same as the frequency scanning range, with an interval of 1 MHz. Therefore, 6 × 9 × 181 samples can be obtained for model training.
For the training samples with noise, Gaussian white noise is introduced into the pseudo-Voigt curve. The width of the curves, the curve gain parameter, the center frequency, and the frequency scanning range are the same as those of the ideal training samples. Gaussian white noise with a mean value of 0 and different standard deviations (0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40) is added to the training samples separately. The process is repeated 5 times for each standard deviation, so 6 × 9 × 61 × 5 × 8 samples are finally obtained for model training.
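The sketch below shows one possible way to generate such a noisy pseudo-Voigt training set in Python; the pseudo-Voigt parameterization and the parameter grids follow the description above, but the exact formula and the center-frequency grid used by the authors are assumptions.

```python
import numpy as np

def pseudo_voigt(v, v0, c, p=1.0, alpha=1.0):
    """Pseudo-Voigt gain curve: a weighted sum of a Lorentzian and a Gaussian with
    the same FWHM c (MHz), centered at v0 (MHz); alpha is the Lorentz ratio."""
    lorentz = 1.0 / (1.0 + ((v - v0) / (c / 2.0)) ** 2)
    sigma = c / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    gauss = np.exp(-((v - v0) ** 2) / (2.0 * sigma ** 2))
    return p * (alpha * lorentz + (1.0 - alpha) * gauss)

v = np.arange(10800.0, 10981.0, 1.0)              # scanning frequencies, 1 MHz step
lorentz_ratios = np.arange(0.0, 1.01, 0.2)        # Lorentz ratio 0 ... 1, step 0.2
widths = np.arange(10.0, 51.0, 5.0)               # line width 10 ... 50 MHz, step 5 MHz
centers = np.arange(10800.0, 10981.0, 1.0)        # candidate BFS labels (assumed grid)
noise_stds = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40]

X_train, y_train = [], []
for a in lorentz_ratios:
    for c in widths:
        for v0 in centers:
            clean = pseudo_voigt(v, v0, c, p=1.0, alpha=a)
            for std in noise_stds:
                for _ in range(5):                # 5 noise realizations per std
                    X_train.append(clean + np.random.normal(0.0, std, v.size))
                    y_train.append(v0)            # label = center frequency (BFS)
```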
After model training, the model is applied to a test. During the test, a pseudo-Voigt curve with noise is used to simulate the BGS, where the line width is fixed at 50 MHz, corresponding to a pump pulse width of around 20 ns. The center frequency is fixed at 10,860 MHz, and the Lorentz ratio is fixed at 1. Here, the SNR is defined as the ratio between the mean amplitude of the Brillouin peak or trace and its standard deviation; it is therefore an amplitude ratio rather than a power ratio [14].
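One possible reading of this SNR definition, given only as a hedged sketch and not as the authors' exact procedure, is to divide the Brillouin peak amplitude by the noise standard deviation estimated from an off-resonance region of the trace.

```python
import numpy as np

def amplitude_snr(trace, noise_slice):
    """Estimate the amplitude SNR of a noisy BGS trace: peak amplitude divided by
    the noise standard deviation taken from an off-resonance region (noise_slice).
    This interpretation of the definition above is an assumption."""
    return trace.max() / trace[noise_slice].std()
```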
3.1. BGS Extraction under Different SNR
The Root Mean Square Error (RMSE) and uncertainty under different SNR, based on the training samples with and without noise, are shown in Figure 3. With the increase in SNR, both the RMSE and the uncertainty of the AdaBoost Algorithm decrease, and the two quantities take similar values at each SNR. The model trained with noisy samples clearly outperforms the one trained without noise. This may be because the noise-free pseudo-Voigt curve provides only one data sample for each feature combination, whereas the noisy pseudo-Voigt curves provide 40 data samples per feature combination for classification training.
3.2. BGS Extraction under Different Cut-Off Frequency
Based on the frequency range of 10,800–10,980 MHz, the BGS extraction under different cut-off frequencies is tested. Here, the initial frequency is fixed at 10,800 MHz, the cut-off frequency is varied, and the SNR is fixed at 7 dB. The result is shown in Figure 4.
With the increase in cut-off frequency, both the RMSE and the uncertainty gradually decrease, and the extraction error tends to converge when the cut-off frequency is larger than 10,880 MHz, as shown in Figure 4. With the decrease in cut-off frequency, the number of BGS data points decreases, which leads to an increase in the extraction error.
3.3. BGS Extraction under Different Frequency Range
The cut-off frequency is also fixed and the extraction performance under different initial frequencies is investigated. Results show that when the cut-off frequency is fixed at 10,980 MHz and the initial frequency is changed, the extraction error tends to be stable when the initial frequency is less than 10,835 MHz.
Therefore, further analysis is conducted for the frequency range between 10,825 MHz and 10,895 MHz. The results are shown in Figure 5. As the frequency range decreases, the BFS extraction accuracy decreases; since the number of BGS features grows with the frequency range, the extraction accuracy becomes stable when the frequency range is larger than 50 MHz. Overall, AdaBoost still retains acceptable extraction accuracy even when the frequency range is small.
3.4. BGS Extraction under Different Frequency Step
The frequency step determines the number of features in each feature vector, i.e., the number of data points on each BGS. Here, the BGS extraction at different frequency steps is studied. The results are shown in Figure 6. As the frequency step increases, the extraction performance of AdaBoost gradually deteriorates: with a larger frequency step, the feature set becomes sparser, which introduces more error into the BFS extraction.
4. BOTDA Setup and Experiment Results
The BFS extraction by the AdaBoost Algorithm is studied experimentally. The configuration of the experiment is shown in Figure 7.
In the experiment, a narrow-linewidth laser with a wavelength of 1550 nm is used as the light source. The signal from the laser is divided into two branches by a 50:50 coupler after passing through an optical isolator. In the upper branch, the light is modulated by an electro-optic modulator to generate the pump pulse. The power of the pump light is amplified by an Erbium-doped fiber amplifier, and a polarization scrambler is introduced to suppress polarization fluctuations. After passing through the scrambler, the pump pulse goes through the optical circulator and enters the Fiber Under Test (FUT). In the lower branch, the microwave source generates a microwave signal with a frequency of 10.500–11.000 GHz, which is modulated onto the light by an electro-optic modulator. The back-scattered signal is detected by the photodetector and converted into an electrical signal, and the data are acquired and quantized by a 250 MSa/s acquisition card. The FUT is a 1 km single-mode optical fiber, and its last 80 m section, corresponding to 200 sampling points, is heated in a thermostatic water tank. The measured BGS distribution is shown in Figure 8. The width of the pump pulse is 25 ns, the number of averages is 10,000, and the frequency scanning step is 1 MHz.
The tolerance of AdaBoost to different SNR is also studied, as shown in Figure 9, where the width of the pump pulse is 20 ns and the frequency step is 1 MHz. The number of averages is set to 2, 4, 8, 16, and 50, corresponding to measured SNRs of 5.52 dB, 6.78 dB, 7.96 dB, 8.86 dB, and 10.96 dB, respectively. The trend of the experiment is consistent with that of the simulation: with the increase in SNR, the BFS extraction error decreases gradually.
Moreover, the performance of the algorithm is also investigated under different cut-off frequencies. The initial frequency is fixed at 10,800 MHz, the pump pulse width is 20 ns, the frequency step is 1 MHz, and the number of averages is 4, corresponding to an SNR of 6.78 dB. The result is shown in Figure 10. The experimental result is consistent with the simulation result: the performance of the AdaBoost Algorithm becomes stable with the increase in cut-off frequency.
Similarly, in order to verify the relationship between AdaBoost performance and frequency range, tests are conducted for the frequency range of 10,825–10,895 MHz. The result is shown in Figure 11. As the frequency range decreases, the BFS extraction performance of AdaBoost gradually decreases, which is consistent with the numerical simulation.
Here, we compare the extraction by AdaBoost with that by SVM. The result is shown in Figure 12. Figure 12 shows that the RMSE of the extracted BFS decreases rapidly with the increase in frequency range, due to the increase in the number of BGS sampling points. The RMSE of the BFS extracted by AdaBoost varies from 5.8 MHz down to 2.6 MHz, while for SVM the RMSE varies from 6.3 MHz down to 1.5 MHz. When the frequency range is less than 35 MHz, the accuracy (RMSE) of AdaBoost is better than that of SVM. The reason may be that we use a linear kernel function to train the SVM, which introduces error for small sampling sets. It is interesting to note that when the frequency range is larger than 35 MHz, the SVM shows better accuracy (RMSE) than AdaBoost, and the deviation between the two methods is about 1 MHz. This suggests that the SVM with the linear kernel is a better choice for sampling sets that cover a large range of the BGS.
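A rough sketch of how such a comparison could be set up is shown below, assuming scikit-learn's AdaBoostClassifier (with decision-tree weak learners) and a linear-kernel SVC; the RMSE values quoted above come from the authors' experiment, not from this sketch, and the data arrays are illustrative.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def compare_extractors(X_train, y_train, X_test, y_test):
    """Train AdaBoost and a linear-kernel SVM on the same BGS training set and
    report the BFS RMSE (labels are candidate BFS values in MHz)."""
    models = {
        "AdaBoost": AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                                       n_estimators=50),
        "SVM": SVC(kernel="linear"),      # multiclass handled one-vs-one internally
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        err = np.asarray(model.predict(X_test), float) - np.asarray(y_test, float)
        print(f"{name}: RMSE = {np.sqrt(np.mean(err ** 2)):.2f} MHz")
```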
Finally, the performance of the algorithm is also studied experimentally under different frequency steps, as shown in Figure 13. With the increase in frequency step, the RMSE and uncertainty of the AdaBoost Algorithm gradually increase, which verifies the simulation conclusion that the BFS extraction error of AdaBoost grows as the feature set becomes sparser and the number of features decreases.
The simulation and experimental results are compared. Although the trend of the experiment agrees with that of the simulation, there is a deviation of about 3 MHz between the two. This may be because the noise sources differ: in the simulation, Gaussian white noise is used, while in the experiment the actual noise may include the amplified spontaneous emission (ASE) noise from the Erbium-doped fiber amplifier, the relative intensity noise (RIN) and phase noise from the laser source, and the thermal noise from the photodetector.
Distributed fiber sensors can be used to measure the temperature or strain distribution along the fiber. If the fiber is bonded tightly to a large-scale structure such as a bridge, the strain distribution of the whole structure can be obtained with distributed fiber sensors.
In principle, when the pump and probe enter the fiber from its two ends, Brillouin scattering occurs, and there is a BGS at each position along the fiber. Using an extraction method, the BFS can be obtained from the peak of the BGS.
Two different extraction methods are also compared. For a BGS with intense noise (shown in Figure 14), curve fitting does not always lead to a good result, because of possibly poor initial value settings, and the fitting may even fail, as shown in Figure 14. With the AdaBoost method, however, the BFS can still be found from a noisy BGS: in the example of Figure 14, the BFS extracted by AdaBoost is about 10,860 MHz, and the extraction error is near 1 MHz. We believe this illustrates the advantage of the proposed AdaBoost extraction method.
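The sensitivity of curve fitting to its initial values can be illustrated with the short sketch below, which fits a Lorentzian to a noisy BGS using scipy.optimize.curve_fit; with a poor initial guess for the center frequency the fit may converge far from the true BFS or fail outright, whereas the classifier-based extraction simply assigns the curve to the nearest trained BFS label. The Lorentzian model and parameter values are illustrative, not the authors' exact fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(v, g0, v0, dv):
    """Lorentzian gain curve with peak g0, center v0, and FWHM dv (illustrative)."""
    return g0 / (1.0 + ((v - v0) / (dv / 2.0)) ** 2)

v = np.arange(10800.0, 10981.0, 1.0)
noisy_bgs = lorentzian(v, 1.0, 10860.0, 50.0) + np.random.normal(0.0, 0.4, v.size)

for v0_guess in (10860.0, 10805.0):               # good vs. poor initial center frequency
    try:
        popt, _ = curve_fit(lorentzian, v, noisy_bgs,
                            p0=[1.0, v0_guess, 30.0], maxfev=2000)
        print(f"initial guess {v0_guess} MHz -> fitted BFS {popt[1]:.1f} MHz")
    except RuntimeError as err:                   # raised when the fit fails to converge
        print(f"initial guess {v0_guess} MHz -> fitting failed: {err}")
```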
5. Conclusions
In this study, we propose a novel method of BFS extraction using the AdaBoost Algorithm. The extracted BFS values are divided into different categories, and the weak classifiers constructed by decision trees are combined to form a strong classifier. The RMSE and uncertainty of the extracted BFS can reach 1 MHz with a frequency scanning step of 1 MHz and an SNR of 11 dB. The BFS extraction performance under different SNR, cut-off frequency, frequency range, and frequency step is also studied by both simulation and experiment. Results show that the BFS extraction error increases with the decrease in SNR and frequency range, and with the increase in frequency step, and the trend of the experiment agrees with that of the simulation. In addition, we compare the extraction performance of AdaBoost and SVM. Results show that when the frequency range is less than 35 MHz, the accuracy (RMSE) of AdaBoost is better than that of SVM; this may be because we use the linear kernel function to train the SVM, which introduces error for small sampling sets. The comparison between AdaBoost and curve fitting is also presented, and the result shows that the proposed BFS extraction method is more stable than curve fitting, especially when the BGS has intense noise. It is believed that AdaBoost is a good candidate for BFS extraction from weak BGS. In the future, we would like to further investigate the performance of AdaBoost and other methods, such as ANN.
Author Contributions
Concept and structure of this study, F.X.; resources, H.Z. and Y.Q.; writing—original draft preparation, F.X.; writing—review and editing, F.X., H.Z., and S.S. All authors have read and agreed to the published version of the manuscript.
Funding
This work was supported by the National Natural Science Foundation of China (NSFC) (61675184, 61275124).
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
The data presented in this study are available on request from the corresponding author.
Conflicts of Interest
The authors declare no conflict of interest.
References
Hicke, K.; Hussels, M.T.; Eisermann, R. Condition monitoring of industrial infrastructures using distributed fibre optic acoustic sensors. In Proceedings of the 2017 25th Optical Fiber Sensors Conference (OFS), Jeju, Korea, 24–28 April 2017; pp. 1–4.
Mariñelarena, J.; Urricelqui, J.; Loayssa, A. Extension of the dynamic range in slope-assisted coherent BOTDA sensors. In Proceedings of the 25th Optical Fiber Sensors Conference (OFS), Jeju, Korea, 24–28 April 2017; pp. 1–4.
Cui, Q.; Pamukcu, S.; Xiao, W. Truly Distributed Fiber Vibration Sensor Using Pulse Base BOTDA With Wide Dynamic Range. IEEE Photonics Technol. Lett. 2011, 23, 1887–1889.
Bao, X.; Chen, L. Recent Progress in Brillouin Scattering Based Fiber Sensors. Sensors 2011, 11, 4152–4187.
Soto, M.A.; Thévenaz, L. Modeling and evaluating the performance of Brillouin distributed optical fiber sensors. Opt. Express 2013, 21, 31347–31366.
Wu, H.; Chang, Y.; Zhao, C. Distributed Brillouin frequency shift extraction via a convolutional neural network. Photonics Res. 2020, 8, 690–697.
Yaswanth, K.; Somepalli, B.; Khankhoje, U.K. Accurate Estimation of Brillouin Frequency Shift in Brillouin Optical Correlation Domain Analysis. J. Lightwave Technol. 2019, 37, 5875–5884.
Farahani, M.A.; Castillo-Guerra, E.; Colpitts, B.G. A Detailed Evaluation of the Correlation-Based Method Used for Estimation of the Brillouin Frequency Shift in BOTDA Sensors. IEEE Sens. J. 2013, 13, 4589–4598.
Farahani, M.A.; Castillo-Guerra, E.; Colpitts, B.G. Accurate estimation of Brillouin frequency shift in Brillouin optical time domain analysis sensors using cross correlation. Opt. Lett. 2011, 36, 4275–4277.
Azad, A.K.; Wang, L.; Guo, N. Signal processing using artificial neural network for BOTDA sensor system. Opt. Express 2016, 24, 6769–6783.
Ruiz-Lombera, R.; Fuentes, A.; Rodriguez-Cobo, L. Simultaneous Temperature and Strain Discrimination in a Conventional BOTDA via Artificial Neural Networks. J. Lightwave Technol. 2018, 36, 2114–2121.
Azad, A.K.; Wang, L.; Guo, N. Temperature sensing in BOTDA system by using artificial neural network. Electron. Lett. 2015, 51, 1578–1580.
Wu, H.; Wang, L.; Guo, N. Brillouin Optical Time-Domain Analyzer Assisted by Support Vector Machine for Ultrafast Temperature Extraction. J. Lightwave Technol. 2017, 35, 4159–4167.
Wu, H.; Wang, L.; Zhao, Z. Support Vector Machine based Differential Pulse-width Pair Brillouin Optical Time Domain Analyzer. IEEE Photonics J. 2018, 10, 1–11.
Zhu, H.; Yu, L.; Zhang, Y. Optimized Support Vector Machine Assisted BOTDA for Temperature Extraction With Accuracy Enhancement. IEEE Photonics J. 2019, 12, 1–15.
Schapire, R.E.; Freund, Y.; Bartlett, P. Boosting the margin: A new explanation for the effectiveness of voting methods. Ann. Stat. 1998, 26, 1651–1686.
Dietterich, T.G. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Mach. Learn. 2000, 40, 139–157.
Cherubini, D.; Fanni, A.; Montisci, A. Inversion of MLP neural networks for direct solution of inverse problems. IEEE Trans. Magn. 2005, 41, 1784–1787.
Harrington, P. Machine Learning in Action; Manning Publications: New York, NY, USA, 2012; pp. 154–196.
Zheng, H. Extraction of Brillouin frequency shift in Brillouin distributed fiber sensors by neighbors-based machine learning. In Proceedings of the Advanced Sensor Systems and Applications X, Online, 10 October 2020.
Tan, P.N.; Steinbach, M.; Kumar, V. Introduction to Data Mining; Pearson: London, UK, 2011; pp. 213–236.
Yang, J.; Ma, H.; Dou, J. Harmonic Characteristics Data-Driven THD Prediction Method for LEDs Using MEA-GRNN and Improved-AdaBoost Algorithm. IEEE Access 2021, 9, 31297–31308.
Figure 1.
The principle of AdaBoost Algorithm, where X, Wi, Xi, ei, αi, gi(x), and f(x) represent the labeled training samples, the weight set, the weighted sample set, the classification error, the weak classifier coefficient, the weak classifier, and the final strong classifier, respectively.
Figure 2.
BFS extraction process based on AdaBoost, where F, Fi,j, fi,j(x), Ri, and fmax are the classifier set, the binary label sample set, the binary classifier, the classification result, and the BFS of output, separately.
Figure 3.
BFS extraction based on training samples with or without noise under different SNR.
Figure 4.
BFS extraction under different cut-off frequency.
Figure 5.
BFS extraction under different frequency range.
Figure 6.
BFS extraction under different frequency step.
Zheng, H.; Xiao, F.; Sun, S.; Qin, Y. Brillouin Frequency Shift Extraction Based on AdaBoost Algorithm. Sensors 2022, 22, 3354. https://doi.org/10.3390/s22093354