Improved Drill State Recognition during Milling Process Using Artificial Intelligence
Abstract
1. Introduction
2. Materials and Methods
2.1. Materials
- if VBmax is in the range 0–0.15 mm, the tool is in the Green state—four different levels of wear within this state
- if VBmax is in the range 0.151–0.299 mm, the tool is in the Yellow state—two different levels of wear within this state
- if VBmax exceeds 0.299 mm, the tool is in the Red state—two different levels of wear within this state (a labeling sketch follows this list)
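As a minimal illustration of the rule above, the following Python helper maps a measured VBmax value to its state label. The function name and the handling of the gap between 0.15 and 0.151 mm are assumptions of this sketch, not taken from the paper.

```python
def wear_class(vb_max_mm: float) -> str:
    """Map maximum flank wear VBmax (in mm) to the three-state label."""
    if vb_max_mm <= 0.15:        # Green: tool still fit for use
        return "Green"
    if vb_max_mm <= 0.299:       # Yellow: warning state
        return "Yellow"
    return "Red"                 # Red: tool should be replaced
```

For example, `wear_class(0.2)` returns `"Yellow"`.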
- force values in the (1) X-axis and (2) Y-axis (Kistler 9601A sensors; Impexron GmbH, Pfullingen, Germany)
- (3) acoustic emission (Kistler 8152B sensor; Kistler Group, Winterthur, Switzerland)
- (4) noise level (Brüel & Kjær 4189 sensor; Brüel and Kjær, Nærum, Denmark)
- (5) vibration level (Kistler 5127B sensor; Kistler Group, Winterthur, Switzerland)
- (6) device-rated current (Finest HR 30 sensor; Micom Elektronika, Zagreb, Croatia)
- (7) device-rated voltage (Testec TT-Si9001 sensor; Testec, Dreieich, Germany)
- (8) head-rated current (Finest HR 30 sensor; Micom Elektronika, Zagreb, Croatia)
- (9) head-rated voltage (Testec TT-Si9001 sensor; Testec, Dreieich, Germany)
- (10) servo-rated current (Finest HR 30 sensor; Micom Elektronika, Zagreb, Croatia)
- (11) servo-rated voltage (Testec TT-Si9001 sensor; Testec, Dreieich, Germany)
2.2. Methods
2.2.1. K-Nearest Neighbors
- N—number of samples
- p_i—probability of sample i being correctly classified
- C_i—set of points in the same class as sample i
- p_ij—softmax over Euclidean distances in the embedded space
- M—Mahalanobis distance metric
- K = 5
- metric = ‘minkowski’
- leaf_size = 30
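A minimal scikit-learn sketch with the hyperparameters listed above; the synthetic dataset is only a stand-in for the paper's sensor features, there to make the snippet runnable.

```python
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data standing in for the registered sensor features (3 wear classes).
X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5, metric="minkowski", leaf_size=30)
knn.fit(X, y)
print(knn.predict(X[:3]))   # predicted wear-state labels for the first three samples
```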
2.2.2. GaussianNB
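A hedged sketch of the Gaussian Naive Bayes classifier with the var_smoothing value listed in the summary table near the end of this section (var_smoothing = 1); the data is the same kind of synthetic placeholder as in the other sketches.

```python
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
gnb = GaussianNB(var_smoothing=1.0)   # var_smoothing = 1, per the summary table
gnb.fit(X, y)
print(gnb.score(X, y))                # training accuracy on the placeholder data
```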
2.2.3. MultinomialNB
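MultinomialNB accepts only non-negative inputs, so continuous sensor features would first need rescaling; the MinMaxScaler step below is an assumption of this sketch, while alpha = 1.0 comes from the summary table.

```python
from sklearn.datasets import make_classification
from sklearn.naive_bayes import MultinomialNB
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
X = MinMaxScaler().fit_transform(X)   # MultinomialNB requires non-negative features
mnb = MultinomialNB(alpha=1.0)        # alpha = 1.0, per the summary table
mnb.fit(X, y)
print(mnb.score(X, y))
```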
2.2.4. Stochastic Gradient Descent
- L—loss function
- R—regularization term that penalizes model complexity
- α—non-negative hyperparameter that controls the regularization strength
- η—learning rate
- b—intercept
- L—hinge loss function (the concrete loss used here)
- max_iter = 1000
- validation_fraction = 0.1 (10% of the training set)
- α = 0.0001
- penalty = L2
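A sketch with the settings above. In scikit-learn, validation_fraction is only honored when early_stopping=True, so that flag is enabled here as an assumption; the data is again a synthetic placeholder.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
sgd = SGDClassifier(loss="hinge", penalty="l2", alpha=0.0001,
                    max_iter=1000, validation_fraction=0.1,
                    early_stopping=True, random_state=0)
sgd.fit(X, y)
print(sgd.score(X, y))
```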
2.2.5. Decision Tree
- x_i ∈ R^n—training vectors
- y ∈ R^l—label vector
- m—node index
- Q_m—data at node m
- n_m—number of samples at node m
- k—number of classes
- min_samples_leaf = 1
- split criterion = Gini impurity
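A sketch using the Gini criterion and leaf size above; min_samples_split = 2 is taken from the summary table, and the data is a synthetic placeholder.

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
dt = DecisionTreeClassifier(criterion="gini", min_samples_split=2,
                            min_samples_leaf=1, random_state=0)
dt.fit(X, y)
print(dt.get_depth())   # depth of the fitted tree
```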
2.2.6. Random Forest
- min_samples_leaf = 1
- split criterion = Gini impurity
- base classifier = Decision Tree
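A sketch of the ensemble with the settings above plus n_estimators = 100 from the summary table; placeholder data as before.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
rf = RandomForestClassifier(n_estimators=100, criterion="gini",
                            min_samples_leaf=1, random_state=0)
rf.fit(X, y)
print(rf.score(X, y))
```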
2.2.7. Gradient Boosting
- h_m—weak learners
- M—number of weak learners
- M = 100
- loss function = ‘log-loss’
- max_depth = 3
- learning_rate = 0.1
- min_samples_leaf = 1
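A sketch with the settings above; loss="log_loss" is the scikit-learn (≥ 1.1) spelling of the log-loss named in the list, and the data is a synthetic placeholder.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
gb = GradientBoostingClassifier(loss="log_loss", n_estimators=100,
                                learning_rate=0.1, max_depth=3,
                                min_samples_leaf=1, random_state=0)
gb.fit(X, y)
print(gb.score(X, y))
```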
2.2.8. Extreme Gradient Boosting
- regularization rules,
- parallel processing,
- an in-built feature to handle missing values,
- built-in cross-validation technique,
- tree pruning feature.
- M = 100
- loss function = ‘log-loss’
- max_depth = 3
- learning_rate = 0.1
- min_samples_leaf = 1
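A sketch with the settings above. XGBoost has no min_samples_leaf parameter; its closest analogue, min_child_weight, is left at its default here, and the multiclass objective is inferred automatically from the labels.

```python
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
xgbc = XGBClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
xgbc.fit(X, y)            # objective "multi:softprob" is selected from y
print(xgbc.score(X, y))
```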
2.2.9. Light Gradient Boosting
- M = 100
- loss function = ‘log-loss’
- learning_rate = 0.1
- reg_alpha = 0
- reg_lambda = 0
- boosting_type = ‘gbdt’
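A sketch with the settings above; num_leaves = 31 is taken from the summary table, and the data is a synthetic placeholder.

```python
from sklearn.datasets import make_classification
from lightgbm import LGBMClassifier

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
lgbm = LGBMClassifier(boosting_type="gbdt", num_leaves=31, n_estimators=100,
                      learning_rate=0.1, reg_alpha=0.0, reg_lambda=0.0)
lgbm.fit(X, y)
print(lgbm.score(X, y))
```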
2.2.10. Support Vector Machine
- C—penalty term that controls the penalty strength
- ζ_i—distance of samples from their correct margin boundary
- α_i—dual coefficients
- e—the vector of all ones; Q—the positive semidefinite matrix with Q_ij = y_i y_j K(x_i, x_j), where K(x_i, x_j) is the kernel
- C = 30,000
- kernel = ‘RBF’
- gamma = 1/561
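A sketch with the hyperparameters above, passed through verbatim; whether 561 corresponds to the feature count is not stated here, and the data is a synthetic placeholder.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=100, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
svm = SVC(C=30000, kernel="rbf", gamma=1 / 561)   # values from the list above
svm.fit(X, y)
print(svm.score(X, y))
```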
2.2.11. General Implementation
- Processor: AMD RYZEN THREADRIPPER 2990WX (32C 64T) 4.3 GHz
- Motherboard: AsRock X399 TAICHI
- Memory: 8 × ADATA XPG SPECTRIX D41 DDR4 16 GB 3000 MHz (128 GB RAM)
- Graphics Card: 2 × Nvidia Titan RTX 24 GB GDDR6 (48 GB VRAM total)
- Drive SSD: 2 × WD Black 1 TB WDS100T3X0C (PCIe)
- Drive HDD: 1 × WD Red Pro 8 TB WD8003FFBX 3.5″ (SATA)
- Power Supply: BE QUIET! DARK POWER PRO 11 1000 W
- Cooling: BE QUIET! Silent Loop BW003 280 mm
- Network: 10GbE SFP+
3. Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Data Set | Variable | Length of 1 Trial (Samples) | Sampling Frequency (Hz) | Measurement Time (s) |
|---|---|---|---|---|
| DataHigh | Ac. Emission | 27,999,960 | 5,000,000 | 5.59 |
| DataLow | Force X | 700,000 | 200,000 | 3.50 |
| DataLow | Force Y | 700,000 | 200,000 | 3.50 |
| DataLow | Noise | 700,000 | 200,000 | 3.50 |
| DataLow | Vibration | 700,000 | 200,000 | 3.50 |
| DataCurrent | Dev. Current | 30,000 | 50,000 | 0.60 |
| DataCurrent | Dev. Voltage | 30,000 | 50,000 | 0.60 |
| DataCurrent | Head Current | 30,000 | 50,000 | 0.60 |
| DataCurrent | Head Voltage | 30,000 | 50,000 | 0.60 |
| DataCurrent | Servo Current | 30,000 | 50,000 | 0.60 |
| DataCurrent | Servo Voltage | 30,000 | 50,000 | 0.60 |
Classification results for GaussianNB:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 0.50 | 0.50 | 0.50 |
| Yellow | 0.83 | 0.71 | 0.77 |
| Red | 0.71 | 0.83 | 0.77 |
| Accuracy | | | 0.73 |
Classification results for MultinomialNB:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 1.00 | 1.00 | 1.00 |
| Yellow | 0.50 | 0.57 | 0.53 |
| Red | 0.40 | 0.33 | 0.36 |
| Accuracy | | | 0.53 |
Classification results for Gradient Boosting:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 1.00 | 1.00 | 1.00 |
| Yellow | 0.78 | 1.00 | 0.88 |
| Red | 1.00 | 0.67 | 0.80 |
| Accuracy | | | 0.87 |
Classification results for Extreme Gradient Boosting (XGBoost):

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 0.67 | 1.00 | 0.80 |
| Yellow | 1.00 | 0.86 | 0.92 |
| Red | 1.00 | 1.00 | 1.00 |
| Accuracy | | | 0.93 |
Classification results for the Decision Tree:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 1.00 | 1.00 | 1.00 |
| Yellow | 0.50 | 0.29 | 0.36 |
| Red | 0.44 | 0.67 | 0.53 |
| Accuracy | | | 0.53 |
Classification results for Light Gradient Boosting:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 1.00 | 1.00 | 1.00 |
| Yellow | 0.80 | 0.57 | 0.67 |
| Red | 0.62 | 0.83 | 0.71 |
| Accuracy | | | 0.73 |
Classification results for the Random Forest:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 1.00 | 1.00 | 1.00 |
| Yellow | 0.86 | 0.86 | 0.86 |
| Red | 0.83 | 0.83 | 0.83 |
| Accuracy | | | 0.87 |
Classification results for K-Nearest Neighbors:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 1.00 | 1.00 | 1.00 |
| Yellow | 0.83 | 0.71 | 0.77 |
| Red | 0.71 | 0.83 | 0.77 |
| Accuracy | | | 0.80 |
Classification results for Stochastic Gradient Descent:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 0.67 | 1.00 | 0.80 |
| Yellow | 0.50 | 0.86 | 0.63 |
| Red | 0.00 | 0.00 | 0.00 |
| Accuracy | | | 0.53 |
Classification results for the Support Vector Machine:

| Class | Precision | Recall | F1-Score |
|---|---|---|---|
| Green | 1.00 | 1.00 | 1.00 |
| Yellow | 0.83 | 0.71 | 0.77 |
| Red | 0.71 | 0.83 | 0.77 |
| Accuracy | | | 0.80 |
| Model | Parameters | Accuracy (%) |
|---|---|---|
| GaussianNB | var_smoothing = 1 | 73.33 |
| MultinomialNB | alpha = 1.0 | 53.33 |
| Gradient Boosting | learning_rate = 0.1, n_estimators = 100 | 86.66 |
| Extreme Gradient Boosting | learning_rate = 0.1, n_estimators = 100 | 93.33 |
| Decision Tree | min_samples_split = 2, min_samples_leaf = 1 | 53.33 |
| Light Gradient Boosting | num_leaves = 31, learning_rate = 0.1, n_estimators = 100 | 73.33 |
| Random Forest | n_estimators = 100, min_samples_split = 2, min_samples_leaf = 1 | 86.66 |
| K-Nearest Neighbors | n_neighbors = 5 | 80.00 |
| Stochastic Gradient Descent | alpha = 0.0001, epsilon = 0.1 | 53.33 |
| Support Vector Machine | C = 1.0, kernel = RBF, gamma = 0.1 | 80.00 |
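A comparison in the spirit of this table can be reproduced with a simple loop. The synthetic data, the train/test split, and the model subset (those needing no extra dependencies or rescaling) are assumptions of this sketch, so the printed accuracies will not match the table.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier
from sklearn.svm import SVC

# Placeholder data standing in for the registered sensor features.
X, y = make_classification(n_samples=300, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Hyperparameters mirror the summary table above.
models = {
    "GaussianNB": GaussianNB(var_smoothing=1.0),
    "Gradient Boosting": GradientBoostingClassifier(learning_rate=0.1, n_estimators=100),
    "Decision Tree": DecisionTreeClassifier(min_samples_split=2, min_samples_leaf=1),
    "Random Forest": RandomForestClassifier(n_estimators=100, min_samples_leaf=1),
    "K-Nearest Neighbors": KNeighborsClassifier(n_neighbors=5),
    "Stochastic Gradient Descent": SGDClassifier(alpha=0.0001),
    "Support Vector Machine": SVC(C=1.0, kernel="rbf", gamma=0.1),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: {model.score(X_te, y_te):.2%}")   # held-out accuracy
```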
| Name of Signal | Number of Signal Occurrences in XGBoost Features |
|---|---|
| ForceX | 10 |
| Noise | 9 |
| Ac.Emission | 8 |
| Vibration | 7 |
| ForceY | 3 |
| HeadVoltage | 3 |
| ServoCurrent | 3 |
| ServoVoltage | 3 |
| HeadCurrent | 2 |
| Dev.Current | 1 |
| Dev.Voltage | 1 |