Multialgorithm Fusion for Milling Tool Abrasion and Breakage Evaluation Based on Machine Vision
Abstract
1. Introduction
2. Image Acquisition System
2.1. Hardware Equipment
2.2. Tool Cutting Sequence
3. Visual Inspection of Tool Abrasion and Breakage
3.1. Extracting the Tool Abrasion
3.1.1. Image Grayscale Curve Analysis
3.1.2. Adaptive Region Localization of Abrasion
- The column-wise grayscale distribution of the original grayscale image is computed: the maximum grayscale value of each column is found, the column maxima are summed, and the sum is divided by the total number of columns to obtain the grayscale mean. This mean is taken as the screening threshold, which can be expressed as T = (1/N) Σ_{j=1}^{N} max_i I(i, j), where I(i, j) is the grayscale value at row i of column j and N is the number of columns. The screening threshold is then used as the binarization threshold, dividing the grayscale information of the image into two regions. Finally, a binarized image for preliminary region localization is obtained, as shown in Figure 6a. The blue line is the contour of the original tool.
- Since tool abrasion is mainly concentrated on both sides of the tool tip, only the areas at the left and right ends need further localization to filter out the useless area. Taking the left side as an example, the first nonzero pixel at the upper left is found by row-by-row pixel scanning and denoted Pixel(xl, yl). A circular area is then defined with center Pixel(xl, yl) and radius k, where k is a manually set width limit whose value can be chosen from the actual geometric extent of abrasion and breakage; in this paper, k is 360 pixels. The initial location of the abrasion must lie within this circular area. The right side is processed in the same way. The final localization result for the abrasion is shown in Figure 6b.
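The thresholding and localization steps above can be sketched as follows. This is a minimal NumPy sketch, not the paper's implementation: the function names are ours, and the upper-left nonzero pixel is found by relying on NumPy's row-major scan order.

```python
import numpy as np

def screening_threshold(gray):
    """Mean of the per-column grayscale maxima, used as the binarization threshold."""
    col_max = gray.max(axis=0)             # maximum grayscale value of each column
    return col_max.sum() / gray.shape[1]   # sum of column maxima / number of columns

def localize_abrasion(gray, k=360):
    """Binarize with the screening threshold, then keep only the circular region
    of radius k around the first nonzero pixel found by row scanning (left side)."""
    t = screening_threshold(gray)
    binary = (gray > t).astype(np.uint8)
    ys, xs = np.nonzero(binary)            # returned in row-major (scan) order
    if len(ys) == 0:
        return binary, None
    cy, cx = ys[0], xs[0]                  # first nonzero pixel at the upper left
    # mask out everything outside the circle of radius k centered on that pixel
    yy, xx = np.ogrid[:gray.shape[0], :gray.shape[1]]
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= k ** 2
    return binary * mask, (cx, cy)
```

The right end can be handled symmetrically by scanning for the first nonzero pixel from the right.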
3.1.3. Extraction of Abrasion Based on Region Growing Algorithm
- The centroid of each abrasion is determined, and the average grayscale value of its eight-neighborhood is taken as the average threshold, which can be written as avg = (1/8) Σ_{k=1}^{8} I(p_k), where p_1, …, p_8 are the eight neighboring pixels of the centroid and I(·) is the grayscale value.
- The maximum and minimum grayscale values of each abrasion are found, the difference of each from the average threshold is calculated, and the larger of the two differences is chosen as the final difference difvalue.
- The upper and lower bounds of the growing threshold are defined as the average threshold plus and minus the final difference difvalue, respectively.
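A minimal sketch of the region growing step under these thresholds. Two simplifying assumptions are made here that differ from the paper: difvalue is derived from the seed's eight-neighborhood alone (the paper computes it over the whole abrasion), and the grow interval is taken as the average threshold ± difvalue.

```python
import numpy as np
from collections import deque

def grow_region(gray, seed):
    """Region growing from the abrasion centroid `seed` = (row, col).
    avg: mean grayscale of the seed's 8-neighborhood (the average threshold).
    difvalue: larger deviation of the local max/min grayscale from avg
    (assumption: taken over the 8-neighborhood, not the whole abrasion).
    Pixels with grayscale in [avg - difvalue, avg + difvalue] are grown."""
    r, c = seed
    h, w = gray.shape
    neigh = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
    vals = [int(gray[r + dr, c + dc]) for dr, dc in neigh]   # 8-neighborhood values
    avg = sum(vals) / 8.0
    difvalue = max(max(vals) - avg, avg - min(vals))
    lo, hi = avg - difvalue, avg + difvalue
    grown = np.zeros_like(gray, dtype=bool)
    grown[r, c] = True
    q = deque([seed])
    while q:                                                 # breadth-first growth
        y, x = q.popleft()
        for dr, dc in neigh:
            ny, nx = y + dr, x + dc
            if 0 <= ny < h and 0 <= nx < w and not grown[ny, nx] and lo <= gray[ny, nx] <= hi:
                grown[ny, nx] = True
                q.append((ny, nx))
    return grown, (lo, hi)
```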
3.2. Extracting the Tool Breakage
3.2.1. Image Presegmentation of Breakage
3.2.2. Least-Squares Method
3.2.3. Technical Route of Breakage
- Pixel scanning is performed on the straight edge and the curved edge of the tool tip to obtain the edge pixels. The coordinates of each pixel point p(x_i, y_i) are recorded, where i = 1, 2, …, n.
- Curve fitting is performed on the extracted pixel points. From the tool shape analysis, the edge of the tool tip is an arc, so a higher-order polynomial should be preferred over a first-order fit. At the same time, Figure 10 shows that an inflection already appears when fitting a 3rd-order polynomial, so polynomials above 2nd order are unsuitable for the tool tip arc, while the 2nd-order fit closely matches the edge shape of the tool tip. A quadratic polynomial C(x) is therefore selected to fit the tool tip arc; however, with the raw edge points, the direction of the fitted curve is wrong. The straight edge of the tool tip is fitted with a first-order polynomial S(x). The fitting functions are defined as follows:
- Error points are eliminated based on a distance threshold. The fitted edge function can be obtained from the data points by least-squares regression. However, because of the limited accuracy of the image presegmentation, wrong edge points are inevitably extracted when scanning edge pixels, and with many obvious wrong points a best-fit curve does not necessarily exist. In this paper, a method for eliminating error edge points based on a distance threshold is proposed, with the wrong points included in the initial fit. The idea of the method is as follows:
Firstly, it must be ensured that more than half of the edge points are correct. With the error points included, the first curve fitting is performed as shown in Figure 11a; the direction of the curve is to the right.
Secondly, the closest distance from each edge pixel point to the fitted curve is calculated, which can be expressed as d_i = min_{q ∈ C} ‖p(x_i, y_i) − q‖, where C is the set of points on the fitted curve.
Finally, the closest distances are summed and averaged, and the average distance is used as the distance threshold, T_d = (1/n) Σ_{i=1}^{n} d_i.
The distance threshold reflects how far the edge pixel points deviate from the current fitting curve. The wrong edge pixel points are those that deviate severely from the ideal fitting curve, as shown in Figure 11. Points whose distance is greater than the threshold are eliminated.
- After the error points are eliminated, the straight edge of the tool is fitted with a straight line and the curved edge with a quadratic polynomial. The fitting effect and the breakage are shown in Figure 11b,c, which demonstrate that the fitted edge is oriented to the left, in line with the actual edge trend.
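The least-squares fitting and distance-threshold elimination steps above can be sketched as follows. This is a hypothetical helper, not the paper's code: the point-to-curve distance is approximated by densely sampling the fitted polynomial rather than solved analytically.

```python
import numpy as np

def eliminate_error_points(x, y, degree=2):
    """Fit edge pixels with a least-squares polynomial (degree=1 for the straight
    edge, degree=2 for the tool-tip arc), discard points whose closest distance
    to the fitted curve exceeds the mean closest distance, and refit.
    Distance to the curve is approximated via dense sampling (our assumption)."""
    coeffs = np.polyfit(x, y, degree)              # first fit, error points included
    xs = np.linspace(x.min(), x.max(), 1000)       # dense samples of the fitted curve
    ys = np.polyval(coeffs, xs)
    # closest distance from each edge pixel to the sampled curve points
    d = np.min(np.hypot(x[:, None] - xs[None, :], y[:, None] - ys[None, :]), axis=1)
    threshold = d.mean()                           # average closest distance
    keep = d <= threshold                          # eliminate points beyond it
    refit = np.polyfit(x[keep], y[keep], degree)   # refit without the error points
    return refit, keep
```

For example, fitting ten points on a straight edge with one gross outlier removes only the outlier and recovers the true line on the refit.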
3.3. Monitoring Results of Abrasion and Breakage
4. Experiment Analysis
4.1. Pixel Size Standard
4.2. Experimental Setup
4.3. Experimental Results and Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Equipment | Parameter |
---|---|
Industrial camera (MA-CA050-12GC) | Resolution: 2448 × 2048; Sensor size: 2/3″ |
Lens (DH110-4F28X) | Magnification: 4.0×; Depth of field: 0.3 mm |
LED light source (MV-R6660S-W) | Power: 6.7 W; Angle: 60°; Color temperature: 6000–10,000 K |
Experiment No. | Area (Our Method) (μm2) | Area (Measurement) (μm2) | Accuracy (%) |
---|---|---|---|
1(b) | 1356.25 | 1341.66 | 98.9 |
1(d) | 582.63 | 556.94 | 95.6 |
2(b) | 836.11 | 840.27 | 99.5 |
2(d) | 631.94 | 604.82 | 95.7 |
3(b) | 1465.27 | 1504.85 | 97.4 |
3(d) | 686.11 | 688.19 | 99.7 |
4(b) | 1758.22 | 1780.44 | 98.8 |
4(d) | 1586.70 | 1524.90 | 96.1 |
5(b) | 1109.72 | 1127.78 | 98.4 |
5(d) | 4086.78 | 4161.08 | 98.2 |
6(b) | 2227.07 | 2202.07 | 98.8 |
6(d) | 3472.22 | 3506.25 | 99.0 |
7(b) | 3813.17 | 3960.39 | 96.3 |
7(d) | 4668.72 | 4768.86 | 97.9 |
Experiment No. | Area (Our Method) (μm2) | Area (Measurement) (μm2) | Accuracy (%) |
---|---|---|---|
1(b) | 2143.61 | 2215.14 | 96.8 |
1(d) | 1596.43 | 1533.24 | 96.1 |
2(b) | 1861.69 | 1885.99 | 98.7 |
2(d) | 1874.88 | 1863.08 | 99.3 |
3(b) | 1923.49 | 2001.95 | 96.1 |
3(d) | 2249.86 | 2308.19 | 97.5 |
4(b) | 4029.60 | 3878.92 | 96.3 |
4(d) | 2733.16 | 2724.83 | 99.7 |
5(b) | 2270.68 | 2228.33 | 98.1 |
5(d) | 849.25 | 806.89 | 95.1 |
6(b) | 3214.38 | 3283.12 | 98.0 |
6(d) | 599.96 | 598.57 | 99.8 |
7(b) | 3264.37 | 3237.29 | 99.1 |
7(d) | 709.68 | 756.20 | 97.3 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wu, C.; Hu, Y.; Wang, T.; Peng, Y.; Qin, S.; Luo, X. Multialgorithm Fusion for Milling Tool Abrasion and Breakage Evaluation Based on Machine Vision. Metals 2022, 12, 1825. https://doi.org/10.3390/met12111825