Predicting the Textural Properties of Plant-Based Meat Analogs with Machine Learning
Abstract
1. Introduction
- To develop a novel framework that incorporates the constituents of meat analogs to predict the textural properties, Hardness and Chewiness, of the developed meat analogs from multiple parameters reported in high-moisture extrusion and mechanical elongation processing studies (a minimal workflow sketch follows this list).
- To provide a comprehensive experimental discussion that guides researchers in the field on the key computational points to consider when validating their experimental designs, such as feature importance, multicollinearity among the features (i.e., the constituents), linearity of the designed model, and inconsistent food compositions.
- To lay the groundwork for similar studies in the future.
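As a rough illustration of the first objective, the sketch below predicts Hardness from raw-material composition features with one of the models compared later (Ridge). The column names, the CSV file name, and the use of leave-one-out cross-validation are assumptions for illustration only, not the paper's exact setup; the same workflow would be repeated for Chewiness.

```python
# Minimal sketch of the prediction framework, under assumed data layout.
import pandas as pd
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error

# Assumed column names for the composition features and the target.
FEATURES = ["protein", "carbs", "fat", "fibre", "ash", "moisture", "target moisture"]
TARGET = "hardness"  # repeat the same workflow with "chewiness"

df = pd.read_csv("meat_analog_texture.csv")  # hypothetical file name
X, y = df[FEATURES], df[TARGET]

# Standardize features, then fit a linear (Ridge) model.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))

# Leave-one-out cross-validation is a reasonable choice for small extrusion datasets.
y_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())

rmse = mean_squared_error(y, y_pred) ** 0.5
mape = mean_absolute_percentage_error(y, y_pred) * 100
print(f"RMSE: {rmse:.3f}  MAPE: {mape:.1f}%")
```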
2. Materials and Methods
2.1. Experimental System
2.2. Data Pre-Processing
2.2.1. Raw Material Composition
2.2.2. Measurement of Textural Properties
2.3. Machine Learning Models
2.4. Feature Selection
2.5. Evaluation Metrics
3. Results and Discussion
3.1. Experimental Setup
3.2. Results for Hardness and Chewiness Prediction
3.3. Feature Importance and the Effect of Feature Selection in Prediction Performance
3.4. Linearity Assumption
3.5. Case Study: Removal of Faba Bean Concentrate Commercial
3.6. Further Discussions on ML Models: Multicollinearity
4. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| ML Model | Parameter | Values |
|---|---|---|
| Ridge | alpha | {0, 1, 2, 3, 4, 5} |
| Random Forest | max_depth | {4, 5, 6, 7, None} |
| | min_samples_leaf | {1, 2, 3, 4, 5} |
| | n_estimators | {100, 200, 300} |
| XGBoost | eta | {0.2, 0.3, 0.4, 0.5} |
| | max_depth | {4, 5, 6, 7} |
| | n_estimators | {100, 200, 300} |
| | reg_lambda | {1.0, 2.0, 3.0} |
| KNN | n_neighbors | {3, 5, 7, 9} |
| MLP | activation | {"relu", "identity", "logistic", "tanh"} |
| | alpha | {0.0001, 0.001, 0.01, 0.1, 0, 1, 10} |
| | hidden_layer_size | {(1,), (2,), (3,), (4,), (5,)} |
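The parameter names above map directly onto scikit-learn and XGBoost keyword arguments (eta is XGBoost's alias for learning_rate, and the MLP layer-size parameter is spelled hidden_layer_sizes in scikit-learn). A hedged sketch of the tuning step follows; 5-fold cross-validation and RMSE as the selection criterion are assumptions, as the paper's exact search procedure is described in Section 2.3.

```python
# Sketch: tune each model over the grids in the table above with GridSearchCV.
from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import Ridge
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from xgboost import XGBRegressor

search_spaces = {
    "Ridge": (Ridge(), {"alpha": [0, 1, 2, 3, 4, 5]}),
    "Random Forest": (RandomForestRegressor(random_state=0),
                      {"max_depth": [4, 5, 6, 7, None],
                       "min_samples_leaf": [1, 2, 3, 4, 5],
                       "n_estimators": [100, 200, 300]}),
    "XGBoost": (XGBRegressor(random_state=0),
                {"eta": [0.2, 0.3, 0.4, 0.5],          # alias for learning_rate
                 "max_depth": [4, 5, 6, 7],
                 "n_estimators": [100, 200, 300],
                 "reg_lambda": [1.0, 2.0, 3.0]}),
    "KNN": (KNeighborsRegressor(), {"n_neighbors": [3, 5, 7, 9]}),
    "MLP": (MLPRegressor(max_iter=5000, random_state=0),
            {"activation": ["relu", "identity", "logistic", "tanh"],
             "alpha": [0.0001, 0.001, 0.01, 0.1, 0, 1, 10],
             "hidden_layer_sizes": [(1,), (2,), (3,), (4,), (5,)]}),
}

def tune(X, y, cv=5):
    """Return the best estimator per model, selected by cross-validated RMSE."""
    best = {}
    for name, (estimator, grid) in search_spaces.items():
        search = GridSearchCV(estimator, grid, cv=cv,
                              scoring="neg_root_mean_squared_error")
        search.fit(X, y)
        best[name] = search.best_estimator_
    return best
```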
| ML Model | Hardness: Best Subset | Hardness: RMSE | Hardness: MAPE (%) | Chewiness: Best Subset | Chewiness: RMSE | Chewiness: MAPE (%) |
|---|---|---|---|---|---|---|
| Ridge | {target moisture, moisture, carbs, and fat} | 10.101 | 22.9 | {target moisture, moisture, carbs, and fat} | 6.035 | 14.5 |
| Random Forest | {protein, target moisture, and carbs} | 13.797 | 24.9 | {protein, target moisture, carbs, and fat} | 10.150 | 22.4 |
| XGBoost | {protein, carbs, fat, and fibre} | 12.310 | 21.2 | {protein, carbs, and fat} | 7.815 | 17.5 |
| KNN | {target moisture, moisture, ash, carbs, and fat} | 10.389 | 19.9 | {target moisture, carbs, and fat} | 7.902 | 16.1 |
| MLP | {target moisture, moisture, carbs, and fat} | 14.695 | 27.5 | {target moisture, moisture, carbs, and fat} | 8.018 | 16.3 |
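The subsets above are the best-performing feature combinations per model; the paper's actual selection procedure is described in Section 2.4. Purely as an illustration, one way to arrive at such subsets is an exhaustive search over feature combinations scored with the same RMSE and MAPE metrics, sketched below (leave-one-out cross-validation and the minimum subset size are assumptions).

```python
# Illustrative exhaustive feature-subset search; not the paper's exact procedure.
from itertools import combinations
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error

def best_subset(model, X, y, min_size=3):
    """Return (subset, rmse, mape) minimising cross-validated RMSE.

    X is a pandas DataFrame of composition features, y the texture target.
    """
    best = None
    for k in range(min_size, X.shape[1] + 1):
        for subset in combinations(X.columns, k):
            pred = cross_val_predict(model, X[list(subset)], y, cv=LeaveOneOut())
            rmse = mean_squared_error(y, pred) ** 0.5
            mape = mean_absolute_percentage_error(y, pred) * 100
            if best is None or rmse < best[1]:
                best = (subset, rmse, mape)
    return best
```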
| ML Models | Hardness (Before FS) | Hardness (After FS) | Chewiness (Before FS) | Chewiness (After FS) |
|---|---|---|---|---|
| Ridge | 10.021 | 1.356 | 13.636 | 1.212 |
| Random Forest | 3.583 | 3.337 | 3.324 | 3.028 |
| XGBoost | 5.133 | 2.576 | 3.738 | 1.779 |
| KNN | 3.240 | 2.611 | 3.023 | 1.721 |
| MLP | 6.202 | 2.255 | 4.398 | 1.638 |
| Average | 5.636 | 2.427 | 5.624 | 1.876 |
| Feature | VIF | Tolerance |
|---|---|---|
| carbs | 3.984 | 0.251 |
| fat | 7.026 | 0.142 |
| moisture | 1.593 | 0.628 |
| target moisture | 5.598 | 0.179 |
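Tolerance is the reciprocal of VIF (Tolerance = 1/VIF), which the values in the table satisfy. A short sketch of how such a table can be computed with statsmodels follows; the DataFrame layout and the helper name vif_table are illustrative assumptions.

```python
# Sketch: compute VIF and tolerance for each feature with statsmodels.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_table(X: pd.DataFrame) -> pd.DataFrame:
    """VIF and tolerance (1 / VIF) for each feature, with an intercept added."""
    Xc = sm.add_constant(X)  # include a constant so feature VIFs are not distorted
    rows = []
    for i, col in enumerate(Xc.columns):
        if col == "const":
            continue
        vif = variance_inflation_factor(Xc.values, i)
        rows.append({"feature": col, "VIF": round(vif, 3), "tolerance": round(1 / vif, 3)})
    return pd.DataFrame(rows)

# Example (assumed column names): vif_table(df[["carbs", "fat", "moisture", "target moisture"]])
```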