Proceeding Paper

Filipino Meal Recognition Scale with Food Nutrition Calculation and Smart Application †

by Andrew D. R. Demition *, Zephanie Ann L. Narciso and Charmaine C. Paglinawan
School of Electrical, Electronics and Computer Engineering, Mapúa University, Muralla St., Intramuros, Manila 1002, Philippines
* Author to whom correspondence should be addressed.
Presented at the 2024 IEEE 4th International Conference on Electronic Communications, Internet of Things and Big Data, Taipei, Taiwan, 19–21 April 2024.
Eng. Proc. 2024, 74(1), 54; https://doi.org/10.3390/engproc2024074054
Published: 3 September 2024

Abstract: Nutritional awareness is increasingly prevalent in today's society, and more people calculate the nutritional content of the food they eat to improve their physical fitness and balance their meals. In this study, the Internet of Things was used through a mobile application with Filipino meal recognition integrated into a weighing scale, so that meals can be recognized and their nutrition calculated without weighing and measuring the macronutrients of each food individually. An ESP32 was programmed to determine the weight of the food sample. A TensorFlow Lite model was created using Teachable Machine, with a dataset comprising three Filipino meal combinations: rice, pork adobo, and pork giniling; rice, ginataang kalabasa, and pork giniling; and rice, ginataang kalabasa, and pork adobo. The model was evaluated on 15 samples of Filipino meals per combination. The average precision was 91.27% for the first meal combination, 82.73% for the second, and 85.47% for the third. One-factor ANOVA was conducted to determine the similarity of the actual and predicted macronutrient contents of the food samples, using 10 weight values of successfully recognized meals for each combination. The model recognized each Filipino food combination with an overall accuracy of 93.33%. Based on the statistical analysis, the predicted macronutrient contents were similar to the actual macronutrient contents of the meals.

1. Introduction

The Internet of Things (IoT) is a transformative technology whose impact extends well beyond simple functionality into people's daily lives. Nutrition apps require databases of nutritional data and algorithms to provide users with customized advice on better eating practices, and they have become useful tools for people who want to optimize their nutrition and live more healthily [1]. System security is essential because the IoT is increasingly widespread and contributes significantly to national infrastructure.
Security research on cyber-physical and IoT systems is guided by different principles; an Internet Protocol stack must be implemented even on the smallest devices, whose ability to process data quickly and efficiently is constrained by their limited central processing unit (CPU), memory, and power [2]. Image processing can solve real-world problems [3] in applications involving medicine [4], plants [5], and animals [6]. The deep learning framework TensorFlow can be used to simulate, train, and classify data with an accuracy of up to 90%, yielding promising results [7].
Accurate methods for measuring energy and food intake are crucial in treating obesity [8]. Providing users and patients with considerate, useful tools that help them track their food intake and compile nutritional data supports long-term obesity prevention and successful treatment programs. The ability to recognize food is crucial for food choices and consumption, both of which are vital for human health and well-being. It therefore underpins various food-related vision and multimodal tasks such as cross-modal recipe generation and retrieval, food detection, and segmentation [9]. One recommended approach is available on mobile phones, which allow the user to snap a photo of the food and determine its calorie content instantaneously. Deep convolutional neural networks (DCNNs) are used to classify food pictures and accurately recognize the meal, and food portions are recognized with high accuracy. The user places the meal within an enclosure and takes a single photo for food detection [10]. The system then calculates the food item's calorie content: after the meal photograph is analyzed and the caloric intake is computed, the name of the food item and its calorie count are displayed [11].
A well-balanced diet is essential for optimal tissue function, illness prevention, and improved health. To provide individualized advice on healthy eating practices, databases of nutritional data and algorithms are integrated into apps, which serve as practical tools for healthier living and for optimizing one's nutrition [12]. The connection between diet, nutrition, and health is becoming increasingly important to people [13]. To determine how such technology and apps are applied, we analyzed the efficacy of various detection techniques and instruments and compared their advantages and disadvantages. Food identification systems were also explored with respect to food safety and dietary recommendations for different groups of people. We also reviewed the latest developments in smartphone use in the food business to provide a foundation for the expansion of food-related applications, including fast food identification, food source tracking, and customized diets [14].
Based on previous studies, we identified a key research gap spanning food recognition, traceability, and dietary customization. While prior research explored the effectiveness of detection techniques and the feasibility of applications, these elements need to be studied in a single framework that covers the technical aspects of food recognition as well as the practicality of implementation, user-friendly design, and the provision of dietary advice. Convolutional neural networks (CNNs) have been used to deliver such information successfully; however, for efficient training, CNNs require large amounts of labeled data, time, and effort to be applied to different food products for food recognition.
Thus, it is necessary to analyze how emerging technologies benefit food-related industries and improve individuals' dietary choices and overall health. Therefore, we integrated AI-based food recognition into a weighing scale with a mobile application that shows the macronutrient content of the food. Using Teachable Machine and TensorFlow, a user-friendly food recognition scale was developed for accurate food recognition with food nutrition calculation. We created the system to measure the weight of the food combinations, recognize them through image recognition, and present the results through an intuitive user interface with IoT connectivity, and we evaluated the system by comparing the predicted and actual macronutrient content of each food combination. The system represents a significant advancement in nutritional tracking technology by combining precise weight measurement using an HX711 load-cell amplifier with AI. It helps people adopt healthier eating habits by offering precise, up-to-date information about the nutritional value of the food they eat. By fusing IoT connectivity with user-centric design, the system makes nutrition tracking more convenient for users. The system can be applied to various industries: it enables enhanced quality control and traceability to ensure compliance with safety and nutritional standards for processed foods and agricultural products, and consumers can access real-time nutritional data to make educated dietary choices.

2. Materials and Methods

2.1. Conceptual Framework

In the system, images of food samples are captured to form the dataset used to estimate macronutrient content and provide calorie intake to the user. Foods and their combinations are classified using the captured images in the dataset, and the nutrition is calculated to predict the macronutrient content and calories. On the device and on the mobile application, the weight, total calories, macronutrient content, total calorie intake, and calories still needed are displayed (Figure 1).

2.2. Process Flowchart

In the system, the first step is logging in, followed by the main menu, where the user inputs their profile, including height, weight, age, gender, activity level, and target weight. Calculation is conducted through the food calculator menu based on the captured foods. The macronutrient content is calculated from the database of food samples and their weights, and the calories of the recognized foods are deducted from the total calories needed for the day (Figure 2), as sketched below.
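The paper does not state which formula the App uses to convert the user profile into a daily calorie target. The following Python sketch is therefore only illustrative: it assumes the Mifflin-St Jeor equation and standard activity multipliers (both assumptions on our part) and then performs the running deduction of logged meals described above.

```python
# Hedged sketch: the energy-requirement formula is not specified in the paper, so the
# Mifflin-St Jeor equation and the activity multipliers below are illustrative assumptions.
ACTIVITY_FACTORS = {"sedentary": 1.2, "light": 1.375, "moderate": 1.55, "active": 1.725}

def daily_calorie_target(weight_kg, height_cm, age, gender, activity_level):
    """Estimate total daily calories from the user-profile fields listed above."""
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if gender == "male" else -161)
    return bmr * ACTIVITY_FACTORS[activity_level]

def remaining_calories(target, meals_kcal):
    """Deduct the calories of each logged meal from the daily target."""
    return target - sum(meals_kcal)

# Example: a 25-year-old, 70 kg, 170 cm male with light activity who has logged one 270.40 kcal meal
target = daily_calorie_target(70, 170, 25, "male", "light")
print(round(target), round(remaining_calories(target, [270.40])))
```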

2.3. Hardware Development

In the system, a 12 V, 5 A AC/DC power supply adapter was connected to an HW688 module to convert 12 V to 5 V. Since the ring light and the ESP32 need 5 V, the HW688 supplies the required voltage. The components were connected to the ESP32 and to a 16 × 2 I2C LCD, which shows the weight obtained through the HX711 load-cell amplifier (Figure 3, Figure 4 and Figure 5). A minimal sketch of reading the HX711 from the ESP32 is shown below.
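For illustration only, the MicroPython sketch below bit-bangs the HX711's two-wire interface from the ESP32, following the sequence described in the HX711 datasheet. The pin numbers, tare offset, and calibration factor are placeholders; this is not the authors' firmware.

```python
# Illustrative MicroPython sketch for reading an HX711 load-cell amplifier on an ESP32.
from machine import Pin
import time

DOUT = Pin(19, Pin.IN)             # hypothetical data pin
SCK = Pin(18, Pin.OUT, value=0)    # hypothetical clock pin

def read_raw():
    """Read one 24-bit two's-complement sample from the HX711."""
    while DOUT.value():            # DOUT goes low when a conversion is ready
        time.sleep_ms(1)
    value = 0
    for _ in range(24):            # shift out 24 bits, MSB first
        SCK.value(1)
        value = (value << 1) | DOUT.value()
        SCK.value(0)
    SCK.value(1)                   # 25th pulse selects channel A, gain 128
    SCK.value(0)
    if value & 0x800000:           # sign-extend negative readings
        value -= 0x1000000
    return value

def grams(raw, offset, scale):
    """Convert a raw reading to grams using a tare offset and a calibration factor."""
    return (raw - offset) / scale
```

In practice the offset is obtained by taring the empty scale and the scale factor by placing a known reference weight on the load cell.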

2.4. Software Development

The App was created using MIT App Inventor. It includes a login page for creating login credentials, a main menu for food calculation, and a user profile where the user inputs their data for the computation of the total calories needed per day. It also includes an interface for the weight output of the classified food combination (Figure 6).
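The App itself is assembled from MIT App Inventor blocks rather than hand-written code. Purely for illustration, the Python sketch below shows how the Teachable Machine TensorFlow Lite export could be exercised outside the App; the model file name, the 224 × 224 input size, the input scaling, and the label order are assumptions.

```python
# Illustrative only: running a Teachable Machine TensorFlow Lite export in Python.
import numpy as np
import tensorflow as tf
from PIL import Image

LABELS = ["FC1", "FC2", "FC3"]  # assumed order of the three food combinations

interpreter = tf.lite.Interpreter(model_path="model_unquant.tflite")  # assumed file name
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def classify(image_path):
    """Return the predicted food combination and its confidence score."""
    img = Image.open(image_path).convert("RGB").resize((224, 224))
    x = (np.asarray(img, dtype=np.float32) / 127.5 - 1.0)[np.newaxis, ...]  # assumed [-1, 1] scaling
    interpreter.set_tensor(inp["index"], x)
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return LABELS[int(np.argmax(scores))], float(np.max(scores))

print(classify("sample_meal.jpg"))
```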

2.5. Training and Food Nutrition Calculation

To build the recognition model, a dataset of images of the samples was gathered. Figure 7 shows the different samples of food combinations used to create the food classification models.
Figure 8 shows the interface of Teachable Machine with the different food combinations. Teachable Machine was used to train a TensorFlow model on the dataset and identify food combinations so that the macronutrient content of each combination could be determined. To calculate the actual calorie content, the total calories of each food sample are multiplied by its conversion factor (the measured weight divided by the serving size), as follows.
$$\mathrm{TCA} = FV_1\left(\frac{TC_1}{S_1}\right) + FV_2\left(\frac{TC_2}{S_2}\right) + R\left(\frac{TC_3}{S_3}\right)$$
where TCA is the total actual calories of the food combination, FV1 and FV2 are the weights of the two Filipino viands on the plate, R is the weight of the rice, TC1 to TC3 are the total calories of the corresponding food samples, and S1 to S3 are their serving sizes.
To predict the total caloric and macronutrient content of the food combinations, the sample mean of each macronutrient was approximated. The total calories and macronutrient content for each parameter (AMT) per serving size are equal to the mean of those of the listed food samples in the food combination. The formula for the predicted caloric and macronutrient content per serving size is as follows.
$$AM_T = \frac{FV_1 + FV_2 + R}{3}$$
After obtaining the total caloric content of the food combination per serving size (AMT), it is divided by the serving size (S) and multiplied by the sum of the macronutrient content in grams of each retrieved food sample (RFV1, RFV2, and RR) in the food combination.
$$\mathrm{TCP} = \frac{AM_T}{S}\left(RFV_1 + RFV_2 + R_R\right)$$
The macronutrient counts are presented in Table 1. The information on the macronutrients is based on the values provided by the DOST-FNRI Food Exchange List for Meal Planning.
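As a worked example of the actual-calorie calculation above, the sketch below applies the per-serving totals of Table 1 to the first FC1 trial in Table 4. The serving sizes are obtained here by summing the component quantities in Table 1 (our interpretation, since the serving sizes S are not listed explicitly); with these values the sketch reproduces the 254.29 kcal reported in Table 4.

```python
# Actual-calorie calculation (TCA) applied to Table 1 totals and the first FC1 trial in Table 4.
# Serving sizes are derived here by summing the component quantities in Table 1, which is an
# interpretation on our part rather than a value stated in the paper.
FOODS = {                      # total kcal per serving, serving size in grams
    "rice":     {"kcal": 200, "serving_g": 160},
    "adobo":    {"kcal": 222, "serving_g": 165},
    "giniling": {"kcal": 254, "serving_g": 270},
}

def total_actual_calories(weights_g):
    """TCA = sum over foods of measured weight * (total calories / serving size)."""
    return sum(w * FOODS[name]["kcal"] / FOODS[name]["serving_g"]
               for name, w in weights_g.items())

# Trial 1 of FC1 in Table 4: 37 g giniling, 47 g adobo, 125 g rice -> 254.29 kcal
print(round(total_actual_calories({"giniling": 37, "adobo": 47, "rice": 125}), 2))
```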

3. Results and Discussion

The foods were classified, and the actual and predicted calories of the food combinations were compared. FC1, FC2, and FC3 denote the food combinations of rice, pork adobo, and pork giniling; rice, ginataang kalabasa, and pork giniling; and rice, ginataang kalabasa, and pork adobo, respectively. The average precision score for the food classification was calculated in the mobile application using the TensorFlow Lite model: 91.27% for FC1, 82.73% for FC2, and 85.47% for FC3 (Table 2). The confusion matrix was constructed from 15 trials per combination. All FC1 samples were identified correctly, while one and two samples were misidentified for FC2 and FC3, respectively. The computed overall accuracy was 93.33% (Table 3).
$$\mathrm{Precision} = \frac{TP}{TP + FP}$$
$$\mathrm{Recall} = \frac{TP}{TP + FN}$$
$$\mathrm{Overall\ Accuracy} = \frac{TP + TN}{TP + FP + TN + FN} \times 100\% = 93.33\%$$
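For reference, the per-class precision and recall and the 93.33% overall accuracy follow directly from the confusion matrix in Table 3, as the short recomputation below illustrates (rows are taken as the classified labels and columns as the ground truth, following the table's layout).

```python
import numpy as np

# Confusion matrix from Table 3: rows = classified label, columns = true label.
cm = np.array([[15, 0, 0],
               [0, 13, 2],
               [0, 1, 14]])

for i, name in enumerate(["FC1", "FC2", "FC3"]):
    tp = cm[i, i]
    precision = tp / cm[i, :].sum()   # TP / (TP + FP), per classified row
    recall = tp / cm[:, i].sum()      # TP / (TP + FN), per truth column
    print(f"{name}: precision = {precision:.2%}, recall = {recall:.2%}")

overall_accuracy = np.trace(cm) / cm.sum()   # (15 + 13 + 14) / 45
print(f"overall accuracy = {overall_accuracy:.2%}")   # 93.33%
```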
Table 4 shows the different weights measured for FC1 together with the total calories of each sample. The predicted total calories were computed from the sum of the weights of the food samples and the computed macronutrient content per gram. The average actual total calories of FC1 were 269.75 kcal, and the average predicted total calories were 270.40 kcal. The same process was conducted for FC2 with different values because each food sample has a different macronutrient content; the average actual total calories were 319.95 kcal and the average predicted total calories were 323.70 kcal (Table 5). The average actual total calories of FC3 were 284.79 kcal, and the average predicted total calories were 280.98 kcal (Table 6).
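The one-factor ANOVA mentioned in the abstract can be reproduced on the FC1 data of Table 4 with SciPy, as sketched below; the paper does not report its test statistics, so the printed F and p values are only a recomputation for illustration.

```python
from scipy.stats import f_oneway

# Actual and predicted total calories of FC1 from Table 4 (10 trials each).
actual = [254.29, 236.15, 302.74, 261.29, 295.87, 239.97, 268.29, 279.82, 277.79, 281.34]
predicted = [246.35, 233.39, 305.29, 265.21, 301.76, 239.28, 264.04, 284.07, 279.36, 285.25]

# One-factor ANOVA: a high p-value indicates no significant difference between the two groups,
# i.e., the predicted calories are statistically similar to the actual ones.
f_stat, p_value = f_oneway(actual, predicted)
print(f"F = {f_stat:.4f}, p = {p_value:.4f}")
```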

4. Conclusions

We developed a system for food classification using image classification and machine learning. A working prototype was fabricated, and a mobile application was developed to provide nutritional information on foods for a healthier lifestyle. The potential of image processing for determining the macronutrient content of selected foods and meals was confirmed. Nutrition and health monitoring using mobile applications can be developed further as technology advances.

Author Contributions

Conceptualization, A.D.R.D. and Z.A.L.N.; methodology, A.D.R.D. and Z.A.L.N.; software, A.D.R.D.; validation, A.D.R.D., Z.A.L.N. and C.C.P.; formal analysis, A.D.R.D., Z.A.L.N., and C.C.P.; investigation, Z.A.L.N.; resources, Z.A.L.N.; data curation, A.D.R.D. and Z.A.L.N.; writing—original draft preparation, A.D.R.D. and Z.A.L.N.; writing—review and editing, C.C.P.; visualization, A.D.R.D.; supervision, C.C.P.; funding acquisition, A.D.R.D. and Z.A.L.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data files generated and presented in this study are available upon request from the corresponding author.

Acknowledgments

The authors would like to thank Darwin Dormis for providing the computation of the nutritional content of the chosen Filipino viands based on the DOST-FNRI Food Exchange List for Meal Planning.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Abdelghani, W.; Zayani, C.A.; Amous, I.; Sèdes, F. User-Centric IoT: Challenges and Perspectives. 2019. Available online: https://personales.upv.es/thinkmind/UBICOMM/UBICOMM_2018/ubicomm_2018_2_20_10049.html (accessed on 2 May 2024).
2. Mendoza, I.C.P.; Timbol, S.M.; Samonte, M.J.C.; Blancaflor, E.B. ImHome: An IoT for Smart Home Appliances. In Proceedings of the 2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA), Bangkok, Thailand, 16–21 April 2020.
3. Paglinawan, C.C.; Hannah, M.; Frias, J.B. Medicine Classification Using YOLOv4 and Tesseract OCR. In Proceedings of the 2023 15th International Conference on Computer and Automation Engineering (ICCAE), Sydney, Australia, 3–5 March 2023.
4. Ckyle, A.; Guillermo, D.M.; Paglinawan, C.C. Cassava Disease Detection Using MobileNetV3 Algorithm through Augmented Stem and Leaf Images. In Proceedings of the 2023 17th International Conference on Ubiquitous Information Management and Communication (IMCOM), Seoul, Republic of Korea, 3–5 January 2023.
5. Galeno, P.C.; Sale, D.P.; Martin, M.B. SMS-Based Dog Detection in Residential Area Using YOLOv5 and ML.net. In Proceedings of the 2023 IEEE 11th Conference on Systems, Process & Control (ICSPC), Malacca, Malaysia, 16 December 2023.
6. Perez, J.V.; Manangan, G.; Deveza, D.K.; Mendero, C.; Villaruel, H.V. Implementation of Convolutional Neural Network of Non-Biodegradable Garbage Classifier and Segregator Based on VGG16 Architecture. In Proceedings of the TENCON 2023 IEEE Region 10 Conference (TENCON), Chiang Mai, Thailand, 31 October–3 November 2023.
7. Abu, M.; Hazirah Indra, N.; Halim, A.; Rahman, A.; Sapiee, A.; Ahmad, I. A Study on Image Classification Based on Deep Learning and TensorFlow. Int. J. Eng. Res. Technol. 2019, 12, 563–569.
8. Kumar, R.D.; Julie, E.G.; Robinson, Y.H.; Vimal, S.; Seo, S. Recognition of Food Type and Calorie Estimation Using Neural Network. J. Supercomput. 2021, 77, 8172–8193.
9. Min, W.; Wang, Z.; Liu, Y.; Luo, M.; Kang, L.-P.; Wei, X.; Wei, X.; Jiang, S. Large Scale Visual Food Recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 45, 9932–9949.
10. Balbin, J.R.; Valiente, L.D.; Martin, K.; Olorvida, E.D.; Glenn, G.; Soto, M.L. Determination of Calorie Content in Different Type of Foods Using Image Processing. In Proceedings of the 2019 IEEE 11th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Laoag, Philippines, 29 November–1 December 2019.
11. Jijesh, J.J.; Jinesh, J.J.; Bolla, D.R.; Sruthi, P.V.; Dileep, M.R.; Keshavamurthy, D. Development of Food Tracking System Using Machine Learning. In Proceedings of the 2021 5th International Conference on Electrical, Electronics, Communication, Computer Technologies and Optimization Techniques (ICEECCOT), Mysuru, India, 10–11 December 2021.
12. Gehlot, G. Dietexpert—Android Application for Personal Diet Consultant. Int. J. Eng. Appl. Sci. Technol. 2021, 5, 202–205.
13. Mohanty, S.P.; Singhal, G.; Scuccimarra, E.A.; Kebaili, D.; Héritier, H.; Boulanger, V.; Salathé, M. The Food Recognition Benchmark: Using Deep Learning to Recognize Food in Images. Front. Nutr. 2022, 9, 875143.
14. Ma, T.; Wang, H.; Wei, M.; Lan, T.; Wang, J.; Bao, S.; Ge, Q.; Fang, Y.; Sun, X. Application of Smart-Phone Use in Rapid Food Detection, Food Traceability Systems, and Personalized Diet Guidance, Making Our Diet More Health. Food Res. Int. 2022, 152, 110918.
Figure 1. Conceptual framework.
Figure 2. Process flowchart.
Figure 3. System hardware.
Figure 4. Experimental setup and prototype design.
Figure 5. Prototype of developed system.
Figure 6. User interface of system.
Figure 7. Samples of food combinations.
Figure 8. Dataset of food combinations in Teachable Machine.
Table 1. Nutritional content of chosen Filipino foods.

Sample Menu | Quantity | Food Group | Carbohydrates (g) | Protein (g) | Fat (g) | Energy (kcal)
Pork Adobo
Pork tenderloin | 2 slices (70 g) | Low-Fat Meat | - | 16 | 2 | 82
Potato | ½ pc (85 g) | Rice B (Medium Protein) | 11.5 | 1 | - | 50
Cooking oil | 2 tsp (10 g) | Fat | - | - | 10 | 90
TOTAL | | | 11.5 g | 17 g | 12 g | 222 kcal
Pork Giniling
Lean ground pork | 1 slice (35 g) | Low-Fat Meat | - | 8 | 1 | 41
Chicken egg | 1 pc medium (55 g) | Medium-Fat Meat | - | 8 | 6 | 86
Potato | ½ pc (85 g) | Rice B (Medium Protein) | 11.5 | 1 | - | 50
Carrots, green peas | 1 cup (90 g) | Vegetable | 6 | 2 | - | 32
Cooking oil | 1 tsp (5 g) | Fat | - | - | 5 | 45
TOTAL | | | 17.5 g | 19 g | 12 g | 254 kcal
Ginataang Kalabasa
Pork tenderloin | 1 slice (35 g) | Low-Fat Meat | - | 8 | 1 | 41
Squash, string beans | 1 cup (90 g) | Vegetable | 6 | 2 | - | 32
Coconut cream | 3 Tbsp (45 g) | Fat | - | - | 15 | 135
Cooking oil | 1 tsp (5 g) | Fat | - | - | 5 | 45
TOTAL | | | 6 g | 10 g | 21 g | 253 kcal
Rice
White rice | 1 cup (160 g) | Rice B (Medium Protein) | 46 | 4 | - | 200
TOTAL | | | 46 g | 4 g | 0 g | 200 kcal
(Computed based on the values provided by the DOST-FNRI Food Exchange Lists for Meal Planning.)
Table 2. Average precision score of each food combination.

Number of Trials | Food Sample | Average Precision Score (%)
15 | FC1 | 91.26666667
15 | FC2 | 82.73333333
15 | FC3 | 85.46666667
Table 3. Confusion matrix.

 | FC1 | FC2 | FC3 | Classification Overall
FC1 | 15 | 0 | 0 | 15
FC2 | 0 | 13 | 2 | 15
FC3 | 0 | 1 | 14 | 15
Truth Overall | 15 | 14 | 16 | 45
Table 4. Macronutrient content of FC1.

Trial | Giniling (g) | Adobo (g) | Rice (g) | Sum (g) | Actual Total Calories | Predicted Total Calories
1 | 37 | 47 | 125 | 209 | 254.29 | 246.35
2 | 54 | 56 | 88 | 198 | 236.15 | 233.39
3 | 84 | 52 | 123 | 259 | 302.74 | 305.29
4 | 84 | 63 | 78 | 225 | 261.29 | 265.21
5 | 95 | 55 | 106 | 256 | 295.87 | 301.76
6 | 60 | 50 | 93 | 203 | 239.97 | 239.28
7 | 57 | 62 | 105 | 224 | 268.29 | 264.04
8 | 94 | 80 | 67 | 241 | 279.82 | 284.07
9 | 81 | 69 | 87 | 237 | 277.79 | 279.36
10 | 86 | 57 | 99 | 242 | 281.34 | 285.25
Average | | | | | 269.75 | 270.40
Table 5. Macronutrient content of FC2.

Trial | Ginataang Kalabasa (g) | Adobo (g) | Rice (g) | Sum (g) | Actual Total Calories | Predicted Total Calories
1 | 85 | 42 | 107 | 234 | 313.14 | 315.21
2 | 48 | 76 | 103 | 227 | 300.40 | 305.78
3 | 87 | 61 | 103 | 251 | 336.60 | 338.11
4 | 40 | 85 | 115 | 240 | 315.94 | 323.29
5 | 91 | 97 | 92 | 280 | 377.07 | 377.18
6 | 79 | 71 | 89 | 239 | 320.99 | 321.95
7 | 58 | 44 | 128 | 230 | 303.05 | 309.82
8 | 50 | 81 | 93 | 224 | 297.52 | 301.74
9 | 45 | 82 | 111 | 238 | 314.13 | 320.60
10 | 67 | 79 | 94 | 240 | 320.65 | 323.29
Average | | | | | 319.95 | 323.70
Table 6. Macronutrient content of FC3.

Trial | Ginataang Kalabasa (g) | Giniling (g) | Rice (g) | Sum (g) | Actual Total Calories | Predicted Total Calories
1 | 85 | 42 | 107 | 234 | 313.14 | 315.21
2 | 48 | 76 | 103 | 227 | 300.40 | 305.78
3 | 87 | 61 | 103 | 251 | 336.60 | 338.11
4 | 40 | 85 | 115 | 240 | 315.94 | 323.29
5 | 91 | 97 | 92 | 280 | 377.07 | 377.18
6 | 79 | 71 | 89 | 239 | 320.99 | 321.95
7 | 58 | 44 | 128 | 230 | 303.05 | 309.82
8 | 50 | 81 | 93 | 224 | 297.52 | 301.74
9 | 45 | 82 | 111 | 238 | 314.13 | 320.60
10 | 67 | 79 | 94 | 240 | 320.65 | 323.29
Average | | | | | 319.95 | 323.70