Urban Plants Classification Using Deep-Learning Methodology: A Case Study on a New Dataset
Abstract
1. Introduction
2. Related Work
3. Urban Planter Dataset
4. Case Study
4.1. Methods
4.2. Datasets for Transfer Learning
- ImageNet. ImageNet has been used for transfer learning in many tasks and domains of computer vision. All the DNNs we applied are available with weights pre-trained on ImageNet. The dataset contains about 1.2 million images classified into 1000 categories;
- Oxford102. In contrast to ImageNet, this dataset is much closer to the plant domain. It contains about 8000 images of flowers assigned to 102 species.
4.3. Experiment Scenarios
- Training the models from scratch, i.e., with random initialization, on the Urban Planter dataset (denoted by 0-TL);
- One-step transfer learning using Oxford102 (denoted by 1-TL-Ox), where models pre-trained on Oxford102 are trained and applied on Urban Planter;
- One-step transfer learning using ImageNet (denoted by 1-TL-IN), where models pre-trained on ImageNet are trained and applied on Urban Planter;
- Two-step transfer learning (denoted by 2-TL), where models pre-trained first on ImageNet and then on Oxford102 are trained and applied on Urban Planter.
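The four scenarios amount to different initialization chains for the same network before fine-tuning on Urban Planter. The toy NumPy sketch below is not the paper's code: the linear "backbone", throwaway "head", and synthetic arrays are illustrative stand-ins for the real CNNs and datasets, used only to show how the pre-training stages chain together:

```python
import numpy as np

rng = np.random.default_rng(0)

def new_backbone(n_in=8, n_feat=4):
    """Random initialization, as in the 0-TL (from-scratch) scenario."""
    return rng.normal(scale=0.1, size=(n_in, n_feat))

def pretrain(backbone, X, y, lr=0.01, epochs=100):
    """Toy stand-in for one pre-training stage: jointly fit the backbone
    and a throwaway head on (X, y) by gradient descent, then return the
    adapted backbone (the head is discarded, as in transfer learning)."""
    head = rng.normal(scale=0.1, size=backbone.shape[1])
    n = len(y)
    for _ in range(epochs):
        feats = X @ backbone           # features produced by the backbone
        err = feats @ head - y         # prediction error of the head
        head -= lr * feats.T @ err / n
        backbone -= lr * np.outer(X.T @ err, head) / n
    return backbone

# Synthetic stand-ins for the two source datasets.
X_in, y_in = rng.normal(size=(64, 8)), rng.normal(size=64)   # "ImageNet"
X_ox, y_ox = rng.normal(size=(64, 8)), rng.normal(size=64)   # "Oxford102"

b_0tl    = new_backbone()                                    # 0-TL
b_1tl_ox = pretrain(new_backbone(), X_ox, y_ox)              # 1-TL-Ox
b_1tl_in = pretrain(new_backbone(), X_in, y_in)              # 1-TL-IN
b_2tl    = pretrain(pretrain(new_backbone(), X_in, y_in),
                    X_ox, y_ox)                              # 2-TL
# Each backbone would then be fine-tuned on Urban Planter with a fresh head.
```

In the real experiments the backbones are the deep CNNs listed in Section 4.1, but the chaining structure is the same: 2-TL simply feeds the ImageNet-adapted weights into a second pre-training pass on Oxford102 before the final fine-tuning.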
4.4. Data Preprocessing
5. Results and Discussion
6. Limitations of Our Study
7. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Tan, W.N.; Sem, R.; Tan, Y.F. Blooming flower recognition by using eigenvalues of shape features. In Proceedings of the Sixth International Conference on Digital Image Processing (ICDIP 2014), Athens, Greece, 5–6 April 2014; Volume 9159, pp. 344–348.
- Tan, W.N.; Tan, Y.F.; Koo, A.C.; Lim, Y.P. Petals’ shape descriptor for blooming flowers recognition. In Proceedings of the Fourth International Conference on Digital Image Processing (ICDIP 2012), Kuala Lumpur, Malaysia, 7–8 April 2012; Volume 8334, pp. 693–698.
- Phyu, K.H.; Kutics, A.; Nakagawa, A. Self-adaptive feature extraction scheme for mobile image retrieval of flowers. In Proceedings of the 2012 Eighth International Conference on Signal Image Technology and Internet Based Systems, Sorrento, Italy, 25–29 November 2012; pp. 366–373.
- Hsu, T.H.; Lee, C.H.; Chen, L.H. An interactive flower image recognition system. Multimed. Tools Appl. 2011, 53, 53–73.
- Hong, S.W.; Choi, L. Automatic recognition of flowers through color and edge based contour detection. In Proceedings of the 2012 3rd International Conference on Image Processing Theory, Tools and Applications (IPTA), Istanbul, Turkey, 15–18 October 2012; pp. 141–146.
- Cho, S.Y.; Lim, P.T. A novel virus infection clustering for flower images identification. In Proceedings of the 18th International Conference on Pattern Recognition (ICPR’06), Hong Kong, China, 20–24 August 2006; Volume 2, pp. 1038–1041.
- Cho, S.Y. Content-based structural recognition for flower image classification. In Proceedings of the 2012 7th IEEE Conference on Industrial Electronics and Applications (ICIEA), Singapore, 18–20 July 2012; pp. 541–546.
- Apriyanti, D.H.; Arymurthy, A.M.; Handoko, L.T. Identification of orchid species using content-based flower image retrieval. In Proceedings of the 2013 International Conference on Computer, Control, Informatics and its Applications (IC3INA), Jakarta, Indonesia, 19–21 November 2013; pp. 53–57.
- Nilsback, M.E.; Zisserman, A. Automated flower classification over a large number of classes. In Proceedings of the 2008 Sixth Indian Conference on Computer Vision, Graphics & Image Processing, Bhubaneswar, India, 16–19 December 2008; pp. 722–729.
- Nilsback, M.E.; Zisserman, A. A visual vocabulary for flower classification. In Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’06), New York, NY, USA, 17–22 June 2006; Volume 2, pp. 1447–1454.
- Qi, W.; Liu, X.; Zhao, J. Flower classification based on local and spatial visual cues. In Proceedings of the 2012 IEEE International Conference on Computer Science and Automation Engineering (CSAE), Zhangjiajie, China, 25–27 May 2012; Volume 3, pp. 670–674.
- Zawbaa, H.M.; Abbass, M.; Basha, S.H.; Hazman, M.; Hassenian, A.E. An automatic flower classification approach using machine learning algorithms. In Proceedings of the 2014 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Delhi, India, 24–27 September 2014; pp. 895–901.
- Machhour, A.; Zouhri, A.; El Mallahi, M.; Lakhliai, Z.; Tahiri, A.; Chenouni, D. Plants classification using neural shifted Legendre-Fourier moments. In Proceedings of the International Conference on Smart Information & Communication Technologies, Oujda, Morocco, 26–28 September 2019; pp. 149–153.
- Wäldchen, J.; Mäder, P. Plant species identification using computer vision techniques: A systematic literature review. Arch. Comput. Methods Eng. 2018, 25, 507–543.
- Anubha Pearline, S.; Sathiesh Kumar, V.; Harini, S. A study on plant recognition using conventional image processing and deep learning approaches. J. Intell. Fuzzy Syst. 2019, 36, 1997–2004.
- Yosinski, J.; Clune, J.; Bengio, Y.; Lipson, H. How transferable are features in deep neural networks? arXiv 2014, arXiv:1411.1792.
- Xia, X.; Xu, C.; Nan, B. Inception-v3 for flower classification. In Proceedings of the 2017 2nd International Conference on Image, Vision and Computing (ICIVC), Chengdu, China, 2–4 June 2017; pp. 783–787.
- Hiary, H.; Saadeh, H.; Saadeh, M.; Yaqub, M. Flower classification using deep convolutional neural networks. IET Comput. Vis. 2018, 12, 855–862.
- Wu, Y.; Qin, X.; Pan, Y.; Yuan, C. Convolution neural network based transfer learning for classification of flowers. In Proceedings of the 2018 IEEE 3rd International Conference on Signal and Image Processing (ICSIP), Shenzhen, China, 13–15 July 2018; pp. 562–566.
- Gavai, N.R.; Jakhade, Y.A.; Tribhuvan, S.A.; Bhattad, R. MobileNets for flower classification using TensorFlow. In Proceedings of the 2017 International Conference on Big Data, IoT and Data Science (BID), Pune, India, 20–22 December 2017; pp. 154–158.
- Diaz, C.A.M.; Castaneda, E.E.M.; Vassallo, C.A.M. Deep learning for plant classification in precision agriculture. In Proceedings of the 2019 International Conference on Computer, Control, Informatics and its Applications (IC3INA), Tangerang, Indonesia, 23–24 October 2019; pp. 9–13.
- Van Hieu, N.; Hien, N.L.H. Recognition of plant species using deep convolutional feature extraction. Int. J. Emerg. Technol. 2020, 11, 904–910.
- Van Hieu, N.; Hien, N.L.H. Automatic plant image identification of Vietnamese species using deep learning models. arXiv 2020, arXiv:2005.02832.
- Beikmohammadi, A.; Faez, K. Leaf classification for plant recognition with deep transfer learning. In Proceedings of the 2018 4th Iranian Conference on Signal Processing and Intelligent Systems (ICSPIS), Tehran, Iran, 25–27 December 2018; pp. 21–26.
- Ripley, B.D. Pattern Recognition and Neural Networks; Cambridge University Press: Cambridge, UK, 2007.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 8–10 June 2015; pp. 1–9.
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. In Proceedings of the International Conference on Machine Learning, Lille, France, 6–11 July 2015; pp. 448–456.
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the inception architecture for computer vision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2818–2826.
- Szegedy, C.; Ioffe, S.; Vanhoucke, V.; Alemi, A. Inception-v4, Inception-ResNet and the impact of residual connections on learning. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Volume 31.
- Chollet, F. Xception: Deep learning with depthwise separable convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1251–1258.
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708.
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861.
- Wei, X.S.; Song, Y.Z.; Mac Aodha, O.; Wu, J.; Peng, Y.; Tang, J.; Yang, J.; Belongie, S. Fine-grained image analysis with deep learning: A survey. IEEE Trans. Pattern Anal. Mach. Intell. 2021.
- Chen, W.Y.; Liu, Y.C.; Kira, Z.; Wang, Y.C.F.; Huang, J.B. A closer look at few-shot classification. In Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA, 6–9 May 2019.
| ID | Class | Scientific Name | Higher Classification | Habitat |
|---|---|---|---|---|
| 0 | Begonia Maculata | Begonia maculata | Begonia | Brazil |
| 1 | Coleus | Coleus | Ocimeae | Southeast Asia and Malaysia |
| 2 | Elephant’s Ear | Colocasia | Aroideae | Pacific Islands |
| 3 | House Leek | Sempervivum | Stonecrops | Sahara Desert and Caucasus |
| 4 | Jade Plant | Crassula ovata | Pigmyweeds | South Africa |
| 5 | Lucky Bamboo | Dracaena sanderiana | Dracaena | Southeast Asia |
| 6 | Moon Cactus | Gymnocalycium mihanovichii | Gymnocalycium | Tropical and subtropical America |
| 7 | Nerve Plant | Fittonia albivenis | Fittonia | South America |
| 8 | Paddle Plant | Kalanchoe luciae | Kalanchoideae | South Africa |
| 9 | Parlor Palm | Chamaedorea elegans | Chamaedorea | Southern Mexico and Guatemala |
| 10 | Poinsettia | Euphorbia pulcherrima | Euphorbia subg. Poinsettia | Central America |
| 11 | Sansevieria Ballyi | Sansevieria ballyi | Asparagaceae | Africa, Madagascar and southern Asia |
| 12 | String Of Banana | Senecio rowleyanus | Ragworts | South Africa |
| 13 | Venus Fly Trap | Dionaea muscipula | Dionaea | Carolinas |
| 14 | Zebra Cactus | Haworthia attenuata | Haworthiopsis | South Africa |
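For scripting against the dataset, the class list above can be encoded as a simple lookup table. The snippet below is illustrative only; the dictionary and helper names are ours, not part of the released dataset:

```python
# Class IDs, common names, and scientific names from the Urban Planter table.
URBAN_PLANTER_CLASSES = {
    0: ("Begonia Maculata", "Begonia maculata"),
    1: ("Coleus", "Coleus"),
    2: ("Elephant's Ear", "Colocasia"),
    3: ("House Leek", "Sempervivum"),
    4: ("Jade Plant", "Crassula ovata"),
    5: ("Lucky Bamboo", "Dracaena sanderiana"),
    6: ("Moon Cactus", "Gymnocalycium mihanovichii"),
    7: ("Nerve Plant", "Fittonia albivenis"),
    8: ("Paddle Plant", "Kalanchoe luciae"),
    9: ("Parlor Palm", "Chamaedorea elegans"),
    10: ("Poinsettia", "Euphorbia pulcherrima"),
    11: ("Sansevieria Ballyi", "Sansevieria ballyi"),
    12: ("String Of Banana", "Senecio rowleyanus"),
    13: ("Venus Fly Trap", "Dionaea muscipula"),
    14: ("Zebra Cactus", "Haworthia attenuata"),
}

def class_name(class_id: int) -> str:
    """Human-readable label for a predicted class ID."""
    common, scientific = URBAN_PLANTER_CLASSES[class_id]
    return f"{common} ({scientific})"
```

Such a mapping is convenient for turning a model's integer predictions back into readable labels when inspecting errors.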
| Network | Used Models | Architecture | Size | Params |
|---|---|---|---|---|
| VGGNet | VGG16 | 13 convolution and 3 fully connected layers | 113 MB | 138 M |
| VGGNet | VGG19 | 16 convolution and 3 fully connected layers | 153 MB | 144 M |
| Inception | Inception-v3 | 48 layers | 169 MB | 24 M |
| Inception | Inception-ResNet-v2 | 164 layers | 419 MB | 56 M |
| Xception | Xception | 36 convolution layers | 160 MB | 24 M |
| DenseNet | DenseNet201 | 5 convolution (201 total) layers | 144 MB | 20 M |
| MobileNet | MobileNet-v2 | 3 convolution (20 in total) layers | 19 MB | 13 M |
| Model | 0-TL | 1-TL-Ox | 1-TL-IN | 2-TL |
|---|---|---|---|---|
| Xception | 66.33% | 69.67% | 94.67% | 95.00% |
| Inception-ResNet-v2 | 74.33% | 72.00% | 93.33% | 93.67% |
| Inception-v3 | 71.67% | 62.33% | 91.67% | 90.00% |
| DenseNet201 | 63.33% | 66.33% | 96.00% | 94.67% |
| MobileNet-v2 | 6.67% | 41.67% | 86.33% | 83.67% |
| VGG19 | 51.67% | 59.00% | 70.67% | 75.00% |
| VGG16 | 62.00% | 57.67% | 80.67% | 81.00% |
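The accuracy table can be queried programmatically, for example to find the best model under each scenario. The values below are transcribed from the table; the dictionary and helper are an illustrative convenience, not the authors' evaluation code:

```python
# Test accuracy (%) per model and training scenario, from the results table.
RESULTS = {
    "Xception":            {"0-TL": 66.33, "1-TL-Ox": 69.67, "1-TL-IN": 94.67, "2-TL": 95.00},
    "Inception-ResNet-v2": {"0-TL": 74.33, "1-TL-Ox": 72.00, "1-TL-IN": 93.33, "2-TL": 93.67},
    "Inception-v3":        {"0-TL": 71.67, "1-TL-Ox": 62.33, "1-TL-IN": 91.67, "2-TL": 90.00},
    "DenseNet201":         {"0-TL": 63.33, "1-TL-Ox": 66.33, "1-TL-IN": 96.00, "2-TL": 94.67},
    "MobileNet-v2":        {"0-TL": 6.67,  "1-TL-Ox": 41.67, "1-TL-IN": 86.33, "2-TL": 83.67},
    "VGG19":               {"0-TL": 51.67, "1-TL-Ox": 59.00, "1-TL-IN": 70.67, "2-TL": 75.00},
    "VGG16":               {"0-TL": 62.00, "1-TL-Ox": 57.67, "1-TL-IN": 80.67, "2-TL": 81.00},
}

def best(scenario: str) -> tuple[str, float]:
    """Return the best-performing model and its accuracy for a scenario."""
    model = max(RESULTS, key=lambda m: RESULTS[m][scenario])
    return model, RESULTS[model][scenario]
```

For instance, `best("1-TL-IN")` picks out DenseNet201 and `best("2-TL")` picks out Xception, matching the top scores in the table: ImageNet-based pre-training dominates, and adding an Oxford102 step helps some models only marginally.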
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Cite As
Litvak, M.; Divekar, S.; Rabaev, I. Urban Plants Classification Using Deep-Learning Methodology: A Case Study on a New Dataset. Signals 2022, 3, 524-534. https://doi.org/10.3390/signals3030031