Comparison of CNN Models With Transfer Learning in the Classification of Insect Pests
Angga Prima Syahputra(1), Alda Cendekia Siregar(2*), Rachmat Wahid Saleh Insani(3)
(1) Universitas Muhammadiyah Pontianak
(2) Universitas Muhammadiyah Pontianak
(3) Universitas Muhammadiyah Pontianak
(*) Corresponding Author
Abstract
Insect pests are a major problem to overcome in agriculture. The purpose of this research is to classify insect pests in the IP-102 dataset using several pre-trained CNN models and to determine which model classifies the insect pest data best. The method used is transfer learning with a fine-tuning approach. Transfer learning was chosen because it reuses the features and weights obtained during previous training, which reduces computation time and can increase accuracy. The models compared are Xception, MobileNetV3L, MobileNetV2, DenseNet-201, and InceptionV3. Fine-tuning and layer-freezing techniques are also used to improve the quality of the resulting models, making them more accurate and better suited to the problem at hand. This study uses 75,222 images across 102 classes. With fine-tuning, DenseNet-201 achieves an accuracy of 70%, Xception 69%, MobileNetV3L 68%, InceptionV3 67%, and MobileNetV2 66%. The study concludes that transfer learning with the fine-tuning approach produces the highest accuracy, 70%, with the DenseNet-201 model.
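The two-stage procedure the abstract describes (freeze the pre-trained backbone, train a new classification head, then unfreeze the top layers for fine-tuning) can be sketched in Keras as follows. This is a minimal illustration, not the authors' exact pipeline: the input size (224×224), the choice of MobileNetV2 as the backbone, the number of unfrozen layers (30), and the learning rates are all assumptions for the sake of the example.

```python
from tensorflow import keras

NUM_CLASSES = 102  # IP-102 contains 102 insect pest classes

# Load a backbone pre-trained on ImageNet, without its classification head.
base = keras.applications.MobileNetV2(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Stage 1: freeze the whole backbone and train only the new head.
base.trainable = False
inputs = keras.Input(shape=(224, 224, 3))
x = base(inputs, training=False)            # keep BatchNorm in inference mode
x = keras.layers.GlobalAveragePooling2D()(x)
outputs = keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer=keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=...)  # train the head

# Stage 2: fine-tune — unfreeze the top of the backbone, keep earlier
# layers frozen, and recompile with a much lower learning rate.
base.trainable = True
for layer in base.layers[:-30]:
    layer.trainable = False
model.compile(optimizer=keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=...)
```

The low learning rate in the second stage is the key design choice: it lets the unfrozen layers adapt to the insect images without destroying the ImageNet features that made transfer learning worthwhile in the first place.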
DOI: https://doi.org/10.22146/ijccs.80956
Copyright (c) 2023 IJCCS (Indonesian Journal of Computing and Cybernetics Systems)
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.