Mobile-based Primate Image Recognition using CNN

https://doi.org/10.22146/ijccs.65640

Nuruddin Wiranda(1*), Agfianto Eko Putra(2)

(1) Department of Computer Education, FKIP, ULM, Banjarmasin
(2) Department of Computer Science and Electronics, FMIPA UGM, Yogyakarta
(*) Corresponding Author

Abstract


Six of the world's 25 most endangered primate species are found in Indonesia: the orangutan, lutung, bekantan (proboscis monkey), Tarsius tumpara, kukang (slow loris), and simakobu. Three of the six live mainly on the island of Borneo (Kalimantan). One way to help preserve the primates of Kalimantan is to study how they can be identified. In this study, an Android application was developed that uses a convolutional neural network (CNN) to identify primate species found in the Kalimantan wetlands. A CNN extracts spatial features from primate images, which makes it well suited to image identification problems. The dataset used in this study is ImageNet, and the model used is MobileNet. The application was tested in two scenarios: with photos and with video recordings. Photos were taken directly and then reduced to a resolution of 256 x 256 pixels, while videos of approximately 10 to 30 seconds were recorded at a two-megapixel camera resolution. The application achieved an average accuracy of 93.6% on photos and 79% on video recordings. After the accuracy evaluation, a usability test using the System Usability Scale (SUS) was carried out; the SUS results indicate that the application is feasible to use.
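
To make the pipeline described above concrete, the following is a minimal illustrative sketch, not the authors' actual implementation: a MobileNet backbone pretrained on ImageNet is fine-tuned to classify primate photos and then converted for use in an Android app. The MobileNetV2 variant, the folder "primates/train" (one sub-folder per species), the number of classes, and all hyperparameters are assumptions made for illustration only, using TensorFlow/Keras.

# Illustrative sketch only -- not the authors' code.
import tensorflow as tf

IMG_SIZE = (256, 256)      # the abstract states photos are reduced to 256 x 256
NUM_CLASSES = 3            # assumed number of target species

# Integer-labelled training set built from the (hypothetical) folder structure.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "primates/train", image_size=IMG_SIZE, batch_size=32)

# Pretrained backbone with the ImageNet classification head removed.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False     # transfer learning: freeze the backbone

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # map pixels to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)

# Convert the trained model to TensorFlow Lite for deployment in an Android app.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("primates.tflite", "wb") as f:
    f.write(tflite_model)

For the usability evaluation, the abstract reports only that SUS was used; the questionnaire responses are not given here. As a reminder of how the standard SUS score is computed, each of the 10 items is answered on a 1-5 scale, odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score:

def sus_score(responses):
    """Standard SUS scoring for one respondent's 10 answers (each 1-5)."""
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5   # 0-100; ~68 is the commonly cited average
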


Keywords


Image Recognition; Mobile-based; CNN







Copyright (c) 2022 IJCCS (Indonesian Journal of Computing and Cybernetics Systems)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.





