Early Detection and Diagnosis of Oral Cancer Using Deep Neural Network

Authors

  • K. Vinay Kumar, Department of CSE, Kakatiya Institute of Technology and Science, Warangal, Telangana, 506001, India
  • Sumanaswini Palakurthy, Department of CSE, Kakatiya Institute of Technology and Science, Warangal, Telangana, 506001, India
  • Sri Harsha Balijadaddanala, Department of CSE, Kakatiya Institute of Technology and Science, Warangal, Telangana, 506001, India
  • Sharmila Reddy Pappula, Department of CSE, Kakatiya Institute of Technology and Science, Warangal, Telangana, 506001, India
  • Anil Kumar Lavudya, Department of CSE, Kakatiya Institute of Technology and Science, Warangal, Telangana, 506001, India

DOI:

https://doi.org/10.69996/jcai.2024008

Keywords:

Convolutional Neural Network (CNN), Oral Squamous Cell Carcinoma (OSCC), Oral Cancer Detection, Deep Learning (DL), Inception-ResNet-V2

Abstract

Oral squamous cell carcinoma (OSCC) is a malignancy that disrupts the normal layered structure and membranes of the tissues surrounding the mouth. Recent advances in Deep Learning (DL) for biomedical image classification have enabled automated early diagnosis of oral cancer from histopathological images. This work aims to automate the classification of benign and malignant oral biopsy histopathological images by applying a DL-based convolutional neural network (CNN) model to the initial analysis of OSCC. The Inception-ResNet-V2 CNN model is selected under a transfer learning approach, and additional layers are incorporated into this pretrained model to enhance OSCC detection. The performance of the modified models is evaluated on a repository of oral cancer histopathology images. We examine the modified structure of the pretrained Inception-ResNet-V2 model and propose a DL-CNN model built on it. With an accuracy of 91.78%, the proposed model outperforms the compared approaches across the reported performance metrics.
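The abstract describes a standard transfer-learning pattern: a pretrained Inception-ResNet-V2 backbone with additional classification layers appended for the benign-versus-malignant task. The sketch below illustrates that pattern in TensorFlow/Keras; the added layer sizes, dropout rate, optimizer settings, and data pipeline are assumptions for illustration only, not the authors' published configuration.

```python
# Illustrative sketch of transfer learning with Inception-ResNet-V2
# (pretrained backbone + added classification layers), as outlined in
# the abstract. Hyperparameters here are placeholder assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (299, 299)  # default input resolution for Inception-ResNet-V2

# Pretrained backbone with ImageNet weights; the original classifier head is removed.
base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,)
)
base.trainable = False  # freeze backbone weights during the initial transfer-learning phase

# Additional layers appended for binary (benign vs. malignant) OSCC classification.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),  # benign vs. malignant
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Training would then use histopathology image datasets, e.g.:
# model.fit(train_ds, validation_data=val_ds, epochs=...)
```

In this pattern, freezing the backbone first and fine-tuning only the appended layers is a common choice when the histopathology dataset is small relative to ImageNet; the upper backbone blocks can be unfrozen later for fine-tuning.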

Published

2024-04-30

Issue

Vol. 2 No. 2 (2024)

Section

Research Articles

How to Cite

K. Vinay Kumar, Sumanaswini Palakurthy, Sri Harsha Balijadaddanala, Sharmila Reddy Pappula, & Anil Kumar Lavudya. (2024). Early Detection and Diagnosis of Oral Cancer Using Deep Neural Network. Journal of Computer Allied Intelligence (JCAI, ISSN: 2584-2676), 2(2), 22-34. https://doi.org/10.69996/jcai.2024008