Visual Search Interactive Model for Artificial Intelligence Robotics Model for the Agricultural Field Analysis

Authors

  • Asmatullah Nashir, Assistant Professor, Department of Information Technology, Badakhshan University, Afghanistan.

DOI:

https://doi.org/10.69996/jcai.2024021

Keywords:

Auxiliary Field, Machine Learning, Clustering, k-means, Classification, Artificial Intelligence

Abstract

The Visual Search Interactive Model for Artificial Intelligence (AI) is designed to enhance the efficiency and effectiveness of visual data analysis across various applications. By leveraging advanced computer vision techniques and machine learning algorithms, this model enables AI systems to interpret and analyze visual information in real time, facilitating tasks such as object recognition, image classification, and scene understanding. The interactive nature of the model allows users to engage with the AI, refining searches and improving outcomes through iterative feedback. This paper introduces the Auxiliary Clustering k-means Machine Learning (AC k-means ML) model, designed to enhance agricultural efficiency through advanced data analysis and robotic integration. The study evaluates the performance of the AC k-means ML model on a dataset of 1,950 samples, achieving an overall accuracy of 91.5% and a precision of 89.2%. The F1 score averaged 88.6%, and the highest individual cluster accuracy reached 96% for Cluster 10. In addition to data classification, the model supported the completion of 250 tasks with a success rate of 92%, while maintaining an average task completion time of 15.4 minutes and an energy consumption of 0.5 kWh per task. The implementation of the AC k-means ML model resulted in a 15% increase in crop yield and estimated cost savings of $2,000. With an average user satisfaction score of 8.7 and an adaptability score of 9.0, the findings indicate that integrating machine learning and robotics significantly optimizes agricultural processes, promoting sustainability and efficiency in farming practices.
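To make the cluster-then-classify idea behind an auxiliary k-means pipeline concrete, the sketch below fits k-means to synthetic stand-in data and maps each cluster to its majority class before scoring accuracy, precision, and F1. It is a minimal illustration only: the synthetic features, the number of clusters, and the cluster-to-class mapping are assumptions for demonstration, not the paper's dataset or exact method.

```python
# Minimal sketch of an auxiliary clustering k-means classification pipeline.
# The data are synthetic stand-ins; only the sample count (1,950) and the
# use of 10 clusters mirror figures mentioned in the abstract.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import accuracy_score, precision_score, f1_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for field/sensor measurements.
X, y = make_blobs(n_samples=1950, centers=10, n_features=6, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Fit k-means on the training split.
kmeans = KMeans(n_clusters=10, n_init=10, random_state=42)
train_clusters = kmeans.fit_predict(X_train)

# Map each cluster to the majority class observed in training,
# turning unsupervised cluster assignments into class predictions.
cluster_to_label = {
    c: np.bincount(y_train[train_clusters == c]).argmax()
    for c in range(kmeans.n_clusters)
}
y_pred = np.array([cluster_to_label[c] for c in kmeans.predict(X_test)])

print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred, average="macro"))
print("F1 score :", f1_score(y_test, y_pred, average="macro"))
```

In this toy setting the cluster-to-class mapping is learned from labeled training data; in a field deployment the same step would typically rely on a small labeled subset or expert annotation of representative clusters.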


Published

2024-10-31

How to Cite

Asmatullah Nashir. (2024). Visual Search Interactive Model for Artificial Intelligence Robotics Model for the Agricultural Field Analysis. Journal of Computer Allied Intelligence (JCAI, ISSN: 2584-2676), 2(5), 1-16. https://doi.org/10.69996/jcai.2024021