Published Online: August 2025
Product Name: The IUP Journal of Telecommunications
Product Type: Article
Product Code: IJCT020825
DOI: 10.71329/IUPJTC/2025.17.3.20-32
Author Name: Renju John and Johann Jose Issac
Availability: YES
Subject/Domain: Engineering
Download Format: PDF
Pages: 20-32
Human-wildlife conflict (HWC) in biodiversity hotspots like Wayanad demands real-time, intelligent solutions that work reliably under field constraints. This paper presents an end-to-end wildlife detection pipeline using YOLOv8, built from the ground up with a hybrid dataset of over 15,000 images and 35 hours of field video, covering species like elephants, leopards, and gaurs. The system was trained and optimized on NVIDIA A100 GPUs and deployed on edge devices like Jetson Nano and Xavier NX, achieving up to 93.9% mean average precision (mAP@0.5) and 24.7 FPS in real-time conditions. Field-tested across multiple terrains and lighting conditions, the system sustained solar-powered operation, delivered alerts with sub-500 ms latency, and maintained a false positive rate under 8%. A comparative analysis with YOLOv4 and YOLOv5 confirmed YOLOv8's superior precision, efficiency, and edge-readiness. The study marks a shift from lab-bench AI to boots-on-the-ground deployment, showcasing how intelligent vision systems can enable proactive conservation and human safety in high-risk ecological zones.
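To illustrate the kind of edge inference loop such a pipeline implies, the sketch below runs a YOLOv8 detector over a camera stream and raises an alert when a target species is detected above a confidence threshold. It assumes the open-source Ultralytics YOLO Python API; the weight file, species list, confidence threshold, and alerting channel are placeholders for illustration, not the artifacts described in the paper.

```python
import time
import cv2
from ultralytics import YOLO  # pip install ultralytics

# Hypothetical fine-tuned weights and target classes (placeholders).
WEIGHTS = "yolov8s_wildlife.pt"
TARGET_SPECIES = {"elephant", "leopard", "gaur"}
CONF_THRESHOLD = 0.5

model = YOLO(WEIGHTS)

def detect_and_alert(frame):
    """Run one detection pass and return (detections, latency in ms)."""
    start = time.perf_counter()
    results = model.predict(frame, conf=CONF_THRESHOLD, verbose=False)
    latency_ms = (time.perf_counter() - start) * 1000.0

    hits = []
    for box in results[0].boxes:
        label = model.names[int(box.cls)]
        if label in TARGET_SPECIES:
            hits.append((label, float(box.conf)))
    return hits, latency_ms

cap = cv2.VideoCapture(0)  # camera feed at the forest fringe
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    detections, latency_ms = detect_and_alert(frame)
    if detections:
        # Placeholder for the field alerting channel (SMS, LoRa, siren, etc.).
        print(f"ALERT: {detections} ({latency_ms:.0f} ms)")
```

On a Jetson-class device, the same loop would typically run against a TensorRT-exported model (e.g., `model.export(format="engine")`) to reach the frame rates reported for edge deployment.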
Conservation in biodiversity hotspots like Wayanad demands more than passive monitoring; it requires intelligent, responsive systems that can operate under real-world constraints. Traditional wildlife surveillance methods, whether manual patrols or passive camera traps, often fall short in delivering timely and actionable insights. This study introduces an AI-powered, edge-deployable wildlife detection system designed specifically for conflict-prone forest fringes. Unlike theoretical models tested in labs or simulations, the system was built from the ground up with a pragmatic focus: high precision, real-time alerts, low power usage, and cost-efficiency. The study evaluated multiple versions of the YOLO object detection architecture across diverse edge devices, benchmarking performance on real-field data. The models were assessed not only on mean Average Precision (mAP) but also on frame rate (FPS), latency, power draw, and resilience to weather and lighting conditions.
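To make the benchmarking criteria concrete, the following sketch measures per-frame latency and effective FPS for several detector variants on a recorded field clip. It is an illustrative harness under the assumption of the Ultralytics API: the model list and clip path are placeholders, YOLOv4 would require a separate toolchain, and power draw would be logged independently with on-device utilities such as tegrastats on Jetson hardware.

```python
import time
import cv2
from ultralytics import YOLO

# Illustrative candidates and test clip (placeholders, not the paper's artifacts).
CANDIDATE_MODELS = ["yolov8n.pt", "yolov8s.pt", "yolov8m.pt"]
VIDEO_CLIP = "field_clip.mp4"

def benchmark(weights, clip, warmup=10):
    """Return mean per-frame latency (ms) and effective FPS for one model."""
    model = YOLO(weights)
    cap = cv2.VideoCapture(clip)
    latencies = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        start = time.perf_counter()
        model.predict(frame, verbose=False)
        elapsed = time.perf_counter() - start
        frame_idx += 1
        if frame_idx > warmup:  # discard warm-up frames before averaging
            latencies.append(elapsed)
    cap.release()
    mean_latency = sum(latencies) / len(latencies)
    return {"model": weights,
            "latency_ms": mean_latency * 1000.0,
            "fps": 1.0 / mean_latency}

for weights in CANDIDATE_MODELS:
    print(benchmark(weights, VIDEO_CLIP))
```

Accuracy metrics such as mAP@0.5 would be computed separately on an annotated validation split (e.g., via the Ultralytics `model.val()` routine) rather than from the timing loop above.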