Kleijkers, Yrsa
- Department of Ecology, Swedish University of Agricultural Sciences
Wildlife causes significant economic losses to Swedish agriculture through feeding in crops. Accurately assessing these losses is crucial for developing mitigation strategies and reducing conflicts between stakeholders. However, traditional ground-based surveys are labour-intensive, observer-dependent, spatially limited, and not easily scalable. Advances in remote sensing and artificial intelligence (AI) offer new opportunities for automated to semi-automated damage detection and mapping at very high spatial resolution. In this study, we developed a deep learning approach based on Convolutional Neural Networks (CNNs) applied to UAV-derived orthomosaics to discriminate between damage types. The workflow integrated four key steps: preprocessing UAV imagery into normalized image tiles and structured datasets; optimizing model behaviour through hyperparameter tuning; training the CNN with transfer learning, where dense layers were fitted to labelled damage data while convolutional layers remained frozen; and evaluating model performance with independent test sets. Performance metrics, including accuracy, precision, recall, and F1-score, revealed clear differences between wheat and grasslands, as well as between training strategies. In general, models trained on crop-specific datasets outperformed those trained on the full dataset, highlighting the importance of tailoring training data to individual crop types. Across both crops, the no-grid approaches consistently achieved stronger results than grid-based models, suggesting that preserving spatial context improves classification performance. Wheat models benefited more strongly from crop-specific training, showing a pronounced gain in classification reliability compared to grasslands, where improvements were present but more moderate.
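The transfer-learning step described above (dense layers fitted to labelled damage data while convolutional layers remain frozen) can be illustrated with a minimal sketch. This is not the study's implementation: the "frozen convolutional base" is approximated by a fixed, non-trainable feature extractor, the tile data and the three damage classes are synthetic, and only the dense softmax head is updated during training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the real data (all values here are invented):
# flattened image tiles and damage-type labels for three hypothetical classes.
n_tiles, tile_pixels, n_features, n_classes = 200, 64, 16, 3
X = rng.normal(size=(n_tiles, tile_pixels))
y = rng.integers(0, n_classes, size=n_tiles)

# "Frozen convolutional layers": a fixed projection that is never updated,
# standing in for a pretrained feature extractor.
W_frozen = rng.normal(size=(tile_pixels, n_features))
features = np.maximum(X @ W_frozen, 0.0)  # ReLU activations, kept fixed

# Trainable dense head: softmax classifier fitted by gradient descent.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
onehot = np.eye(n_classes)[y]
for _ in range(300):
    logits = features @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / n_tiles
    W -= 0.1 * (features.T @ grad)  # only the dense head is updated
    b -= 0.1 * grad.sum(axis=0)

acc = float((np.argmax(features @ W + b, axis=1) == y).mean())
print(f"training accuracy of the dense head: {acc:.2f}")
```

In a real workflow the frozen base would be a pretrained CNN and the head would be trained on labelled damage polygons; the sketch only shows the division between fixed and trainable parameters.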
When applied to full-field predictions, performance declined compared to validation polygons, indicating challenges in generalizing from controlled validation areas to more heterogeneous field conditions. Overall, the observed trends confirm that CNN-based approaches can capture relevant spectral and spatial features for damage type discrimination, with wheat classifications being particularly sensitive to training data design and quantity. These findings demonstrate the potential of CNN-based methods for UAV-assisted monitoring of crop damage and provide a foundation for scalable and semi-automated applications in precision agriculture.
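The evaluation metrics named in the abstract (accuracy, per-class precision and recall, F1-score) can be computed as follows. The class names and label vectors below are invented for illustration and do not come from the study's data; macro averaging over classes is one common convention, chosen here for the sketch.

```python
import numpy as np

# Hypothetical damage-type classes and toy true/predicted labels.
classes = ["no_damage", "trampling", "grazing"]
y_true = np.array([0, 0, 1, 1, 2, 2, 2, 0])
y_pred = np.array([0, 1, 1, 1, 2, 0, 2, 0])

# Accuracy: fraction of tiles assigned the correct class.
accuracy = float((y_true == y_pred).mean())

# Per-class precision, recall, and F1 from true/false positives and negatives.
f1s = []
for c in range(len(classes)):
    tp = np.sum((y_pred == c) & (y_true == c))
    fp = np.sum((y_pred == c) & (y_true != c))
    fn = np.sum((y_pred != c) & (y_true == c))
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)

macro_f1 = float(np.mean(f1s))
print(f"accuracy={accuracy:.2f}, macro-F1={macro_f1:.2f}")
# → accuracy=0.75, macro-F1=0.76
```

Comparing such scores per class (rather than accuracy alone) is what makes differences between damage types, and between wheat and grassland models, visible.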
Agriculture; CNN; Deep learning; Drone; UAV; Wildlife damage
Publisher: Department of Ecology, Swedish University of Agricultural Sciences
Artificial Intelligence
Agricultural Science
https://res.slu.se/id/publ/130946