Tropentag, September 11 - 13, 2024, Vienna
"Explore opportunities... for managing natural resources and a better life for all"
Enhancing crop systems classification through fusion of Planetscope and Sentinel-2A imagery using deep learning
Henry Kyalo, Tobias Landmann
International Centre of Insect Physiology and Ecology (icipe), Kenya
Abstract
Crop systems classification is important for applications such as agricultural monitoring, food security assessment, yield prediction, and crop health monitoring. Accurate classification allows for targeted interventions such as precise irrigation and fertiliser application, resulting in improved yields. It also supports drought monitoring as well as the monitoring of crop rotation patterns and pest infestations. Accurate crop systems classification further allows areas to be identified where crop diversification can be implemented. Growing a variety of crops improves resilience to pests, diseases, and climate change, reducing the risk of crop failure and thereby ensuring a more stable food supply. In addition, accurate crop systems classification helps policymakers and agricultural planners allocate resources more efficiently, which leads to higher yields per unit of input and increases overall food production.
In this study, we propose a novel approach that fuses Sentinel-2A and Planetscope satellite imagery through a ResNet deep learning model to enhance the spatial and spectral resolution of Sentinel-2A for agricultural crop systems classification. For crop systems classification, image fusion aids the identification of different cropping systems and is also useful in regions with frequent cloud cover. The trained model is used to predict Sentinel-2A images with Planetscope data as the reference, yielding enhanced Sentinel-2A images with finer spatial detail.
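As an illustration only, the sketch below shows how a ResNet-style network can be trained to map upsampled Sentinel-2A patches towards co-registered Planetscope reference patches. It is not the authors' implementation; the framework (PyTorch), band counts, patch size, loss function, and training details are assumptions made for the example.

```python
# Illustrative sketch of ResNet-style Sentinel-2A / Planetscope fusion (assumed setup).
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a skip connection (the basic ResNet idea)."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)  # residual (skip) connection

class FusionResNet(nn.Module):
    """Maps an upsampled Sentinel-2A patch to a sharpened, Planetscope-like patch."""
    def __init__(self, s2_bands: int = 4, ps_bands: int = 4, width: int = 64, depth: int = 8):
        super().__init__()
        self.head = nn.Conv2d(s2_bands, width, kernel_size=3, padding=1)
        self.body = nn.Sequential(*[ResidualBlock(width) for _ in range(depth)])
        self.tail = nn.Conv2d(width, ps_bands, kernel_size=3, padding=1)

    def forward(self, s2_up):
        # s2_up: Sentinel-2A patch bilinearly resampled to the Planetscope grid (~3 m)
        return self.tail(self.body(self.head(s2_up)))

# One training step against co-registered Planetscope reference patches (placeholder tensors).
model = FusionResNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

s2_up = torch.rand(8, 4, 128, 128)   # placeholder batch: upsampled Sentinel-2A patches
ps_ref = torch.rand(8, 4, 128, 128)  # placeholder batch: Planetscope reference patches
optimiser.zero_grad()
loss = loss_fn(model(s2_up), ps_ref)
loss.backward()
optimiser.step()
```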
The predicted images are overlaid with ground truth data, and spectral indices are extracted at each point and used for multivariate time-series crop systems classification. Model performance is evaluated using overall accuracy and F1-score. Thereafter, maps of the area of interest are generated, showing the intensity and locations where monocrop and intercrop systems are most likely practised.
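The following sketch illustrates, under stated assumptions, how point-based spectral-index time series can be assembled from a stack of fused images and evaluated with overall accuracy and F1-score. The NDVI-only feature set, band order, random-forest baseline, and placeholder data are assumptions for the example; the study itself uses a multivariate time-series classifier.

```python
# Illustrative sketch: NDVI time series at ground-truth points -> classification and metrics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    return (nir - red) / (nir + red + 1e-6)

# Placeholder data: 12 monthly fused images, 4 bands (B, G, R, NIR), 256x256 pixels
stack = np.random.rand(12, 4, 256, 256).astype(np.float32)
rows = np.random.randint(0, 256, size=500)   # ground-truth point rows
cols = np.random.randint(0, 256, size=500)   # ground-truth point columns
labels = np.random.randint(0, 2, size=500)   # 0 = monocrop, 1 = intercrop (placeholder)

# Feature matrix: one NDVI value per date per point -> shape (n_points, n_dates)
features = np.stack(
    [ndvi(stack[t, 2, rows, cols], stack[t, 3, rows, cols]) for t in range(stack.shape[0])],
    axis=1,
)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=42
)
clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)
y_pred = clf.predict(X_test)
print("Overall accuracy:", accuracy_score(y_test, y_pred))
print("F1-score:", f1_score(y_test, y_pred, average="weighted"))
```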
This study demonstrates the effectiveness of our proposed approach in improving crop systems classification through fusion of satellite imagery with deep learning techniques. The results highlight the potential for enhanced agricultural monitoring and decision-making, ultimately contributing to food security and sustainable agricultural practices.
Keywords: Classification, crop systems, data fusion, deep learning, food security
Contact Address: Tobias Landmann, International Centre of Insect Physiology and Ecology (icipe), Icipe str. 1, 00100 Nairobi, Kenya, e-mail: tlandmann@icipe.org