A1 Refereed original research article in a scientific journal
Peatland pixel-level classification via multispectral, multiresolution and multisensor data using convolutional neural network
Authors: Zelioli, Luca; Farahnakian, Fahimeh; Middleton, Maarit; Pitkänen, Timo P.; Tuominen, Sakari; Nevalainen, Paavo; Pohjankukka, Jonne; Heikkonen, Jukka
Publisher: Elsevier BV
Publication year: 2025
Journal: Ecological Informatics
Journal name in source: Ecological Informatics
Article number: 103233
Volume: 90
ISSN: 1574-9541
eISSN: 1878-0512
DOI: https://doi.org/10.1016/j.ecoinf.2025.103233
Web address: https://doi.org/10.1016/j.ecoinf.2025.103233
Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/499465113
High-resolution mapping of boreal peatlands is crucial for greenhouse gas inventories, ecological monitoring, and sustainable land management. However, accurately classifying peatland ecotypes at large scales remains challenging due to complex phenological changes, dense tree canopies, water table level variations, and the mosaicked structure of vegetation communities typical of these landscapes. To address these challenges, we propose a novel multi-modal convolutional neural network (CNN) architecture designed specifically for pixel-level peatland classification. The motivation behind this research stems from the need for improved accuracy in peatland site type and fertility level mapping, which is vital for effective environmental decision-making.

The core strategy of our method is a late fusion architecture that integrates multi-source remote sensing (RS) data, including optical imagery, synthetic aperture radar (SAR), airborne laser scanning (ALS), and multi-source national forest inventory (MS-NFI) datasets. These diverse data sources, characterized by different spatial resolutions, are fused in a way that preserves their spatial integrity, enabling richer feature extraction for classification tasks. Additionally, a sliding-window approach is applied to manage the multi-resolution datasets, enhancing pixel-wise classification by preserving spatial and contextual relationships.

We evaluated the proposed architecture across three diverse peatland zones in Finland, demonstrating its capability to generalize across varying ecological conditions. Experimental results show classification accuracies for peatland site types and fertility levels ranging from 36.6% to 55.0%, highlighting the effectiveness of our approach even with limited labeled training samples. Canopy height models, Sentinel-2 bands, and Sentinel-1 bands emerged as the most influential data sources for accurate classification.
Our findings underscore the potential of integrating multi-source RS data with advanced CNN architectures for large-scale peatland mapping. Future work will focus on incorporating LiDAR-derived vegetation structural indices, hyperspectral RS data, and expanding the training dataset to further enhance classification performance.
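To illustrate the two ideas the abstract names, the sketch below shows (a) a sliding-window extraction that gathers a context patch around each pixel and (b) late fusion, where each data source passes through its own feature branch before the branch outputs are concatenated for a shared classifier. This is a toy NumPy illustration under assumed shapes, not the paper's actual CNN: the random linear "branches" stand in for the per-source convolutional extractors, and the source names in the comments (Sentinel-1/2) are only examples of the modalities mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

def sliding_windows(raster, size):
    """Extract a (size x size) window centred on every interior pixel.

    raster: (H, W, C) array for one data source; returns (N, size, size, C),
    where N is the number of interior pixels. An odd `size` is assumed.
    """
    h, w, _ = raster.shape
    r = size // 2
    wins = [raster[i - r:i + r + 1, j - r:j + r + 1]
            for i in range(r, h - r) for j in range(r, w - r)]
    return np.stack(wins)

def branch_features(windows, n_feat, rng):
    """Toy per-source 'branch': flatten each window and project it through a
    random linear map plus ReLU (a stand-in for a small CNN extractor)."""
    flat = windows.reshape(len(windows), -1)
    w = rng.normal(size=(flat.shape[1], n_feat))
    return np.maximum(flat @ w, 0.0)

def late_fusion_logits(sources, n_classes, rng):
    """Late fusion: run each source through its own branch, concatenate the
    per-source features, then apply one shared linear classifier."""
    feats = np.concatenate(
        [branch_features(s, 8, rng) for s in sources], axis=1)
    w = rng.normal(size=(feats.shape[1], n_classes))
    return feats @ w

# Two mock sources at different resolutions, so different window sizes are
# used to cover the same set of 8 x 8 = 64 target pixels:
optical = sliding_windows(rng.normal(size=(12, 12, 4)), 5)  # e.g. Sentinel-2
sar     = sliding_windows(rng.normal(size=(10, 10, 2)), 3)  # e.g. Sentinel-1

logits = late_fusion_logits([optical, sar], n_classes=6, rng=rng)
print(logits.shape)  # one score vector per pixel: (64, 6)
```

Because fusion happens after the branches, each source keeps its native window size and channel count; only the flattened feature vectors need compatible lengths at the concatenation step.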
Downloadable publication: This is an electronic reprint of the original article.
Funding information in the publication:
This work is part of the Advances in soil information (MaaTi) project funded by the Ministry of Agriculture and Forestry of Finland (2021–2022, funding decision VN/27416/2020-MMM-2). The authors wish to acknowledge CSC – IT Center for Science, Finland, for computational resources, and the MaaTi project management and steering group for constructive comments during the work. The TerraSAR-X and RADARSAT-2 data were supplied through the European Space Agency’s Third party mission proposal no. 36096 “Remote sensing as a tool for mapping and evaluating peatlands and peatland carbon stock in Northern Finland; Radarsat-2” and by Maarit Middleton in 2017.