A Scale Sequence Object-based Convolutional Neural Network (SS-OCNN) for crop classification from fine spatial resolution remotely sensed imagery

Huapeng Li*, Ce Zhang, Yong Zhang, Shuqing Zhang, Xiaohui Ding, Peter M. Atkinson

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review

29 Citations (Scopus)

Abstract

The highly dynamic nature of agro-ecosystems in space and time usually leads to high intra-class variance and low inter-class separability in fine spatial resolution (FSR) remotely sensed imagery. This makes crop mapping from FSR imagery extremely challenging for traditional classifiers, which rely essentially on spectral information. To mine effectively the rich spectral and spatial information in FSR imagery, this paper proposes a Scale Sequence Object-based Convolutional Neural Network (SS-OCNN) that classifies images at the object level, taking segmented objects (crop parcels) as the basic units of analysis and thus ensuring that the boundaries between crop parcels are delineated precisely. The segmented objects are subsequently classified using a CNN model integrated with an automatically generated scale sequence of input patch sizes. This scale sequence fuses effectively the features learned at different scales by transforming progressively the information extracted at small scales to larger scales. The effectiveness of the SS-OCNN was investigated using two heterogeneous agricultural areas covered by FSR SAR and optical imagery, respectively. Experimental results revealed that the SS-OCNN consistently achieved the most accurate classification results. The SS-OCNN thus provides a new paradigm for crop classification over heterogeneous areas using FSR imagery and has wide application prospects.
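The core idea of the scale sequence — co-centred input patches of increasing size around each segmented object, processed from small to large — can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the example patch sizes, and the centring convention are assumptions chosen for illustration only; the paper generates its scale sequence automatically and fuses the patches inside a CNN.

```python
import numpy as np

def scale_sequence_patches(image, center, scales=(16, 32, 48)):
    """Extract co-centred square patches of increasing size (a scale
    sequence) around an object's centroid.

    Hypothetical helper for illustration: the actual SS-OCNN derives
    its patch sizes automatically rather than from a fixed tuple.
    """
    r, c = center
    patches = []
    for s in scales:
        half = s // 2
        # Reflect-pad so patches near the image border keep the
        # requested size instead of being truncated.
        padded = np.pad(image, ((half, half), (half, half)), mode="reflect")
        # In padded coordinates the centroid sits at (r + half, c + half),
        # so this slice is an s-by-s window centred on the object.
        patches.append(padded[r:r + s, c:c + s])
    return patches

# Usage: one 100x100 single-band image, object centroid at (50, 50).
img = np.arange(100 * 100, dtype=float).reshape(100, 100)
patches = scale_sequence_patches(img, (50, 50))
print([p.shape for p in patches])  # three patches, small to large
```

In the paper these patches are not used independently: the CNN transforms features extracted at the smallest scale progressively toward the larger scales, and the fused representation labels the whole object so that parcel boundaries from the segmentation are preserved.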

Original language: English
Pages (from-to): 1528-1546
Number of pages: 19
Journal: International Journal of Digital Earth
Volume: 14
Issue number: 11
Publication status: Published - 30 Nov 2021

Bibliographical note

Funding Information:
This work was supported by the National Natural Science Foundation of China (41301465), the Capital Construction Fund of Jilin Province (2021C045-2), and the Open Fund of State Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University (grant number 20R04). The OCNN approach was developed during a PhD studentship ‘Deep Learning in massive area, multi-scale resolution remotely sensed imagery’ (NO. EAA7369), sponsored by Lancaster University and Ordnance Survey (the national mapping agency of Great Britain). Ordnance Survey owns the intellectual property arising from the project, together with a US patent pending: ‘Object Based Convolutional Neural Network’ (US application number 16/156044). Lancaster University wishes to thank Ordnance Survey for permission to publish this paper and for the supply of aerial imagery and the supporting geospatial data which facilitated the PhD (which is protected as Crown copyright).

Publisher Copyright:
© 2021 Informa UK Limited, trading as Taylor & Francis Group.

Keywords

  • CNNs
  • crop classification
  • image classification
  • multi-scale deep learning
  • object-based mapping
