Paper ID | SAM-9.5
Paper Title | A META-LEARNING FRAMEWORK FOR FEW-SHOT CLASSIFICATION OF REMOTE SENSING SCENE
Authors | Pei Zhang, Yunpeng Bai, Dong Wang, Northwestern Polytechnical University, China; Bendu Bai, Xi’an University of Posts and Telecommunications, China; Ying Li, Northwestern Polytechnical University, China
Session | SAM-9: Detection and Classification |
Location | Gather.Town |
Session Time | Thursday, 10 June, 16:30 - 17:15
Presentation Time | Thursday, 10 June, 16:30 - 17:15
Presentation | Poster
Topic | Sensor Array and Multichannel Signal Processing: [RAS-DTCL] Target detection, classification, localization
Abstract | While CNN-based methods have achieved remarkable success in remote sensing (RS) scene classification over the past few years, they suffer from the demand for large amounts of labeled training data. The bottleneck in prediction accuracy has shifted from data processing limits toward the scarcity of ground-truth samples, which are usually collected manually by experienced experts. In this work, we provide a meta-learning framework for few-shot classification of RS scenes. Under the umbrella of meta-learning, we show that it is possible to learn substantial information about a new category from only one or five labeled samples. The proposed method is based on Prototypical Networks with a pre-training stage and a learnable similarity metric. Experimental results show that our method outperforms three state-of-the-art few-shot algorithms and one typical CNN-based method, D-CNN, on two challenging datasets: NWPU-RESISC45 and RSD46-WHU.
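The listing does not include code; the sketch below is only an illustrative, hypothetical rendering of a Prototypical-Network-style few-shot episode with a learnable similarity metric, assuming a PyTorch setup. The module names (ConvEmbedding, LearnableMetric, episode_loss), the toy backbone, and all hyperparameters are placeholders, not the authors' implementation, and the pre-training stage mentioned in the abstract is omitted.

```python
# Hypothetical sketch of an N-way K-shot episode in the style of Prototypical
# Networks with a learnable similarity metric. NOT the paper's actual code.
import torch
import torch.nn as nn


class ConvEmbedding(nn.Module):
    """Small CNN backbone that maps an image to a feature vector (placeholder)."""
    def __init__(self, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, out_dim)

    def forward(self, x):
        return self.fc(self.net(x).flatten(1))


class LearnableMetric(nn.Module):
    """Scores (query, prototype) pairs with a small MLP instead of a fixed Euclidean distance."""
    def __init__(self, dim=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, queries, prototypes):
        # queries: [Q, D], prototypes: [N, D] -> similarity logits [Q, N]
        Q, N = queries.size(0), prototypes.size(0)
        pairs = torch.cat(
            [queries.unsqueeze(1).expand(Q, N, -1), prototypes.unsqueeze(0).expand(Q, N, -1)],
            dim=-1,
        )
        return self.mlp(pairs).squeeze(-1)


def episode_loss(embed, metric, support_x, support_y, query_x, query_y, n_way):
    """One N-way K-shot episode: build class prototypes from the support set,
    then classify queries by their learned similarity to each prototype."""
    z_support = embed(support_x)                       # [N*K, D]
    z_query = embed(query_x)                           # [Q, D]
    prototypes = torch.stack(
        [z_support[support_y == c].mean(0) for c in range(n_way)]
    )                                                  # [N, D]
    logits = metric(z_query, prototypes)               # [Q, N]
    return nn.functional.cross_entropy(logits, query_y)


if __name__ == "__main__":
    # Toy 5-way 1-shot episode with random tensors standing in for RS scene patches.
    n_way, k_shot, n_query = 5, 1, 15
    embed, metric = ConvEmbedding(), LearnableMetric()
    support_x = torch.randn(n_way * k_shot, 3, 64, 64)
    support_y = torch.arange(n_way).repeat_interleave(k_shot)
    query_x = torch.randn(n_query, 3, 64, 64)
    query_y = torch.randint(0, n_way, (n_query,))
    loss = episode_loss(embed, metric, support_x, support_y, query_x, query_y, n_way)
    loss.backward()  # an episodic meta-training step (optimizer.step()) would follow
    print(float(loss))
```

In this sketch, class prototypes are simply the mean support embeddings, and the fixed Euclidean distance of standard Prototypical Networks is replaced by a small MLP that scores (query, prototype) pairs, which is one plausible reading of a "learnable similarity metric"; the paper's actual metric design may differ.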