Deep neural networks excel at remote sensing image semantic segmentation, but existing methods, despite their sophistication, typically model only the channel and spatial dependencies within individual feature maps. This leads to a uniform treatment of diverse feature maps, impeding information exchange between them and limiting model efficacy. To address this, we introduce Feature Map Attention, which dynamically modulates weights based on the interdependencies among different feature maps. This fosters cross-map connections and feature fusion, enhancing the model's representational capability at minimal additional computational cost. We also incorporate multipath skip connections that efficiently transmit features at multiple scales from encoder to decoder, further improving overall performance. The resulting FMAMPN, a lightweight neural network, outperforms other state-of-the-art lightweight models across several datasets.
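The abstract does not specify the internals of the Feature Map Attention module, so the following is only a minimal sketch of one plausible reading: a squeeze-and-excitation-style mechanism applied *across* feature maps rather than within one, where each map is summarized by a global descriptor and the weights assigned to all maps are computed jointly from those descriptors. The function name `feature_map_attention` and every design choice below (global average pooling, softmax weighting) are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def feature_map_attention(maps):
    """Illustrative sketch (not the paper's method): reweight a set of
    feature maps using weights that depend on ALL maps jointly.

    maps: list of N arrays, each shaped (C, H, W).
    Returns a list of N reweighted arrays with the same shapes.
    """
    # Squeeze: one scalar descriptor per feature map via global average pooling.
    descriptors = np.array([m.mean() for m in maps])  # shape (N,)
    # Excite: softmax over the descriptors, so each map's weight is
    # determined by its relationship to the other maps, not in isolation.
    exp = np.exp(descriptors - descriptors.max())
    weights = exp / exp.sum()  # shape (N,), sums to 1
    # Modulate: scale each map by its learned-style weight before fusion.
    return [w * m for w, m in zip(weights, maps)]

# Example: three feature maps at the same resolution.
maps = [np.random.rand(8, 16, 16) for _ in range(3)]
out = feature_map_attention(maps)
```

In a trained network the descriptor and weighting steps would be learnable layers; this sketch only conveys the key idea of conditioning each map's weight on the whole set, which is what distinguishes feature-map-level attention from per-map channel or spatial attention.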