Deep metric learning establishes a distance metric over data to measure similarity. Metric learning methods map an input image to a representative feature space in which semantically similar samples lie close together and dissimilar samples lie far apart. The use of metric learning in fine-grained recognition has been widely studied in recent years. Fine-Grained Recognition (FGR) focuses on categorizing hard-to-distinguish classes, such as bird species or car models. In FGR datasets, the intra-class variance is high while the inter-class variance is low, which makes annotation challenging and leads to erroneous labels. Labeling such data is particularly costly in defense applications, since it must be performed by domain experts. The performance of metric learning methods depends directly on the loss function used during training. Loss functions fall into two categories: pair-based and proxy-based approaches. A proxy is a representative of a class distribution in the feature space. While pair-based loss functions utilize data-to-data relations, proxy-based loss functions exploit data-to-proxy relations. In this paper, we analyze the effect of label noise on open-set fine-grained recognition performance. Pair-based and proxy-based methods are evaluated on three widely adopted benchmark datasets: CUB-200-2011, Stanford Cars 196, and FGVC-Aircraft.
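To make the pair-based versus proxy-based distinction concrete, the sketch below contrasts a generic contrastive (data-to-data) loss with a Proxy-NCA-style (data-to-proxy) loss in PyTorch. This is a minimal illustration of the two loss families, not the specific losses evaluated in the paper; the function names, the margin value, and the assumption that proxies are indexed by class label are illustrative choices.

```python
# Minimal sketch of the two loss families; assumes embeddings of shape (N, D),
# integer labels of shape (N,), and one proxy per class of shape (C, D).
import torch
import torch.nn.functional as F

def pair_based_contrastive_loss(embeddings, labels, margin=0.5):
    """Data-to-data: pulls same-label pairs together and pushes
    different-label pairs apart beyond a margin."""
    dists = torch.cdist(embeddings, embeddings)          # pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)    # positive-pair mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos = dists[same & ~eye]                             # positive pairs (excluding self)
    neg = dists[~same]                                   # negative pairs
    return pos.pow(2).mean() + F.relu(margin - neg).pow(2).mean()

def proxy_based_loss(embeddings, labels, proxies):
    """Data-to-proxy: each sample is attracted to its own class proxy and
    repelled from all other proxies (Proxy-NCA-style formulation)."""
    dists = torch.cdist(F.normalize(embeddings), F.normalize(proxies))
    # Softmax over negated distances, cross-entropy against the true class.
    return F.cross_entropy(-dists, labels)
```

Because the proxy-based loss compares each sample only to C learned proxies rather than to every other sample in the batch, it is the formulation usually cited as being less sensitive to individual mislabeled pairs, which is why the two families are compared under label noise here.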