Journal Article

A Multilayer Framework for Online Metric Learning
Document Type
Periodical
Source
IEEE Transactions on Neural Networks and Learning Systems, 34(10):6701-6713, Oct. 2023
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
General Topics for Engineers
Measurement
Extraterrestrial measurements
Nonhomogeneous media
Software algorithms
Software
Data models
Training
Interpretability
metric layer
nonlinearity
online metric learning (OML)
passive-aggressive (PA) strategy
Language
ISSN
2162-237X
2162-2388
Abstract
Online metric learning (OML) has been widely applied in classification and retrieval. It can automatically learn a suitable metric from data by requiring similar instances to be separated from dissimilar instances by a given margin. However, existing OML algorithms have limited performance in real-world classification, especially when data distributions are complex. To this end, this article proposes a multilayer framework for OML to capture the nonlinear similarities among instances. Unlike traditional OML, which can learn only one metric space, the proposed multilayer OML (MLOML) takes an OML algorithm as a metric layer and learns multiple hierarchical metric spaces, where each metric layer follows a nonlinear layer to handle complicated data distributions. Moreover, forward-propagation (FP) and backward-propagation (BP) strategies are employed to train the hierarchical metric layers. To build a metric layer of the proposed MLOML, a new Mahalanobis-based OML (MOML) algorithm is presented, built on the passive-aggressive (PA) strategy and a one-pass triplet construction strategy. Furthermore, by learning progressively and nonlinearly, MLOML achieves a stronger learning ability than traditional OML when available training data are limited. A theoretical analysis is provided to make the learning process explainable and theoretically grounded. The proposed MLOML enjoys several desirable properties, indeed learns a metric progressively, and performs well on benchmark datasets. Extensive experiments with different settings have been conducted to verify these properties.
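The PA-based metric update on triplets described in the abstract can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: the function names, the Frobenius-norm step size, and the eigenvalue-clipping PSD projection are our assumptions about one common way to realize a passive-aggressive Mahalanobis update.

```python
import numpy as np

def mahalanobis_sq(M, a, b):
    """Squared Mahalanobis distance (a - b)^T M (a - b)."""
    d = a - b
    return float(d @ M @ d)

def pa_triplet_update(M, anchor, pos, neg, margin=1.0, C=1.0):
    """One passive-aggressive step on a triplet (anchor, pos, neg).

    Passive: if the margin constraint already holds, M is unchanged.
    Aggressive: otherwise M moves just enough to satisfy the constraint,
    with the step capped by the aggressiveness parameter C.
    """
    loss = max(0.0, margin + mahalanobis_sq(M, anchor, pos)
                     - mahalanobis_sq(M, anchor, neg))
    if loss == 0.0:
        return M  # passive step: constraint satisfied
    dp = np.outer(anchor - pos, anchor - pos)
    dn = np.outer(anchor - neg, anchor - neg)
    G = dp - dn  # (sub)gradient of the hinge loss w.r.t. M
    denom = np.linalg.norm(G) ** 2
    if denom == 0.0:
        return M  # degenerate triplet: pos and neg coincide relative to anchor
    tau = min(C, loss / denom)
    M = M - tau * G
    # Project back onto the PSD cone so M remains a valid metric.
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T
```

With a large enough C, a single aggressive step makes the violated triplet satisfy the margin exactly, which is the characteristic closed-form behavior of PA updates; a multilayer variant would stack such updates with nonlinear layers in between and propagate errors through them.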