Academic Paper

Fisher Discriminant Triplet and Contrastive Losses for Training Siamese Networks
Document Type
Working Paper
Source
International Joint Conference on Neural Networks (IJCNN), IEEE, 2020
Subject
Computer Science - Machine Learning
Computer Science - Computer Vision and Pattern Recognition
Statistics - Machine Learning
Language
English
Abstract
The Siamese neural network is a powerful architecture for both feature extraction and metric learning. It usually consists of several networks that share weights. The Siamese concept is topology-agnostic and can use any neural network as its backbone. The two most popular loss functions for training these networks are the triplet and contrastive losses. In this paper, we propose two novel loss functions, named the Fisher Discriminant Triplet (FDT) and Fisher Discriminant Contrastive (FDC) losses. The former uses anchor-neighbor-distant triplets, while the latter uses pairs of anchor-neighbor and anchor-distant samples. The FDT and FDC loss functions are designed based on the statistical formulation of Fisher Discriminant Analysis (FDA), a linear subspace learning method. Our experiments on MNIST and two challenging, publicly available histopathology datasets show the effectiveness of the proposed loss functions.
Comment: Accepted (to appear) in the International Joint Conference on Neural Networks (IJCNN) 2020, IEEE, held as part of the IEEE World Congress on Computational Intelligence (WCCI) 2020
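For context, the sketch below shows the standard triplet and contrastive losses that the abstract names as the two most popular objectives for training Siamese networks. This is a minimal illustration in PyTorch (a framework choice assumed here, not stated in the record); the paper's proposed FDT and FDC losses replace these Euclidean-distance criteria with an FDA-based within-/between-class scatter objective, whose exact formulation is not given in this abstract.

```python
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet loss: pull the anchor toward the positive (neighbor)
    and push it away from the negative (distant) by at least `margin`."""
    d_ap = F.pairwise_distance(anchor, positive)   # (batch,)
    d_an = F.pairwise_distance(anchor, negative)   # (batch,)
    return F.relu(d_ap - d_an + margin).mean()

def contrastive_loss(x1, x2, y, margin=1.0):
    """Standard contrastive loss: y = 1 for similar (anchor-neighbor) pairs,
    y = 0 for dissimilar (anchor-distant) pairs."""
    d = F.pairwise_distance(x1, x2)
    return (y * d.pow(2) + (1 - y) * F.relu(margin - d).pow(2)).mean()

# Illustrative usage with embeddings from a shared (weight-tied) backbone;
# `backbone` is a hypothetical torch.nn.Module producing embedding vectors.
# emb_a, emb_p, emb_n = backbone(x_a), backbone(x_p), backbone(x_n)
# loss = triplet_loss(emb_a, emb_p, emb_n)
```

In both cases the same backbone network embeds every input, which is what makes the architecture "Siamese"; only the loss applied to the resulting embeddings differs.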