Academic Paper
Auto-TabTransformer: Hierarchical Transformers for Self and Semi Supervised Learning in Tabular Data
Document Type
Conference
Source
2023 International Joint Conference on Neural Networks (IJCNN), pp. 1-8, Jun 2023
ISSN
2161-4407
Abstract
Self- and semi-supervised learning have shown promising results in language and computer vision but remain underexplored for tabular data. This paper explores self- and semi-supervised methods for tabular data. To this end, we propose Auto-TabTransformer, a method for training hierarchical transformers in a self- and semi-supervised setup using redundancy reduction. The technique focuses on key aspects of self- and semi-supervised learning: feature encoding, the pre-training objective, the training methodology, and the neural architecture. Through extensive experiments on four publicly accessible datasets, we show that Auto-TabTransformer achieves state-of-the-art (SOTA) results in the low-labelled-data regime. We also conduct extensive ablation studies detailing the importance of each component.
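The "redundancy reduction" pre-training objective named in the abstract is commonly formulated as a Barlow Twins-style cross-correlation loss between two embedded views of the same batch. The sketch below is an illustrative assumption about that style of objective, not the paper's actual implementation; the function name and the weight `lam` are hypothetical.

```python
import numpy as np

def redundancy_reduction_loss(z1, z2, lam=5e-3):
    """Barlow Twins-style loss between two views of batch embeddings.

    z1, z2: arrays of shape (batch, dim). The loss pushes the
    cross-correlation matrix of the two views toward the identity:
    diagonal terms toward 1 (invariance), off-diagonal toward 0
    (redundancy reduction). `lam` weights the off-diagonal term.
    """
    # Standardize each embedding dimension across the batch.
    z1 = (z1 - z1.mean(axis=0)) / (z1.std(axis=0) + 1e-8)
    z2 = (z2 - z2.mean(axis=0)) / (z2.std(axis=0) + 1e-8)
    n = z1.shape[0]
    c = z1.T @ z2 / n                       # (dim, dim) cross-correlation
    on_diag = ((np.diag(c) - 1.0) ** 2).sum()
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lam * off_diag
```

With identical views the diagonal of the correlation matrix is already 1, so the invariance term vanishes and only the small off-diagonal penalty remains; distinct augmented views of the same rows drive both terms during pre-training.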