Scholarly Article

Auto-TabTransformer: Hierarchical Transformers for Self and Semi Supervised Learning in Tabular Data
Document Type
Conference
Source
2023 International Joint Conference on Neural Networks (IJCNN), pp. 1-8, Jun. 2023
Subject
Components, Circuits, Devices and Systems
Computing and Processing
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Training
Computer vision
Redundancy
Supervised learning
Neural networks
Self-supervised learning
Computer architecture
Tabular Data
Self Supervised Learning
Semi Supervised Learning
Transformers
Hierarchical Networks
Language
English
ISSN
2161-4407
Abstract
Self- and semi-supervised learning have shown promising results in language and computer vision but remain underexplored for tabular data. This paper explores self- and semi-supervised methods in the tabular setting. To this end, we propose Auto-TabTransformer, a method for training hierarchical transformers in a self- and semi-supervised setup using redundancy reduction. The technique addresses key aspects of self- and semi-supervised learning: feature encoding, the pre-training objective, the training methodology, and the neural architecture. Through extensive experiments on four publicly accessible datasets, we show that Auto-TabTransformer achieves state-of-the-art (SOTA) results in the low-labelled-data regime. We also conduct extensive ablation studies detailing the importance of each component used.
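The record does not include the paper's implementation, but redundancy-reduction pre-training objectives of this kind typically follow the Barlow Twins formulation: two corrupted views of the same batch are embedded by the encoder, and the cross-correlation matrix between the two sets of embeddings is driven toward the identity. The PyTorch sketch below illustrates that general formulation only; the function name, the lambda weight, and the random stand-in embeddings are assumptions for illustration, not taken from the paper.

    import torch

    def redundancy_reduction_loss(z_a, z_b, lam=5e-3):
        # Barlow Twins-style objective (illustrative, not the paper's exact loss):
        # push the cross-correlation matrix between two embedding views toward
        # the identity, so dimensions are invariant to corruption yet decorrelated.
        n = z_a.shape[0]
        # Standardize each embedding dimension across the batch.
        z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-8)
        z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-8)
        # Empirical d x d cross-correlation matrix.
        c = (z_a.T @ z_b) / n
        on_diag = (torch.diagonal(c) - 1).pow(2).sum()               # invariance term
        off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # redundancy term
        return on_diag + lam * off_diag

    # Usage sketch: z_a and z_b stand in for encoder outputs of two corrupted
    # views of the same tabular batch (256 rows, 64-dim embeddings).
    z_a, z_b = torch.randn(256, 64), torch.randn(256, 64)
    loss = redundancy_reduction_loss(z_a, z_b)

In a semi-supervised run, an objective of this form would typically be combined with a supervised loss on the labelled subset; how the two are weighted is not specified in this record.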