Deep Tabular Learning

TabTransformer

Introduced by Huang et al. in TabTransformer: Tabular Data Modeling Using Contextual Embeddings

TabTransformer is a deep tabular data modeling architecture for supervised and semi-supervised learning. It is built upon self-attention-based Transformers: the Transformer layers transform the embeddings of categorical features into robust contextual embeddings to achieve higher prediction accuracy.

As an overview, the architecture comprises a column embedding layer, a stack of $N$ Transformer layers, and a multi-layer perceptron (MLP). The contextual embeddings output by the Transformer layers are concatenated with the continuous features and fed into the MLP. The loss function is then minimized to learn all the parameters end to end.
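
To make the data flow concrete, below is a minimal PyTorch sketch of that pipeline. The class name, hyperparameter defaults, and the plain per-column embedding tables are illustrative assumptions rather than the authors' reference implementation (the paper's column embedding, for instance, also adds a shared per-column identifier component).

```python
import torch
import torch.nn as nn

class TabTransformer(nn.Module):
    """Sketch: column embeddings -> stack of N Transformer layers -> MLP."""

    def __init__(self, cat_cardinalities, n_continuous, embed_dim=32,
                 n_layers=6, n_heads=8, mlp_hidden=(128, 64), n_classes=2):
        super().__init__()
        # Column embedding: one lookup table per categorical column
        # (simplified; the paper also adds a column-identifier component).
        self.embeddings = nn.ModuleList(
            nn.Embedding(card, embed_dim) for card in cat_cardinalities
        )
        # Stack of N Transformer layers over the categorical embeddings.
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=n_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Layer normalization of the continuous features before concatenation.
        self.cont_norm = nn.LayerNorm(n_continuous)
        # MLP over [flattened contextual embeddings ; continuous features].
        in_dim = len(cat_cardinalities) * embed_dim + n_continuous
        blocks, prev = [], in_dim
        for h in mlp_hidden:
            blocks += [nn.Linear(prev, h), nn.ReLU()]
            prev = h
        blocks.append(nn.Linear(prev, n_classes))
        self.mlp = nn.Sequential(*blocks)

    def forward(self, x_cat, x_cont):
        # x_cat: (batch, n_cat) integer category codes
        # x_cont: (batch, n_continuous) floats
        tokens = torch.stack(
            [emb(x_cat[:, i]) for i, emb in enumerate(self.embeddings)], dim=1
        )                                       # (batch, n_cat, embed_dim)
        contextual = self.transformer(tokens)   # contextual embeddings
        flat = contextual.flatten(start_dim=1)  # (batch, n_cat * embed_dim)
        x = torch.cat([flat, self.cont_norm(x_cont)], dim=1)
        return self.mlp(x)                      # logits fed to the loss


# Toy usage: three categorical columns, five continuous features.
model = TabTransformer(cat_cardinalities=[12, 7, 4], n_continuous=5)
logits = model(torch.randint(0, 4, (8, 3)), torch.randn(8, 5))  # shape (8, 2)
```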

Source: TabTransformer: Tabular Data Modeling Using Contextual Embeddings
