Fieldwise Factorized Networks for Tabular Data Classification

29 Sep 2021 · Chen Almagor, Yedid Hoshen

Tabular data is one of the most common data types in machine learning; however, deep neural networks have not yet convincingly outperformed classical baselines on such datasets. In this paper, we first investigate the theoretical connection between neural networks and factorization machines, and present fieldwise factorized neural networks (F2NN), a neural network framework designed for tabular classification. Our framework learns high-dimensional field representations via a low-rank factorization and handles both categorical and numerical fields. Furthermore, we show that simply by changing the penultimate activation function, the framework recovers a range of popular tabular classification methods. We evaluate our method against state-of-the-art tabular baselines, including tree-based and deep neural network methods, on a range of tasks. Our findings suggest that our theoretically grounded yet simple and shallow neural network architecture achieves results as strong as, or better than, those of more complex methods.
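
To make the abstract's description concrete, below is a minimal PyTorch sketch of a fieldwise factorized classifier. It is an assumption-based illustration, not the authors' implementation: the class name FieldwiseFactorizedNet, the use of an FM-style pairwise inner-product term over per-field embeddings, the treatment of numerical fields as value-scaled learned vectors, and the placement of the configurable penultimate activation are all guesses inferred from the abstract.

# Hypothetical sketch in the spirit of F2NN: per-field low-rank embeddings,
# second-order interactions via the standard factorization-machine identity,
# and a configurable penultimate activation. Not the authors' code.
import torch
import torch.nn as nn

class FieldwiseFactorizedNet(nn.Module):
    def __init__(self, cat_cardinalities, num_numerical, emb_dim=16, penultimate=None):
        super().__init__()
        # One embedding table per categorical field (low-rank field representation).
        self.cat_embs = nn.ModuleList([nn.Embedding(c, emb_dim) for c in cat_cardinalities])
        # Each numerical field contributes its value times a learned direction.
        self.num_embs = nn.Parameter(torch.randn(num_numerical, emb_dim) * 0.01)
        # First-order (linear) terms.
        self.cat_linear = nn.ModuleList([nn.Embedding(c, 1) for c in cat_cardinalities])
        self.num_linear = nn.Linear(num_numerical, 1)
        self.bias = nn.Parameter(torch.zeros(1))
        # Swapping this activation changes the induced model family (assumption).
        self.penultimate = penultimate if penultimate is not None else nn.Identity()

    def forward(self, x_cat, x_num):
        # x_cat: (B, n_cat) integer indices; x_num: (B, n_num) floats
        embs = [emb(x_cat[:, i]) for i, emb in enumerate(self.cat_embs)]
        embs += [x_num[:, j:j + 1] * self.num_embs[j] for j in range(self.num_embs.shape[0])]
        E = torch.stack(embs, dim=1)  # (B, n_fields, emb_dim)
        # Sum of pairwise inner products, computed with the FM identity:
        # sum_{f<g} <e_f, e_g> = 0.5 * (||sum_f e_f||^2 - sum_f ||e_f||^2)
        sum_sq = E.sum(dim=1).pow(2).sum(dim=1)
        sq_sum = E.pow(2).sum(dim=(1, 2))
        interactions = 0.5 * (sum_sq - sq_sum)
        linear = self.num_linear(x_num).squeeze(-1) + sum(
            lin(x_cat[:, i]).squeeze(-1) for i, lin in enumerate(self.cat_linear))
        logit = self.bias + linear + self.penultimate(interactions)
        return logit  # feed to BCEWithLogitsLoss for binary classification

With the identity activation, the interaction term reduces to a factorization-machine score over field embeddings; substituting other activations would change the induced model family, which is the kind of swap the abstract alludes to when it says the framework recovers a range of popular tabular classification methods.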
