1 code implementation • International Conference on Learning Representations 2024 • Hangting Ye, Wei Fan, Xiaozhuang Song, Shun Zheng, He Zhao, Dandan Guo, Yi Chang
With the recent success of deep learning, many tabular machine learning (ML) methods based on deep networks (e.g., Transformer, ResNet) have achieved competitive performance on tabular benchmarks.
no code implementations • 1 Jan 2021 • Ruiying Lu, Bo Chen, Dandan Guo, Dongsheng Wang, Mingyuan Zhou
Moving beyond conventional Transformers, which ignore longer-range word dependencies and contextualize word representations only at the segment level, the proposed method captures both the global semantic coherence of all segments and global word co-occurrence patterns. It also enriches the representation of each token by adapting it to its local context, which is not limited to the segment the token resides in and can be flexibly defined according to the task.