2 code implementations • CoNLL 2018 • WooJin Chung, Sheng-Fu Wang, Samuel R. Bowman
Tree-structured neural network architectures for sentence encoding draw inspiration from the approach to semantic composition generally seen in formal linguistics, and have shown empirical improvements over comparable sequence models by doing so.
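To make the idea concrete, here is a minimal sketch (not the paper's model) of tree-structured composition: each word gets a vector, and a single learned matrix recursively combines child vectors bottom-up along a binary parse tree, mirroring semantic composition in formal linguistics. The vocabulary, dimension `D`, and weight matrix `W` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # toy embedding dimension (assumption: real models use hundreds)

# Hypothetical leaf embeddings for a tiny vocabulary
embed = {w: rng.normal(size=D) for w in ["the", "cat", "sat"]}

# One composition matrix shared across all internal tree nodes
W = rng.normal(size=(D, 2 * D))

def compose(tree):
    """Encode a binary parse tree as a single vector.

    A tree is either a word string (leaf) or a (left, right) pair.
    """
    if isinstance(tree, str):
        return embed[tree]
    left, right = tree
    # Concatenate child encodings and apply the shared composition function
    return np.tanh(W @ np.concatenate([compose(left), compose(right)]))

# Encode the parse ((the cat) sat)
sentence_vec = compose((("the", "cat"), "sat"))
```

The recursion makes the encoder's computation graph follow the sentence's syntactic structure, which is the property sequence models such as plain LSTMs lack.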