Latent Tree Language Model

In this paper we introduce the Latent Tree Language Model (LTLM), a novel approach to language modeling that encodes the syntax and semantics of a given sentence as a tree of word roles. The learning phase iteratively updates the trees by moving nodes according to Gibbs sampling...
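To make the learning phase concrete, the sketch below shows a generic collapsed Gibbs sampler over per-token latent roles. This is an illustrative assumption, not the paper's actual LTLM sampler: here each word token is assigned one of `num_roles` roles, and a role's conditional weight for a word is proportional to its smoothed word count under that role; the tree structure and the role-transition terms of the real model are omitted.

```python
import random
from collections import defaultdict

def gibbs_resample_roles(sentences, num_roles=5, iters=20, alpha=0.5, seed=0):
    """Minimal collapsed Gibbs sampler over per-token latent roles.

    Hypothetical simplification of LTLM learning: resample each token's
    role from a conditional proportional to (count of word under role +
    alpha), after removing the token's own current assignment.
    """
    rng = random.Random(seed)
    counts = defaultdict(lambda: defaultdict(int))  # role -> word -> count
    role_totals = defaultdict(int)                  # role -> total tokens

    # Initialize every token's role uniformly at random.
    assignments = []
    for sent in sentences:
        roles = [rng.randrange(num_roles) for _ in sent]
        for w, r in zip(sent, roles):
            counts[r][w] += 1
            role_totals[r] += 1
        assignments.append(roles)

    vocab_size = len({w for sent in sentences for w in sent})

    for _ in range(iters):
        for sent, roles in zip(sentences, assignments):
            for i, w in enumerate(sent):
                # Remove the token's current assignment from the counts.
                r_old = roles[i]
                counts[r_old][w] -= 1
                role_totals[r_old] -= 1
                # Conditional weight of each role given the other tokens.
                weights = [
                    (counts[r][w] + alpha)
                    / (role_totals[r] + alpha * vocab_size)
                    for r in range(num_roles)
                ]
                # Sample a new role and restore the counts.
                r_new = rng.choices(range(num_roles), weights=weights)[0]
                counts[r_new][w] += 1
                role_totals[r_new] += 1
                roles[i] = r_new
    return assignments

sents = [["the", "cat", "sat"], ["the", "dog", "ran"]]
roles = gibbs_resample_roles(sents)
```

Each sweep removes one token's assignment, recomputes the conditional over roles from the remaining counts, and resamples; in LTLM the analogous move updates a node's position in the role tree rather than a flat role label.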
