Polyglot Neural Language Models: A Case Study in Cross-Lingual Phonetic Representation Learning

NAACL 2016 · Yulia Tsvetkov, Sunayana Sitaram, Manaal Faruqui, Guillaume Lample, Patrick Littell, David Mortensen, Alan W Black, Lori Levin, Chris Dyer

We introduce polyglot language models, recurrent neural network models trained to predict symbol sequences in many different languages using shared representations of symbols and conditioning on typological information about the language to be predicted. We apply these to the problem of modeling phone sequences---a domain in which universal symbol inventories and cross-linguistically shared feature representations are a natural fit…
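The abstract's core idea, a single recurrent model over a universal phone inventory whose parameters are shared across languages while a per-language typology vector conditions each prediction, can be sketched as follows. This is a minimal illustrative forward pass, not the authors' implementation; the inventory, dimensions, and typology vectors are all invented for the example.

```python
import numpy as np

# Sketch (NOT the paper's code) of a polyglot phone-level language model:
# one shared phone embedding table for all languages, with the target
# language's typology vector concatenated to the RNN input at every step.
rng = np.random.default_rng(0)

PHONE_INVENTORY = ["p", "t", "k", "a", "i", "u"]   # toy universal symbol set
V, E, T, H = len(PHONE_INVENTORY), 8, 4, 16        # vocab, embed, typology, hidden

emb = rng.normal(0, 0.1, (V, E))      # shared phone embeddings
W_x = rng.normal(0, 0.1, (H, E + T))  # input weights (embedding + typology)
W_h = rng.normal(0, 0.1, (H, H))      # recurrent weights
W_o = rng.normal(0, 0.1, (V, H))      # projection to next-phone logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def phone_probs(phone_ids, typology):
    """Next-phone distributions for a sequence, conditioned on a
    fixed-length typology feature vector for the language."""
    h = np.zeros(H)
    out = []
    for idx in phone_ids:
        x = np.concatenate([emb[idx], typology])  # shared symbol + language info
        h = np.tanh(W_x @ x + W_h @ h)            # Elman-style recurrence
        out.append(softmax(W_o @ h))
    return np.array(out)

# Two hypothetical languages share all parameters; only typology differs.
lang_a = rng.normal(0, 1, T)
probs = phone_probs([0, 3, 1], lang_a)  # e.g. the sequence /p a t/
```

Because every parameter except the typology vector is shared, low-resource languages can, in principle, borrow statistical strength from phonetically similar high-resource ones.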



