Search Results for author: Vlad Schogol

Found 5 papers, 0 papers with code

Distilling weighted finite automata from arbitrary probabilistic models

no code implementations WS 2019 Ananda Theertha Suresh, Brian Roark, Michael Riley, Vlad Schogol

Weighted finite automata (WFA) are often used to represent probabilistic models, such as n-gram language models, since they are efficient for recognition tasks in time and space.
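
As a rough illustration only (not code from this paper or its follow-up below), the sketch assumes a toy bigram language model represented as a WFA over the tropical semiring: states are context words, arc labels are next words, and weights are negative log probabilities. All vocabulary items and probabilities here are invented for the example.

```python
# Minimal sketch: a bigram language model stored as a weighted finite automaton.
# States are previous words, arcs are labeled with the next word, and weights
# are negative log probabilities (tropical semiring). Values are illustrative.
import math

# Hypothetical bigram probabilities P(next | previous).
bigram_probs = {
    ("<s>", "the"): 0.6,
    ("<s>", "a"): 0.4,
    ("the", "cat"): 0.5,
    ("the", "dog"): 0.5,
    ("a", "cat"): 1.0,
    ("cat", "</s>"): 1.0,
    ("dog", "</s>"): 1.0,
}

# arcs[state][label] = (next_state, weight)
arcs = {}
for (prev, nxt), p in bigram_probs.items():
    arcs.setdefault(prev, {})[nxt] = (nxt, -math.log(p))

def score(words):
    """Total path weight (negative log probability) of a sentence in the WFA."""
    state, total = "<s>", 0.0
    for w in words + ["</s>"]:
        if state not in arcs or w not in arcs[state]:
            return math.inf  # no matching arc: zero probability
        state, weight = arcs[state][w]
        total += weight
    return total

print(score(["the", "cat"]))  # -log(0.6 * 0.5 * 1.0) ≈ 1.20
```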

Latin script keyboards for South Asian languages with finite-state normalization

no code implementations WS 2019 Lawrence Wolf-Sonkin, Vlad Schogol, Brian Roark, Michael Riley

The use of the Latin script for text entry of South Asian languages is common, even though there is no standard orthography for these languages in the script.

Transliteration

Approximating probabilistic models as weighted finite automata

no code implementations CL (ACL) 2021 Ananda Theertha Suresh, Brian Roark, Michael Riley, Vlad Schogol

Weighted finite automata (WFA) are often used to represent probabilistic models, such as $n$-gram language models, since they are efficient for recognition tasks in time and space.

No Need for a Lexicon? Evaluating the Value of the Pronunciation Lexica in End-to-End Models

no code implementations 5 Dec 2017 Tara N. Sainath, Rohit Prabhavalkar, Shankar Kumar, Seungji Lee, Anjuli Kannan, David Rybach, Vlad Schogol, Patrick Nguyen, Bo Li, Yonghui Wu, Zhifeng Chen, Chung-Cheng Chiu

However, there has been little previous work comparing phoneme-based versus grapheme-based sub-word units in the end-to-end modeling framework, to determine whether the gains from such approaches are primarily due to the new probabilistic model, or from the joint learning of the various components with grapheme-based units.

Language Modelling
