1 code implementation • 15 Apr 2021 • Daniel Hernandez Diaz, Siyang Qin, Reeve Ingle, Yasuhisa Fujii, Alessandro Bissacco
Unlike the more common Transformer-based models, this architecture can handle inputs of arbitrary length, a requirement for universal line recognition.
Ranked #2 on Handwritten Text Recognition on IAM (using extra training data)
no code implementations • 27 Sep 2018 • Daniel Hernandez Diaz, Antonio Khalil Moretti, Ziqiang Wei, Shreya Saxena, John Cunningham, Liam Paninski
In the case of sequential data, closed-form inference is possible when the transition and observation functions are linear (with Gaussian noise), as in the classical Kalman filter.
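The linear case mentioned above admits exact filtering via the Kalman recursions. Below is a minimal sketch, assuming a linear-Gaussian state-space model x_t = A x_{t-1} + w_t, y_t = C x_t + v_t; all matrix names and the toy data are illustrative, not taken from the paper.

```python
import numpy as np

def kalman_filter(ys, A, C, Q, R, mu0, P0):
    """Closed-form filtering for a linear-Gaussian state-space model.

    Returns the filtered means and covariances p(x_t | y_1..y_t).
    """
    mu, P = mu0, P0
    means, covs = [], []
    for y in ys:
        # Predict: propagate the previous posterior through the linear dynamics.
        mu_pred = A @ mu
        P_pred = A @ P @ A.T + Q
        # Update: condition on the new observation in closed form.
        S = C @ P_pred @ C.T + R             # innovation covariance
        K = P_pred @ C.T @ np.linalg.inv(S)  # Kalman gain
        mu = mu_pred + K @ (y - C @ mu_pred)
        P = (np.eye(len(mu)) - K @ C) @ P_pred
        means.append(mu)
        covs.append(P)
    return np.array(means), np.array(covs)

# Toy 1-D random walk observed with noise (hypothetical parameters).
rng = np.random.default_rng(0)
A = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[0.1]]); R = np.array([[1.0]])
xs = np.cumsum(rng.normal(scale=0.3, size=20))
ys = (xs + rng.normal(size=20)).reshape(-1, 1)
means, covs = kalman_filter(ys, A, C, Q, R, np.zeros(1), np.eye(1))
```

When the transition or observation functions are nonlinear, these updates no longer have a closed form, which is what motivates the approximate inference methods the paper studies.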