no code implementations • ICLR 2018 • Ke Zhai, Huan Wang
We propose a novel framework that adaptively adjusts the dropout rates of a deep neural network based on a Rademacher complexity bound.
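A minimal, hypothetical sketch of the general idea, not the paper's actual algorithm: treat per-layer dropout retain probabilities as trainable parameters and add a regularizer that stands in for a Rademacher-complexity bound (here assumed to scale with the retain probabilities and layer weight norms). All names and the exact form of the penalty are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveDropoutNet(nn.Module):
    """Two-layer net whose dropout retain probabilities are learned."""

    def __init__(self, in_dim=784, hidden=256, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)
        # Unconstrained logits mapped to retain probabilities via sigmoid.
        self.keep_logit1 = nn.Parameter(torch.tensor(2.0))  # sigmoid(2) ~ 0.88
        self.keep_logit2 = nn.Parameter(torch.tensor(2.0))

    def forward(self, x):
        p1 = torch.sigmoid(self.keep_logit1)
        p2 = torch.sigmoid(self.keep_logit2)
        h = F.relu(self.fc1(x))
        if self.training:
            # Bernoulli mask with the learned retain probability; gradients to
            # p1 flow only through the 1/p1 rescaling and the penalty below.
            mask1 = (torch.rand_like(h) < p1).float()
            h = h * mask1 / p1
        out = self.fc2(h)
        if self.training:
            mask2 = (torch.rand_like(out) < p2).float()
            out = out * mask2 / p2
        return out

    def complexity_penalty(self):
        # Assumed surrogate for a Rademacher-complexity bound: retain
        # probabilities times layer weight norms, so lowering the retain
        # probability shrinks the effective capacity term.
        p1 = torch.sigmoid(self.keep_logit1)
        p2 = torch.sigmoid(self.keep_logit2)
        return p1 * self.fc1.weight.norm() * p2 * self.fc2.weight.norm()


# Usage: total loss = task loss + lambda * complexity penalty.
model = AdaptiveDropoutNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = F.cross_entropy(model(x), y) + 1e-3 * model.complexity_penalty()
loss.backward()
opt.step()
```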
no code implementations • TACL 2014 • Ke Zhai, Jordan Boyd-Graber, Shay B. Cohen
Adaptor grammars are a flexible, powerful formalism for defining nonparametric, unsupervised models of grammar productions.
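The "adaptor" in an adaptor grammar is a nonparametric cache (typically a Pitman-Yor process) over a nonterminal, letting whole previously generated subtrees be reused as single units. Below is a minimal sketch of that caching idea, using the simpler Chinese-restaurant-process special case and a toy base generator in place of a real PCFG; it is illustrative only and not the paper's inference procedure.

```python
import random
from collections import Counter


def base_word():
    # Stand-in for a base PCFG expansion of some nonterminal: a random
    # "word" over {a, b}, purely for illustration.
    return "".join(random.choice("ab") for _ in range(random.randint(1, 4)))


class CRPAdaptor:
    """Chinese-restaurant-process cache over a base generator: frequently
    generated items are increasingly likely to be reused as units."""

    def __init__(self, base_sample, concentration=1.0):
        self.base_sample = base_sample
        self.alpha = concentration
        self.cache = Counter()  # previously generated items and their counts
        self.total = 0

    def sample(self):
        # With prob total/(total+alpha) reuse a cached item in proportion to
        # its count; otherwise draw a fresh item from the base distribution.
        if self.total > 0 and random.random() < self.total / (self.total + self.alpha):
            items, counts = zip(*self.cache.items())
            item = random.choices(items, weights=counts, k=1)[0]
        else:
            item = self.base_sample()
        self.cache[item] += 1
        self.total += 1
        return item


adaptor = CRPAdaptor(base_word, concentration=0.5)
print([adaptor.sample() for _ in range(10)])  # exhibits rich-get-richer reuse
```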