ICCV 2019 • Hongwei Ge, Zehang Yan, Kai Zhang, Mingde Zhao, Liang Sun
During training, the forward and backward LSTMs encode the succeeding and preceding words, respectively, into their hidden states, constructing the whole sentence simultaneously and in a complementary manner.
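A minimal PyTorch sketch of the bidirectional encoding idea described above: one LSTM reads the sentence left to right while a second reads it right to left, so their hidden states carry complementary (preceding vs. succeeding) context at every position. This is not the authors' implementation; the class name, dimensions, and vocabulary size are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BiDirectionalSentenceEncoder(nn.Module):
    """Sketch of a forward/backward LSTM pair over a word sequence
    (hypothetical names and sizes, not the paper's code)."""

    def __init__(self, vocab_size: int, embed_dim: int = 256, hidden_dim: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Two separate LSTMs, one per reading direction.
        self.forward_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.backward_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, tokens: torch.Tensor):
        # tokens: (batch, seq_len) integer word ids
        emb = self.embed(tokens)                # (batch, seq_len, embed_dim)
        fwd_states, _ = self.forward_lstm(emb)  # left-to-right pass
        # Reverse the time axis, run the second LSTM, then reverse back
        # so both state tensors are aligned per position.
        bwd_states, _ = self.backward_lstm(emb.flip(dims=[1]))
        bwd_states = bwd_states.flip(dims=[1])
        # At position t, fwd_states[:, t] summarizes the words before t
        # and bwd_states[:, t] the words after t: complementary views of
        # the whole sentence, computed simultaneously.
        return fwd_states, bwd_states

# Usage sketch on a toy batch of word ids.
encoder = BiDirectionalSentenceEncoder(vocab_size=10000)
fwd, bwd = encoder(torch.randint(0, 10000, (2, 12)))
print(fwd.shape, bwd.shape)  # torch.Size([2, 12, 512]) twice
```

Keeping the two directions as separate LSTMs, rather than a single `nn.LSTM(bidirectional=True)`, makes each direction's hidden states individually accessible, which matches the description of the two encoders playing distinct, complementary roles during training.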