no code implementations • 22 Mar 2024 • Zhichao Wei, Qingkun Su, Long Qin, Weizhi Wang
CLS embeddings serve two purposes: they augment the text embeddings, and, together with patch embeddings, they derive a small number of detail-rich subject embeddings; both are efficiently integrated into the diffusion model through a well-designed multimodal cross-attention mechanism.
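The abstract does not spell out the attention computation, but the general pattern — diffusion latents attending over a context built by concatenating text and subject embeddings — can be sketched as follows. All shapes and names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multimodal_cross_attention(latents, text_emb, subject_emb):
    """Toy single-head cross-attention: diffusion latents (queries)
    attend over text and subject embeddings concatenated into one
    key/value context. Projection matrices are omitted for brevity."""
    context = np.concatenate([text_emb, subject_emb], axis=0)  # (Lt+Ls, d)
    d = latents.shape[-1]
    scores = latents @ context.T / np.sqrt(d)                  # (N, Lt+Ls)
    weights = softmax(scores, axis=-1)
    return weights @ context                                   # (N, d)

rng = np.random.default_rng(0)
out = multimodal_cross_attention(rng.normal(size=(4, 8)),   # 4 latent tokens
                                 rng.normal(size=(5, 8)),   # 5 text tokens
                                 rng.normal(size=(3, 8)))   # 3 subject tokens
```

In practice each stream would pass through learned query/key/value projections; the sketch shows only how the two embedding sources share one attention context.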
no code implementations • 18 Mar 2024 • Zhenghao Zhang, Zuozhuo Dai, Long Qin, Weizhi Wang
Large-scale text-to-video models have shown remarkable abilities, but their direct application in video editing remains challenging due to limited available datasets.
1 code implementation • 21 Nov 2023 • Zuozhuo Dai, Zhenghao Zhang, Yao Yao, Bingxue Qiu, Siyu Zhu, Long Qin, Weizhi Wang
Image animation is a key task in computer vision that aims to generate dynamic visual content from a static image.
no code implementations • 3 Aug 2021 • Xinmeng Li, Wansen Wu, Long Qin, Quanjun Yin
Evaluating the quality of a dialogue system is an understudied problem.
no code implementations • WS 2019 • Shuyao Xu, Jiehao Zhang, Jin Chen, Long Qin
It has been demonstrated that using a monolingual corpus in neural Grammatical Error Correction (GEC) systems can significantly improve system performance.
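One common way to exploit a monolingual corpus in GEC (not necessarily the method of this paper) is to inject synthetic errors into clean sentences, producing pseudo-parallel (noisy → clean) training pairs. A minimal sketch, with made-up corruption rules and probabilities:

```python
import random

def corrupt(sentence, rng, p=0.15):
    """Inject simple synthetic errors (drop, swap, or duplicate a word)
    into a clean sentence to build a pseudo-parallel GEC training pair."""
    words = sentence.split()
    out = []
    i = 0
    while i < len(words):
        r = rng.random()
        if r < p and len(words) > 1:            # drop the current word
            i += 1
            continue
        if r < 2 * p and i + 1 < len(words):    # swap adjacent words
            out.extend([words[i + 1], words[i]])
            i += 2
            continue
        if r < 3 * p:                           # duplicate the word
            out.extend([words[i], words[i]])
            i += 1
            continue
        out.append(words[i])
        i += 1
    return " ".join(out)

clean = "the cat sat on the mat"
noisy = corrupt(clean, random.Random(42))  # source side of a synthetic pair
```

Real systems typically use richer noise models (spelling, inflection, preposition confusion) learned from observed error distributions.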
1 code implementation • IEEE Transactions on Intelligent Transportation Systems 2019 • Tao Deng, Hongmei Yan, Long Qin, Thuyen Ngo, and B. S. Manjunath
Based on a dataset of multiple drivers’ attention allocation, we propose a convolutional-deconvolutional neural network (CDNN) to predict drivers’ eye fixations.
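The convolutional-deconvolutional idea — downsample a frame with convolutions, then upsample back to a dense fixation map — can be illustrated with a toy single-channel forward pass. The kernels, pooling, and nearest-neighbour upsampling here are illustrative stand-ins, not the CDNN architecture from the paper.

```python
import numpy as np

def conv2d(x, k):
    """Valid 2-D convolution, single channel."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def downsample(x):
    """Encoder side: 2x2 average pooling."""
    H, W = x.shape
    return x[:H // 2 * 2, :W // 2 * 2].reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3))

def upsample(x):
    """Decoder side: nearest-neighbour upsampling (a crude 'deconvolution')."""
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

def cdnn_forward(frame, k_enc, k_dec):
    """Toy encoder-decoder: conv + ReLU, pool, upsample, conv,
    then a sigmoid to squash into a fixation-probability map."""
    h = np.maximum(conv2d(frame, k_enc), 0)
    h = downsample(h)
    h = upsample(h)
    s = conv2d(h, k_dec)
    return 1.0 / (1.0 + np.exp(-s))

rng = np.random.default_rng(1)
sal = cdnn_forward(rng.normal(size=(16, 16)),     # 16x16 input frame
                   rng.normal(size=(3, 3)) * 0.1,
                   rng.normal(size=(3, 3)) * 0.1)
```

A 16×16 frame passes through a 3×3 conv (14×14), pooling (7×7), upsampling (14×14), and a final 3×3 conv, yielding a 12×12 saliency map with values in (0, 1).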
no code implementations • 5 Nov 2018 • Junjie Zeng, Long Qin, Yue Hu, Cong Hu, Quanjun Yin
The first advantage of the proposed method is that SSG overcomes the limitations of sparse rewards and local-minimum traps for RL agents; thus, LSPI can be used to generate paths in complex environments.
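At the core of LSPI is the LSTDQ step: fitting linear Q-function weights from a batch of transitions by solving a least-squares system. A minimal sketch, assuming a generic feature map `phi(s, a)` and greedy `policy(s)` (the subgoal-generation side, SSG, is not shown):

```python
import numpy as np

def lstdq(samples, phi, policy, gamma=0.95):
    """One LSTDQ solve: fit linear Q-weights w so that Q(s,a) = phi(s,a) @ w,
    from (s, a, r, s') transition samples. A small ridge term keeps the
    system solvable on tiny batches."""
    k = phi(samples[0][0], samples[0][1]).shape[0]
    A = np.zeros((k, k))
    b = np.zeros(k)
    for s, a, r, s_next in samples:
        f = phi(s, a)
        f_next = phi(s_next, policy(s_next))
        A += np.outer(f, f - gamma * f_next)
        b += f * r
    return np.linalg.solve(A + 1e-6 * np.eye(k), b)

# Toy 2-state, 2-action example with one-hot (s, a) features.
phi = lambda s, a: np.eye(4)[2 * s + a]
policy = lambda s: 0
samples = [(0, 0, 0.0, 1), (0, 1, 1.0, 0), (1, 0, 1.0, 1), (1, 1, 0.0, 0)]
w = lstdq(samples, phi, policy)
```

Full LSPI alternates this solve with policy improvement (making `policy` greedy with respect to the new `w`) until the weights converge.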
no code implementations • WS 2018 • Shuyao Xu, Jin Chen, Long Qin
Second Language Acquisition Modeling is the task of predicting whether a second language learner will respond correctly to future exercises based on their learning history.
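The prediction task itself can be framed as binary classification over history-derived features. A minimal sketch with a hand-set logistic model — the feature set and weights (`w_bias`, `w_acc`, `w_practice`) are hypothetical, chosen only to show the shape of the problem:

```python
import math

def predict_correct(history, w_bias=-0.2, w_acc=2.0, w_practice=0.1):
    """Hypothetical logistic model: probability that the learner answers
    the next exercise correctly, given a list of past outcomes (1 = correct,
    0 = incorrect). Features: past accuracy and amount of practice."""
    acc = sum(history) / len(history) if history else 0.5
    z = (w_bias
         + w_acc * (acc - 0.5)                     # above/below chance accuracy
         + w_practice * math.log1p(len(history)))  # diminishing practice effect
    return 1.0 / (1.0 + math.exp(-z))

p_strong = predict_correct([1, 1, 1, 0])  # mostly-correct history
p_weak = predict_correct([0, 0, 0, 1])    # mostly-incorrect history
```

In a real system the weights would be learned from labelled exercise logs, and the features would include exercise difficulty, time gaps, and item identity.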