no code implementations • 6 Apr 2024 • Derui Zhu, Dingfan Chen, Qing Li, Zongxiong Chen, Lei Ma, Jens Grossklags, Mario Fritz
Despite tremendous advancements in large language models (LLMs) over recent years, a notably urgent challenge for their practical deployment is the phenomenon of hallucination, where the model fabricates facts and produces non-factual statements.
no code implementations • 22 Oct 2023 • Da Song, Xuan Xie, Jiayang Song, Derui Zhu, Yuheng Huang, Felix Juefei-Xu, Lei Ma
the trustworthiness perspective, is bound to the abstract model and enriches it with semantics, which enables more detailed analysis for diverse purposes.
no code implementations • 5 May 2023 • Zongxiong Chen, Jiahui Geng, Derui Zhu, Herbert Woisetschlaeger, Qing Li, Sonja Schimmler, Ruben Mayer, Chunming Rong
The aim of dataset distillation is to encode the rich features of an original dataset into a tiny dataset.
no code implementations • 15 Feb 2023 • Derui Zhu, Dingfan Chen, Jens Grossklags, Mario Fritz
In recent years, diffusion models have achieved tremendous success in the field of image generation, becoming the state-of-the-art technology for AI-based image processing applications.
no code implementations • 27 Jan 2023 • Zhuo Li, Derui Zhu, Yujing Hu, Xiaofei Xie, Lei Ma, Yan Zheng, Yan Song, Yingfeng Chen, Jianjun Zhao
Generally, episodic control-based approaches leverage highly rewarded past experiences to improve the sample efficiency of DRL algorithms.
1 code implementation • 5 Jan 2022 • Amin Eslami Abyane, Derui Zhu, Roberto Souza, Lei Ma, Hadi Hemmati
Therefore, to better understand the current quality status and challenges of these SOTA FL techniques in the presence of attacks and faults, we perform a large-scale empirical study that investigates FL quality from multiple angles: attacks, simulated faults (via mutation operators), and aggregation (defense) methods.
no code implementations • ACL 2018 • Weiyue Wang, Derui Zhu, Tamer Alkhouli, Zixuan Gan, Hermann Ney
Attention-based neural machine translation (NMT) models selectively focus on specific source positions to produce a translation, which brings significant improvements over pure encoder-decoder sequence-to-sequence models.
no code implementations • ACL 2017 • Weiyue Wang, Tamer Alkhouli, Derui Zhu, Hermann Ney
Recently, neural machine translation systems have shown promising performance and surpassed phrase-based systems on most translation tasks.