Search Results for author: Jiahui Geng

Found 9 papers, 2 papers with code

Multimodal Large Language Models to Support Real-World Fact-Checking

no code implementations • 6 Mar 2024 Jiahui Geng, Yova Kementchedjhieva, Preslav Nakov, Iryna Gurevych

To the best of our knowledge, we are the first to evaluate MLLMs for real-world fact-checking.

Fact Checking

A Survey of Confidence Estimation and Calibration in Large Language Models

no code implementations • 14 Nov 2023 Jiahui Geng, Fengyu Cai, Yuxia Wang, Heinz Koeppl, Preslav Nakov, Iryna Gurevych

Assessing their confidence and calibrating them across different tasks can help mitigate risks and enable LLMs to produce better generations.

Language Modelling
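A common starting point for the confidence estimation this survey covers is the model's own softmax probability for its top prediction, which is often overconfident; temperature scaling is a standard calibration remedy. A minimal sketch, with made-up logits for illustration (not code from the paper):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical logits from an overconfident model.
logits = np.array([4.0, 1.0, 0.5])

p_raw = softmax(logits)[0]        # ≈ 0.93: raw top-class confidence
p_cal = softmax(logits / 2.0)[0]  # ≈ 0.72: softened with temperature T = 2
print(p_raw, p_cal)
```

The temperature T is a single scalar normally fit on a held-out validation set; dividing the logits by T > 1 flattens the distribution without changing the predicted class.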

A Survey on Dataset Distillation: Approaches, Applications and Future Directions

1 code implementation • 3 May 2023 Jiahui Geng, Zongxiong Chen, Yuandou Wang, Herbert Woisetschlaeger, Sonja Schimmler, Ruben Mayer, Zhiming Zhao, Chunming Rong

Dataset distillation is attracting more attention in machine learning as training sets continue to grow and the cost of training state-of-the-art models becomes increasingly high.

Continual Learning · Neural Architecture Search

Towards General Deep Leakage in Federated Learning

no code implementations • 18 Oct 2021 Jiahui Geng, Yongli Mou, Feifei Li, Qing Li, Oya Beyan, Stefan Decker, Chunming Rong

We find that image restoration fails if even a single label in the batch is inferred incorrectly; we also find that when all images in a batch share a label, the restored image is a fusion of that class of images.

Federated Learning · Image Restoration +1
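The role of label inference in this finding rests on a well-known property of cross-entropy gradients: in the final linear layer, the weight-gradient row of the true class is the only one with negative entries (when the preceding activations are non-negative), so a server observing shared gradients can read the label off directly. A toy NumPy sketch of that inference step, with made-up dimensions (not the paper's actual attack code):

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes, hidden = 5, 8

# Hypothetical single-sample forward pass through a final linear layer;
# activations are non-negative, as after a ReLU.
h = np.abs(rng.normal(size=hidden))
W = rng.normal(size=(num_classes, hidden))
logits = W @ h
probs = np.exp(logits - logits.max())
probs /= probs.sum()

true_label = 3
onehot = np.eye(num_classes)[true_label]

# Cross-entropy gradient w.r.t. W: (probs - onehot) outer h.
grad_W = np.outer(probs - onehot, h)

# Only the true class's row is negative, so the label leaks
# from the gradient alone.
inferred = int(np.argmin(grad_W.sum(axis=1)))
print(inferred)  # 3
```

With batches larger than one, the per-sample rows mix, which is consistent with the paper's observation that a single wrongly inferred label already breaks the restoration.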

DID-eFed: Facilitating Federated Learning as a Service with Decentralized Identities

no code implementations • 18 May 2021 Jiahui Geng, Neel Kanwal, Martin Gilje Jaatun, Chunming Rong

DID enables more flexible and credible decentralized access management in our system, while the smart contract offers a frictionless and less error-prone process.

Federated Learning · Management

Improving Unsupervised Word-by-Word Translation with Language Model and Denoising Autoencoder

no code implementations • EMNLP 2018 Yunsu Kim, Jiahui Geng, Hermann Ney

Unsupervised learning of cross-lingual word embedding offers elegant matching of words across languages, but has fundamental limitations in translating sentences.

Denoising · Language Modelling +2

The RWTH Aachen University English-German and German-English Unsupervised Neural Machine Translation Systems for WMT 2018

no code implementations • WS 2018 Miguel Graça, Yunsu Kim, Julian Schamper, Jiahui Geng, Hermann Ney

This paper describes the unsupervised neural machine translation (NMT) systems of the RWTH Aachen University developed for the English ↔ German news translation task of the EMNLP 2018 Third Conference on Machine Translation (WMT 2018).

Machine Translation · NMT +2
