Search Results for author: Mengjie Zhao

Found 13 papers, 1 paper with code

Are Code Pre-trained Models Powerful to Learn Code Syntax and Semantics?

no code implementations20 Dec 2022 Wei Ma, Mengjie Zhao, Xiaofei Xie, Qiang Hu, Shangqing Liu, Jie Zhang, Wenhan Wang, Yang Liu

To further understand the code features learnt by these models, in this paper we target two well-known representative code pre-trained models (i.e., CodeBERT and GraphCodeBERT) and devise a set of probing tasks for syntax and semantics analysis.

Code Completion Code Search +2

Discrete and Soft Prompting for Multilingual Models

1 code implementation EMNLP 2021 Mengjie Zhao, Hinrich Schütze

It has been shown for English that discrete and soft prompting perform strongly in few-shot learning with pretrained language models (PLMs).

Few-Shot Learning Natural Language Inference
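The distinction between the two prompting styles compared in the paper above can be sketched in a few lines. This is a hypothetical illustration only: the template text, verbalizer, and function names are invented for clarity and are not taken from the paper.

```python
# Discrete prompting: a hand-written cloze template; the pretrained LM
# fills the [MASK] slot with a verbalizer token such as "Yes" or "No".
def discrete_prompt(premise, hypothesis):
    return f"{premise}? [MASK], {hypothesis}"

# Soft prompting: the template is replaced by k trainable vectors
# prepended to the input embeddings. Here we only mark their positions;
# in practice each <soft> slot is a continuous, learned embedding.
def soft_prompt_layout(k, text_tokens):
    return ["<soft>"] * k + text_tokens

print(discrete_prompt("It is raining", "the streets are wet"))
print(soft_prompt_layout(3, ["It", "is", "raining"]))
```

The key difference: a discrete prompt stays in the model's vocabulary and is human-readable, while a soft prompt lives in embedding space and is optimized by gradient descent.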

A Closer Look at Few-Shot Crosslingual Transfer: The Choice of Shots Matters

no code implementations ACL 2021 Mengjie Zhao, Yi Zhu, Ehsan Shareghi, Ivan Vulić, Roi Reichart, Anna Korhonen, Hinrich Schütze

Few-shot crosslingual transfer has been shown to outperform its zero-shot counterpart with pretrained encoders like multilingual BERT.

Few-Shot Learning

Masking as an Efficient Alternative to Finetuning for Pretrained Language Models

no code implementations EMNLP 2020 Mengjie Zhao, Tao Lin, Fei Mi, Martin Jaggi, Hinrich Schütze

We present an efficient method of utilizing pretrained language models, where we learn selective binary masks for pretrained weights in lieu of modifying them through finetuning.
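The masking idea described above can be sketched in pure Python. This is a minimal toy illustration, not the paper's implementation: the actual method trains the mask with gradient-based binarization inside a Transformer, and all names and numbers here are invented.

```python
def binarize(scores, threshold=0.0):
    """Derive a binary mask from real-valued scores:
    1.0 keeps a pretrained weight, 0.0 switches it off."""
    return [1.0 if s >= threshold else 0.0 for s in scores]

def masked_forward(weights, mask, inputs):
    """Forward pass: the pretrained weights stay frozen;
    only the elementwise mask selects which ones are used."""
    return sum(w * m * x for w, m, x in zip(weights, mask, inputs))

weights = [0.5, -1.2, 0.8]   # frozen pretrained weights (never updated)
scores  = [0.3, -0.7, 1.1]   # learnable per-weight scores (the only trained parameters)

mask = binarize(scores)                          # -> [1.0, 0.0, 1.0]
y = masked_forward(weights, mask, [1.0, 1.0, 1.0])  # 0.5 + 0.0 + 0.8 = 1.3
```

Because only one bit per weight is learned and stored per task, this is far cheaper than keeping a fully finetuned copy of the model for every downstream task.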

Quantifying the Contextualization of Word Representations with Semantic Class Probing

no code implementations Findings of the Association for Computational Linguistics 2020 Mengjie Zhao, Philipp Dufter, Yadollah Yaghoobzadeh, Hinrich Schütze

Pretrained language models have achieved a new state of the art on many NLP tasks, but there are still many open questions about how and why they work so well.

A Multilingual BPE Embedding Space for Universal Sentiment Lexicon Induction

no code implementations ACL 2019 Mengjie Zhao, Hinrich Schütze

We present a new method for sentiment lexicon induction that is designed to be applicable to the entire range of typological diversity of the world's languages.

Domain Adaptation

Multilingual Embeddings Jointly Induced from Contexts and Concepts: Simple, Strong and Scalable

no code implementations1 Nov 2018 Philipp Dufter, Mengjie Zhao, Hinrich Schütze

A simple and effective context-based multilingual embedding learner is Levy et al. (2017)'s S-ID (sentence ID) method.

Multilingual Word Embeddings
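The S-ID method referenced in the entry above can be sketched with a toy parallel corpus. This is a hypothetical illustration of the core idea only (the sentences and variable names are invented): words from aligned sentences in different languages share the same sentence ID, so their ID-based context features overlap and their embeddings become cross-lingually aligned.

```python
from collections import defaultdict

# Toy parallel corpus: each pair shares one sentence ID across languages.
parallel = [
    ("the cat sleeps", "die Katze schläft"),
    ("the dog barks", "der Hund bellt"),
]

# Map each word to the set of sentence IDs it occurs in.
word_to_sids = defaultdict(set)
for sid, (en, de) in enumerate(parallel):
    for tok in (en + " " + de).split():
        word_to_sids[tok].add(sid)

# "cat" and "Katze" co-occur in sentence 0, so they share an S-ID feature;
# an embedding learner trained on these (word, sentence-ID) contexts will
# place translation pairs near each other.
print(word_to_sids["cat"], word_to_sids["Katze"])
```

In the real method, these (word, sentence-ID) pairs are fed to a standard embedding learner; the shared IDs are what make the resulting space multilingual without any word-level alignment.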
