Search Results for author: Jingting Ye

Found 3 papers, 2 papers with code

Are Structural Concepts Universal in Transformer Language Models? Towards Interpretable Cross-Lingual Generalization

1 code implementation • 19 Oct 2023 • Ningyu Xu, Qi Zhang, Jingting Ye, Menghan Zhang, Xuanjing Huang

We then propose a meta-learning-based method to learn to align conceptual spaces of different languages, which facilitates zero-shot and few-shot generalization in concept classification and also offers insights into the cross-lingual in-context learning phenomenon.

In-Context Learning • Meta-Learning • +1

RealBehavior: A Framework for Faithfully Characterizing Foundation Models' Human-like Behavior Mechanisms

no code implementations • 17 Oct 2023 • Enyu Zhou, Rui Zheng, Zhiheng Xi, Songyang Gao, Xiaoran Fan, Zichu Fei, Jingting Ye, Tao Gui, Qi Zhang, Xuanjing Huang

Reports of human-like behaviors in foundation models are growing, with psychological theories providing enduring tools to investigate these behaviors.

Cross-Linguistic Syntactic Difference in Multilingual BERT: How Good is It and How Does It Affect Transfer?

1 code implementation • 21 Dec 2022 • Ningyu Xu, Tao Gui, Ruotian Ma, Qi Zhang, Jingting Ye, Menghan Zhang, Xuanjing Huang

We demonstrate that the distance between the distributions of different languages is highly consistent with the degree of syntactic difference defined by linguistic formalisms.

Zero-Shot Cross-Lingual Transfer
