Search Results for author: Peng Qian

Found 18 papers, 8 papers with code

Structural Guidance for Transformer Language Models

1 code implementation ACL 2021 Peng Qian, Tahira Naseem, Roger Levy, Ramón Fernandez Astudillo

Here we study whether structural guidance leads to more human-like systematic linguistic generalization in Transformer language models without resorting to pre-training on very large amounts of data.

Language Modelling

Combining Graph Neural Networks with Expert Knowledge for Smart Contract Vulnerability Detection

1 code implementation 24 Jul 2021 Zhenguang Liu, Peng Qian, Xiaoyang Wang, Yuan Zhuang, Lin Qiu, Xun Wang

Then, we propose a novel temporal message propagation network to extract the graph feature from the normalized graph, and combine the graph feature with designed expert patterns to yield a final detection system.

Vulnerability Detection
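The fusion step described in the abstract above (a learned graph feature combined with expert-pattern scores to yield a final detection) can be sketched in a few lines. Everything below is illustrative only: the weights, the scalar feature, and the helper names are stand-ins, not the paper's actual network or rules.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def detect(graph_feature: float, expert_scores: list[float],
           w_graph: float = 1.5, w_expert: float = 1.0,
           threshold: float = 0.5) -> bool:
    """Fuse the (scalar, for illustration) graph feature with expert
    pattern scores; flag the contract if the fused score crosses the
    threshold. Weights and threshold are made-up illustrative values."""
    fused = w_graph * graph_feature + w_expert * sum(expert_scores)
    return sigmoid(fused) > threshold

# A contract whose graph feature and expert patterns both indicate risk:
print(detect(graph_feature=0.8, expert_scores=[0.5, 0.3]))  # → True
```

The point of the design is that hand-written expert patterns and a learned representation can disagree; fusing them lets each compensate for the other's blind spots.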

What if This Modified That? Syntactic Interventions via Counterfactual Embeddings

1 code implementation28 May 2021 Mycal Tucker, Peng Qian, Roger Levy

Neural language models exhibit impressive performance on a variety of tasks, but their internal reasoning may be difficult to understand.

Digital Quantum Simulation of Floquet Topological Phases with a Solid-State Quantum Simulator

no code implementations 10 Dec 2020 Bing Chen, Shuo Li, Xianfei Hou, Feifei Zhou, Peng Qian, Feng Mei, Suotang Jia, Nanyang Xu, Heng Shen

Quantum simulators with the ability to harness the dynamics of complex quantum systems have emerged as a promising platform for probing exotic topological phases.

Quantum Physics

Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models

no code implementations EMNLP 2020 Ethan Wilcox, Peng Qian, Richard Futrell, Ryosuke Kohita, Roger Levy, Miguel Ballesteros

Humans can learn structural properties about a word from minimal experience, and deploy their learned syntactic representations uniformly in different grammatical contexts.

Few-Shot Learning

On the Predictive Power of Neural Language Models for Human Real-Time Comprehension Behavior

1 code implementation 2 Jun 2020 Ethan Gotlieb Wilcox, Jon Gauthier, Jennifer Hu, Peng Qian, Roger Levy

Human reading behavior is tuned to the statistics of natural language: the time it takes human subjects to read a word can be predicted from estimates of the word's probability in context.
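The probability-to-reading-time link described above is usually quantified as surprisal, the negative log probability of a word given its context. As a minimal sketch of the measure (the paper evaluates trained neural language models; the toy bigram estimator and corpus below are made up purely for demonstration):

```python
import math
from collections import Counter

# Toy corpus standing in for real training data.
corpus = "the dog chased the cat and the cat chased the mouse".split()

bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
vocab = len(unigrams)

def surprisal(prev: str, word: str) -> float:
    """Surprisal -log2 P(word | prev), estimated with a bigram model
    and add-one smoothing. Higher surprisal predicts slower reading."""
    p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)
    return -math.log2(p)

# A frequent continuation is less surprising (predicts faster reading)
# than a rarer one in the same context.
print(surprisal("the", "cat"))    # lower: "the cat" occurs twice
print(surprisal("the", "mouse"))  # higher: "the mouse" occurs once
```

The same quantity can be read off any language model that assigns conditional word probabilities; only the estimator changes.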

A Systematic Assessment of Syntactic Generalization in Neural Language Models

1 code implementation ACL 2020 Jennifer Hu, Jon Gauthier, Peng Qian, Ethan Wilcox, Roger P. Levy

While state-of-the-art neural network models continue to achieve lower perplexity scores on language modeling benchmarks, it remains unknown whether optimizing for broad-coverage predictive performance leads to human-like syntactic knowledge.

Language Modelling
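Syntactic generalization of the kind assessed above is commonly probed with minimal-pair tests: a model passes if it assigns higher probability to a grammatical sentence than to a minimally different ungrammatical one. A hedged sketch of the test harness, with a stand-in scorer (a real evaluation would query a trained language model; the stub's agreement heuristic exists only to make the example self-contained):

```python
import math

def log_prob(sentence: str) -> float:
    """Stand-in scorer: a real test would return a trained LM's log
    probability. This toy version simply prefers sentences whose
    subject and verb agree in number, for illustration."""
    words = sentence.lower().split()
    agree = ("keys" in words and "are" in words) or \
            ("key" in words and "is" in words)
    return math.log(0.9 if agree else 0.1)

def passes_minimal_pair(grammatical: str, ungrammatical: str) -> bool:
    """A model 'passes' if it prefers the grammatical variant."""
    return log_prob(grammatical) > log_prob(ungrammatical)

print(passes_minimal_pair(
    "The keys to the cabinet are on the table",
    "The keys to the cabinet is on the table"))  # → True
```

Because the comparison is relative, the test is insensitive to overall perplexity, which is exactly why a model can score well on perplexity yet fail such pairs.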

Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study

1 code implementation IJCNLP 2019 Aixiu An, Peng Qian, Ethan Wilcox, Roger Levy

We assess whether different neural language models trained on English and French represent phrase-level number and gender features, and use those features to drive downstream expectations.

Neural Language Models as Psycholinguistic Subjects: Representations of Syntactic State

1 code implementation NAACL 2019 Richard Futrell, Ethan Wilcox, Takashi Morita, Peng Qian, Miguel Ballesteros, Roger Levy

We deploy the methods of controlled psycholinguistic experimentation to shed light on the extent to which the behavior of neural network language models reflects incremental representations of syntactic state.

Structural Supervision Improves Learning of Non-Local Grammatical Dependencies

no code implementations NAACL 2019 Ethan Wilcox, Peng Qian, Richard Futrell, Miguel Ballesteros, Roger Levy

State-of-the-art LSTM language models trained on large corpora learn sequential contingencies in impressive detail and have been shown to acquire a number of non-local grammatical dependencies with some success.

Hierarchical structure, Language Modelling

Bridging LSTM Architecture and the Neural Dynamics during Reading

no code implementations 22 Apr 2016 Peng Qian, Xipeng Qiu, Xuanjing Huang

Recently, the long short-term memory neural network (LSTM) has attracted wide interest due to its success in many tasks.

Overview of the NLPCC 2015 Shared Task: Chinese Word Segmentation and POS Tagging for Micro-blog Texts

no code implementations 28 May 2015 Xipeng Qiu, Peng Qian, Liusong Yin, Shiyu Wu, Xuanjing Huang

In this paper, we give an overview for the shared task at the 4th CCF Conference on Natural Language Processing & Chinese Computing (NLPCC 2015): Chinese word segmentation and part-of-speech (POS) tagging for micro-blog texts.

Chinese Word Segmentation, Part-Of-Speech Tagging +1
