1 code implementation • ACL 2022 • Yikang Shen, Shawn Tan, Alessandro Sordoni, Peng Li, Jie Zhou, Aaron Courville
We introduce a new model, the Unsupervised Dependency Graph Network (UDGN), that can induce dependency structures from raw corpora and the masked language modeling task.
1 code implementation • COLING 2022 • Jiaxin Mi, Po Hu, Peng Li
To this end, we propose a simple yet effective model named DualGAT (Dual Relational Graph Attention Networks), which exploits the complementary nature of syntactic and semantic relations to alleviate the problem.
1 code implementation • EMNLP 2021 • Yuan Yao, Jiaju Du, Yankai Lin, Peng Li, Zhiyuan Liu, Jie Zhou, Maosong Sun
Existing relation extraction (RE) methods typically focus on extracting relational facts between entity pairs within single sentences or documents.
no code implementations • EMNLP 2020 • Xiuyi Chen, Fandong Meng, Peng Li, Feilong Chen, Shuang Xu, Bo Xu, Jie Zhou
Here, we deal with these issues on two aspects: (1) We enhance the prior selection module with the necessary posterior information obtained from the specially designed Posterior Information Prediction Module (PIPM); (2) We propose a Knowledge Distillation Based Training Strategy (KDBTS) to train the decoder with the knowledge selected from the prior distribution, removing the exposure bias of knowledge selection.
1 code implementation • Findings (ACL) 2022 • Xin Lv, Yankai Lin, Yixin Cao, Lei Hou, Juanzi Li, Zhiyuan Liu, Peng Li, Jie Zhou
In recent years, pre-trained language models (PLMs) have been shown to capture factual knowledge from massive texts, which encourages the proposal of PLM-based knowledge graph completion (KGC) models.
no code implementations • 16 Mar 2023 • Ning Qi, Peng Li, Lin Cheng, Ziyi Zhang, Wenrui Huang, Weiwei Yang
Energy storage (ES) and virtual energy storage (VES) are key components to realizing power system decarbonization.
no code implementations • 7 Mar 2023 • Zhiqiang Zhou, Chaoli Zhang, Lingna Ma, Jing Gu, Huajie Qian, Qingsong Wen, Liang Sun, Peng Li, Zhimin Tang
This paper discusses horizontal POD resources management in Alibaba Cloud Container Services with a newly deployed AI algorithm framework named AHPA -- the adaptive horizontal pod auto-scaling system.
no code implementations • 3 Feb 2023 • Zihu Wang, Yu Wang, Hanbin Hu, Peng Li
Contrastive learning demonstrates great promise for representation learning.
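As background for contrastive-learning entries like this one, the core objective can be illustrated with a minimal InfoNCE-style loss. This is a generic sketch for illustration, not code from the paper; the scalar similarity inputs are hypothetical.

```python
import math

def info_nce(sim_pos, sim_negs, tau=0.1):
    # InfoNCE loss for one anchor: pull the positive pair together and
    # push negatives apart; temperature tau controls the sharpness.
    num = math.exp(sim_pos / tau)
    den = num + sum(math.exp(s / tau) for s in sim_negs)
    return -math.log(num / den)
```

A higher positive similarity relative to the negatives yields a lower loss, which is the behavior contrastive representation learners optimize for.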
no code implementations • 28 Jan 2023 • Zeyuan Yang, Zonghan Yang, Peng Li, Yang Liu
The basic idea is to adopt a restricted orthogonal constraint allowing parameters optimized in the direction oblique to the whole frozen space to facilitate forward knowledge transfer while consolidating previous knowledge.
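The idea of restricting updates relative to a frozen parameter subspace can be sketched with a simple gradient projection. This is an illustration rather than the authors' implementation; the `keep` knob is a hypothetical stand-in for relaxing strict orthogonality into an oblique direction.

```python
def project_oblique(grad, frozen_dirs, keep=0.0):
    # Remove (or, when 0 < keep < 1, partially retain) the gradient
    # components along each frozen direction; frozen_dirs are assumed
    # to be orthonormal vectors spanning the consolidated subspace.
    g = list(grad)
    for d in frozen_dirs:
        dot = sum(gi * di for gi, di in zip(g, d))
        g = [gi - (1.0 - keep) * dot * di for gi, di in zip(g, d)]
    return g
```

With `keep=0` the update is strictly orthogonal to the frozen space (protecting previous knowledge); a small positive `keep` lets the update lean obliquely into it, the trade-off between consolidation and forward transfer that the abstract alludes to.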
no code implementations • 25 Jan 2023 • Wenkai Yang, Yankai Lin, Guangxiang Zhao, Peng Li, Jie Zhou, Xu Sun
Federated Learning has become a widely used framework that allows learning a global model on decentralized local datasets while protecting local data privacy.
no code implementations • 19 Dec 2022 • Xuancheng Huang, Zijun Liu, Peng Li, Maosong Sun, Yang Liu
Multi-source translation (MST), which typically receives multiple source sentences of the same meaning in different languages, has been shown superior to single-source translation.
no code implementations • 18 Dec 2022 • Yuanchi Zhang, Peng Li, Maosong Sun, Yang Liu
Although continually extending an existing NMT model to new domains or languages has attracted intensive interest in recent years, the equally valuable problem of continually improving a given NMT model within its domain by leveraging knowledge from an unlimited number of existing NMT models has not yet been explored.
no code implementations • 1 Dec 2022 • Yukun Yang, Peng Li
Gradient-based first-order adaptive optimization methods such as the Adam optimizer are prevalent in training artificial neural networks, achieving state-of-the-art results.
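For reference, the standard Adam update this entry refers to can be written in a few lines of plain Python. This is a textbook sketch, not the paper's proposed method.

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # One Adam update: exponential moving averages of the gradient and its
    # square, bias correction, then a per-coordinate scaled step.
    m = [b1 * mi + (1 - b1) * gi for mi, gi in zip(m, grad)]
    v = [b2 * vi + (1 - b2) * gi * gi for vi, gi in zip(v, grad)]
    m_hat = [mi / (1 - b1 ** t) for mi in m]
    v_hat = [vi / (1 - b2 ** t) for vi in v]
    theta = [p - lr * mh / (math.sqrt(vh) + eps)
             for p, mh, vh in zip(theta, m_hat, v_hat)]
    return theta, m, v
```

Each parameter moves against its (bias-corrected) average gradient, scaled by the inverse root of its average squared gradient, which is what makes the method "adaptive".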
1 code implementation • 14 Nov 2022 • Xiaozhi Wang, Yulin Chen, Ning Ding, Hao Peng, Zimu Wang, Yankai Lin, Xu Han, Lei Hou, Juanzi Li, Zhiyuan Liu, Peng Li, Jie Zhou
It contains 103,193 event coreference chains, 1,216,217 temporal relations, 57,992 causal relations, and 15,841 subevent relations, which is larger than existing datasets of all the ERE tasks by at least an order of magnitude.
no code implementations • 29 Oct 2022 • Zhiheng Hu, Yongzhen Wang, Peng Li, Jie Qin, Haoran Xie, Mingqiang Wei
First, to maintain small targets in deep layers, we develop a multi-scale nested interaction module to explore a wide range of context information.
no code implementations • 28 Oct 2022 • Zhaowei Chen, Peng Li, Zeyong Wei, Honghua Chen, Haoran Xie, Mingqiang Wei, Fu Lee Wang
We propose GeoGCN, a novel geometric dual-domain graph convolution network for point cloud denoising (PCD).
1 code implementation • 18 Oct 2022 • Lan Jiang, Hao Zhou, Yankai Lin, Peng Li, Jie Zhou, Rui Jiang
Even though large-scale language models have achieved excellent performance, they suffer from various adversarial attacks.
1 code implementation • 11 Oct 2022 • Lei Li, Yankai Lin, Xuancheng Ren, Guangxiang Zhao, Peng Li, Jie Zhou, Xu Sun
We then design a Model Uncertainty-aware Knowledge Integration (MUKI) framework to recover the golden supervision for the student.
no code implementations • 10 Oct 2022 • Zonghan Yang, Xiaoyuan Yi, Peng Li, Yang Liu, Xing Xie
Warning: this paper contains model outputs exhibiting offensiveness and biases.
2 code implementations • 12 Sep 2022 • Ze Wang, Kailun Yang, Hao Shi, Peng Li, Fei Gao, Jian Bai, Kaiwei Wang
We collect the PALVIO dataset using a Panoramic Annular Lens (PAL) system with an entire FoV of 360° × (40°-120°) and an IMU sensor to address the lack of panoramic SLAM datasets.
no code implementations • 16 Aug 2022 • Ryuichi Takanobu, Hao Zhou, Yankai Lin, Peng Li, Jie Zhou, Minlie Huang
Modeling these subtasks is consistent with the human agent's behavior patterns.
no code implementations • 3 Jun 2022 • Qiqi Ding, Peng Li, Xuefeng Yan, Ding Shi, Luming Liang, Weiming Wang, Haoran Xie, Jonathan Li, Mingqiang Wei
To our knowledge, RSOD is the first quantitatively evaluated and graded snowy OD dataset.
no code implementations • 2 Jun 2022 • Yinghao Zhang, Peng Li, Yue Hu
While low-rank matrix prior has been exploited in dynamic MR image reconstruction and has obtained satisfying performance, tensor low-rank models have recently emerged as powerful alternative representations for three-dimensional dynamic MR datasets.
1 code implementation • 23 May 2022 • Shuo Wang, Peng Li, Zhixing Tan, Zhaopeng Tu, Maosong Sun, Yang Liu
In this work, we propose a template-based method that can yield results with high translation quality and match accuracy, while its inference speed is comparable with that of unconstrained NMT models.
no code implementations • 15 May 2022 • Yukun Yang, Peng Li
We employ the Hebbian rule operating in local compartments to update synaptic weights and achieve supervised learning in a biologically plausible manner.
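A minimal version of the Hebbian update mentioned here uses only locally available pre- and post-synaptic activity. This is a generic sketch; the paper's compartmental details and supervision mechanism are not reproduced.

```python
def hebbian_update(w, pre, post, lr=0.01):
    # Local Hebbian rule: dw[i][j] = lr * post[i] * pre[j].
    # Only quantities available at the synapse are used; there is no
    # backpropagated error signal.
    return [[wij + lr * post_i * pre_j
             for wij, pre_j in zip(row, pre)]
            for row, post_i in zip(w, post)]
```

Because the update depends only on the activity of the two connected neurons, it is considered biologically plausible, in contrast to backpropagation.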
1 code implementation • ACL 2022 • Pei Ke, Hao Zhou, Yankai Lin, Peng Li, Jie Zhou, Xiaoyan Zhu, Minlie Huang
Existing reference-free metrics have obvious limitations for evaluating controlled text generation models.
no code implementations • 26 Mar 2022 • Sha Yuan, Hanyu Zhao, Shuai Zhao, Jiahong Leng, Yangxiao Liang, Xiaozhi Wang, Jifan Yu, Xin Lv, Zhou Shao, Jiaao He, Yankai Lin, Xu Han, Zhenghao Liu, Ning Ding, Yongming Rao, Yizhao Gao, Liang Zhang, Ming Ding, Cong Fang, Yisen Wang, Mingsheng Long, Jing Zhang, Yinpeng Dong, Tianyu Pang, Peng Cui, Lingxiao Huang, Zheng Liang, Huawei Shen, Hui Zhang, Quanshi Zhang, Qingxiu Dong, Zhixing Tan, Mingxuan Wang, Shuo Wang, Long Zhou, Haoran Li, Junwei Bao, Yingwei Pan, Weinan Zhang, Zhou Yu, Rui Yan, Chence Shi, Minghao Xu, Zuobai Zhang, Guoqiang Wang, Xiang Pan, Mengjie Li, Xiaoyu Chu, Zijun Yao, Fangwei Zhu, Shulin Cao, Weicheng Xue, Zixuan Ma, Zhengyan Zhang, Shengding Hu, Yujia Qin, Chaojun Xiao, Zheni Zeng, Ganqu Cui, Weize Chen, Weilin Zhao, Yuan Yao, Peng Li, Wenzhao Zheng, Wenliang Zhao, Ziyi Wang, Borui Zhang, Nanyi Fei, Anwen Hu, Zenan Ling, Haoyang Li, Boxi Cao, Xianpei Han, Weidong Zhan, Baobao Chang, Hao Sun, Jiawen Deng, Chujie Zheng, Juanzi Li, Lei Hou, Xigang Cao, Jidong Zhai, Zhiyuan Liu, Maosong Sun, Jiwen Lu, Zhiwu Lu, Qin Jin, Ruihua Song, Ji-Rong Wen, Zhouchen Lin, Liwei Wang, Hang Su, Jun Zhu, Zhifang Sui, Jiajun Zhang, Yang Liu, Xiaodong He, Minlie Huang, Jian Tang, Jie Tang
With the rapid development of deep learning, training Big Models (BMs) for multiple downstream tasks becomes a popular paradigm.
1 code implementation • Findings (ACL) 2022 • Yujia Qin, Jiajie Zhang, Yankai Lin, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
We evaluate ELLE with streaming data from 5 domains on BERT and GPT.
no code implementations • 4 Mar 2022 • Peng Li, Jiayin Zhao, Jingyao Wu, Chao Deng, Haoqian Wang, Tao Yu
Light field disparity estimation is an essential task in computer vision with various applications.
1 code implementation • ACL 2022 • Deming Ye, Yankai Lin, Peng Li, Maosong Sun, Zhiyuan Liu
Pre-trained language models (PLMs) struggle to recall the rich factual knowledge of entities exhibited in large-scale corpora, especially rare entities.
1 code implementation • 27 Feb 2022 • Hao Shi, Yifan Zhou, Kailun Yang, Xiaoting Yin, Ze Wang, Yaozu Ye, Zhe Yin, Shi Meng, Peng Li, Kaiwei Wang
PanoFlow achieves state-of-the-art performance on the public OmniFlowNet and the established FlowScape benchmarks.
1 code implementation • 25 Feb 2022 • Ze Wang, Kailun Yang, Hao Shi, Peng Li, Fei Gao, Kaiwei Wang
To tackle this issue, we propose LF-VIO, a real-time VIO framework for cameras with extremely large FoV.
no code implementations • 8 Feb 2022 • Guhong Nie, Lirui Xiao, Menglong Zhu, Dongliang Chu, Yue Shen, Peng Li, Kang Yang, Li Du, Bo Chen
For binary neural networks (BNNs) to become the mainstream on-device computer vision algorithm, they must achieve a better speed-vs-accuracy tradeoff than 8-bit quantization and establish a similar degree of general applicability in vision tasks.
no code implementations • 26 Jan 2022 • Peng Li, Arim Park, Soohyun Cho, Yao Zhao
In this paper, we study the effect of compensated reviews on non-compensated reviews by utilizing online reviews on 1,240 auto shipping companies over a ten-year period from a transportation website.
no code implementations • 14 Dec 2021 • Lei Li, Yankai Lin, Xuancheng Ren, Guangxiang Zhao, Peng Li, Jie Zhou, Xu Sun
As many fine-tuned pre-trained language models (PLMs) with promising performance are generously released, investigating better ways to reuse these models is vital, as it can greatly reduce the retraining computational cost and the potential environmental side effects.
no code implementations • 14 Nov 2021 • Yukun Yang, Peng Li
Our experiments show that the proposed framework demonstrates learning accuracy comparable to BP-based rules and may provide new insights on how learning is orchestrated in biological systems.
1 code implementation • NAACL 2022 • Yusheng Su, Xiaozhi Wang, Yujia Qin, Chi-Min Chan, Yankai Lin, Huadong Wang, Kaiyue Wen, Zhiyuan Liu, Peng Li, Juanzi Li, Lei Hou, Maosong Sun, Jie Zhou
To explore whether we can improve PT via prompt transfer, we empirically investigate the transferability of soft prompts across different downstream tasks and PLMs in this work.
no code implementations • 2 Nov 2021 • Fahao Chen, Peng Li, Toshiaki Miyazaki, Celimuge Wu
In this paper, we propose FedGraph for federated graph learning among multiple computing clients, each of which holds a subgraph.
no code implementations • 29 Oct 2021 • Guanglin Niu, Yang Li, Chengguang Tang, Zhongkai Hu, Shibin Yang, Peng Li, Chengyu Wang, Hao Wang, Jian Sun
The multi-relational Knowledge Base Question Answering (KBQA) system performs multi-hop reasoning over the knowledge graph (KG) to obtain the answer.
no code implementations • 23 Oct 2021 • Pudong Ge, Peng Li, Boli Chen, Fei Teng
The robust distributed state estimation for a class of continuous-time linear time-invariant systems is achieved by a novel kernel-based distributed observer, which, for the first time, ensures fixed-time convergence properties.
1 code implementation • 15 Oct 2021 • Yujia Qin, Xiaozhi Wang, Yusheng Su, Yankai Lin, Ning Ding, Jing Yi, Weize Chen, Zhiyuan Liu, Juanzi Li, Lei Hou, Peng Li, Maosong Sun, Jie Zhou
In the experiments, we study diverse few-shot NLP tasks and surprisingly find that in a 250-dimensional subspace found with 100 tasks, by only tuning 250 free parameters, we can recover 97% and 83% of the full prompt tuning performance for 100 seen tasks (using different training data) and 20 unseen tasks, respectively, showing great generalization ability of the found intrinsic task subspace.
1 code implementation • EMNLP 2021 • Wenkai Yang, Yankai Lin, Peng Li, Jie Zhou, Xu Sun
Motivated by this observation, we construct a word-based robustness-aware perturbation to distinguish poisoned samples from clean samples to defend against the backdoor attacks on natural language processing (NLP) models.
1 code implementation • NeurIPS 2021 • Deli Chen, Yankai Lin, Guangxiang Zhao, Xuancheng Ren, Peng Li, Jie Zhou, Xu Sun
The class imbalance problem, as an important issue in learning node representations, has drawn increasing attention from the community.
1 code implementation • Findings (ACL) 2022 • Zhengyan Zhang, Yankai Lin, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
In this work, we study the computational patterns of FFNs and observe that most inputs only activate a tiny ratio of neurons of FFNs.
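The sparsity observation can be made concrete: for a ReLU feed-forward layer, count the fraction of hidden units with nonzero activation for a given input. A toy measurement, for illustration only:

```python
def active_ratio(hidden_pre_activation):
    # Fraction of FFN hidden units that survive ReLU, i.e. are "activated"
    # for one input; the cited observation is that this ratio is typically tiny.
    acts = [max(0.0, x) for x in hidden_pre_activation]
    return sum(1 for a in acts if a > 0.0) / len(acts)
```

When most units are inactive, computing only the active ones (as mixture-of-experts-style methods do) can save most of the FFN's compute.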
no code implementations • 29 Sep 2021 • Yukun Yang, Peng Li
There exists a marked divide between biologically plausible approaches and practical backpropagation-based approaches to training a deep spiking neural network (DSNN) with better performance.
no code implementations • 29 Sep 2021 • Yu Wang, Jan Drgona, Jiaxin Zhang, Karthik Somayaji NS, Frank Y Liu, Malachi Schram, Peng Li
Although various flow models based on different transformations have been proposed, there is still no quantitative analysis of the performance-cost trade-offs between different flows, nor a systematic way of constructing the best flow architecture.
1 code implementation • EMNLP 2021 • Lei Li, Yankai Lin, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun
Knowledge distillation (KD) has proved effective for compressing large-scale pre-trained language models.
no code implementations • Findings (ACL) 2021 • Feilong Chen, Xiuyi Chen, Fandong Meng, Peng Li, Jie Zhou
Specifically, GoG consists of three sequential graphs: 1) H-Graph, which aims to capture coreference relations among dialog history; 2) History-aware Q-Graph, which aims to fully understand the question through capturing dependency relations between words based on coreference resolution on the dialog history; and 3) Question-aware I-Graph, which aims to capture the relations between objects in an image based on fully question representation.
1 code implementation • Findings (ACL) 2021 • Feilong Chen, Fandong Meng, Xiuyi Chen, Peng Li, Jie Zhou
Visual dialogue is a challenging task since it needs to answer a series of coherent questions on the basis of understanding the visual environment.
1 code implementation • ACL 2022 • Deming Ye, Yankai Lin, Peng Li, Maosong Sun
In particular, we propose a neighborhood-oriented packing strategy, which considers the neighbor spans integrally to better model the entity boundary information.
Ranked #1 on Named Entity Recognition (NER) on Few-NERD (SUP)
no code implementations • WMT (EMNLP) 2021 • Xianfeng Zeng, Yijin Liu, Ernan Li, Qiu Ran, Fandong Meng, Peng Li, Jinan Xu, Jie Zhou
This paper introduces WeChat AI's participation in WMT 2021 shared news translation task on English->Chinese, English->Japanese, Japanese->English and English->German.
no code implementations • 4 Aug 2021 • Wenrui Zhang, Peng Li
The small size of the motifs and sparse inter-motif connectivity lead to an RSNN architecture scalable to large network sizes.
1 code implementation • ACL 2021 • Wenkai Yang, Yankai Lin, Peng Li, Jie Zhou, Xu Sun
In this work, we point out a potential problem with current backdoor attacking research: its evaluation ignores the stealthiness of backdoor attacks, and most existing backdoor attack methods are not stealthy to either system deployers or system users.
no code implementations • 25 Jul 2021 • Ling Liang, Zheng Qu, Zhaodong Chen, Fengbin Tu, Yujie Wu, Lei Deng, Guoqi Li, Peng Li, Yuan Xie
Although spiking neural networks (SNNs) benefit from bio-plausible neural modeling, the low accuracy under common local synaptic plasticity learning rules limits their application in many practical tasks.
no code implementations • 22 Jun 2021 • Yukun Yang, Wenrui Zhang, Peng Li
While backpropagation (BP) has been applied to spiking neural networks (SNNs) achieving encouraging results, a key challenge involved is to backpropagate a continuous-valued loss over layers of spiking neurons exhibiting discontinuous all-or-none firing activities.
no code implementations • 21 Jun 2021 • Renzhi Wu, Prem Sakala, Peng Li, Xu Chu, Yeye He
Panda's IDE includes many novel features purpose-built for EM, such as smart data sampling, a built-in library of EM utility functions, automatically generated LFs, visual debugging of LFs, and finally, an EM-specific labeling model.
no code implementations • 10 Jun 2021 • Runhuan Feng, Peng Li
The nesting of such stochastic modeling can be computationally challenging.
no code implementations • NAACL 2021 • Yingxue Zhang, Fandong Meng, Peng Li, Ping Jian, Jie Zhou
Implicit discourse relation recognition (IDRR) aims to identify logical relations between two adjacent sentences in the discourse.
1 code implementation • ACL 2022 • Weize Chen, Xu Han, Yankai Lin, Hexu Zhao, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
Hyperbolic neural networks have shown great potential for modeling complex data.
1 code implementation • ACL 2021 • Ziqi Wang, Xiaozhi Wang, Xu Han, Yankai Lin, Lei Hou, Zhiyuan Liu, Peng Li, Juanzi Li, Jie Zhou
Event extraction (EE) has considerably benefited from pre-trained language models (PLMs) by fine-tuning.
2 code implementations • NAACL 2022 • Yujia Qin, Yankai Lin, Jing Yi, Jiajie Zhang, Xu Han, Zhengyan Zhang, Yusheng Su, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
Specifically, we introduce a pre-training framework named "knowledge inheritance" (KI) and explore how knowledge distillation could serve as auxiliary supervision during pre-training to efficiently learn larger PLMs.
1 code implementation • Findings (ACL) 2021 • Tianyu Gao, Xu Han, Keyue Qiu, Yuzhuo Bai, Zhiyu Xie, Yankai Lin, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
Distantly supervised (DS) relation extraction (RE) has attracted much attention in the past few years as it can utilize large-scale auto-labeled data.
1 code implementation • 7 Feb 2021 • Yusheng Su, Xu Han, Yankai Lin, Zhengyan Zhang, Zhiyuan Liu, Peng Li, Jie Zhou, Maosong Sun
We then perform contrastive semi-supervised learning on both the retrieved unlabeled and original labeled instances to help PLMs capture crucial task-related semantic features.
1 code implementation • 22 Jan 2021 • Xiaowei Hu, Peng Li
In the era of a growing population, systemic changes to the world, and the rising risk of crises, humanity has been facing an unprecedented challenge of resource scarcity.
no code implementations • 21 Jan 2021 • Yuan Fang, Ding Wang, Peng Li, Hang Su, Tian Le, Yi Wu, Guo-Wei Yang, Hua-Li Zhang, Zhi-Guang Xiao, Yan-Qiu Sun, Si-Yuan Hong, Yan-Wu Xie, Huan-Hua Wang, Chao Cao, Xin Lu, Hui-Qiu Yuan, Yang Liu
We report growth, electronic structure and superconductivity of ultrathin epitaxial CoSi2 films on Si(111).
Mesoscale and Nanoscale Physics
1 code implementation • ACL 2021 • Yujia Qin, Yankai Lin, Ryuichi Takanobu, Zhiyuan Liu, Peng Li, Heng Ji, Minlie Huang, Maosong Sun, Jie Zhou
Pre-trained Language Models (PLMs) have shown superior performance on various downstream Natural Language Processing (NLP) tasks.
1 code implementation • Findings (EMNLP) 2021 • Lei Li, Yankai Lin, Deli Chen, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun
On the other hand, the exiting decisions made by internal classifiers are unreliable, leading to wrongly emitted early predictions.
no code implementations • 16 Dec 2020 • Peng Li, Jinjun Ding, Steven S. -L. Zhang, James Kally, Timothy Pillsbury, Olle G. Heinonen, Gaurab Rimal, Chong Bi, August DeMann, Stuart B. Field, Weigang Wang, Jinke Tang, J. S. Jiang, Axel Hoffmann, Nitin Samarth, Mingzhong Wu
A topological insulator (TI) interfaced with a magnetic insulator (MI) may host an anomalous Hall effect (AHE), a quantum AHE, and a topological Hall effect (THE).
Materials Science Mesoscale and Nanoscale Physics Applied Physics
no code implementations • 14 Dec 2020 • Deli Chen, Yankai Lin, Lei Li, Xuancheng Ren, Peng Li, Jie Zhou, Xu Sun
Graph Contrastive Learning (GCL) has proven highly effective in promoting the performance of Semi-Supervised Node Classification (SSNC).
1 code implementation • Asian Chapter of the Association for Computational Linguistics 2020 • Xiaozhi Wang, Shengyu Jia, Xu Han, Zhiyuan Liu, Juanzi Li, Peng Li, Jie Zhou
Existing EAE methods either extract each event argument roles independently or sequentially, which cannot adequately model the joint probability distribution among event arguments and their roles.
no code implementations • 23 Nov 2020 • Peng Li, Baijiang Lv, Yuan Fang, Wei Guo, Zhongzheng Wu, Yi Wu, Cheng-Maw Cheng, Dawei Shen, Yuefeng Nie, Luca Petaccia, Chao Cao, Zhu-An Xu, Yang Liu
Using angle-resolved photoemission spectroscopy (ARPES) and low-energy electron diffraction (LEED), together with density-functional theory (DFT) calculation, we report the formation of charge density wave (CDW) and its interplay with the Kondo effect and topological states in CeSbTe.
Strongly Correlated Electrons Materials Science
2 code implementations • 18 Nov 2020 • Minghui Qiu, Peng Li, Chengyu Wang, Hanjie Pan, Ang Wang, Cen Chen, Xianyan Jia, Yaliang Li, Jun Huang, Deng Cai, Wei Lin
The literature has witnessed the success of applying Pre-trained Language Models (PLMs) and Transfer Learning (TL) algorithms to a wide range of Natural Language Processing (NLP) applications, yet it is not easy to build an easy-to-use and scalable TL toolkit for this purpose.
no code implementations • 16 Nov 2020 • Uwe Aickelin, Jenna Marie Reps, Peer-Olaf Siebers, Peng Li
In this paper, we present a case study demonstrating how dynamic and uncertain criteria can be incorporated into a multicriteria analysis with the help of discrete event simulation.
no code implementations • 28 Oct 2020 • Xiaoyu Kou, Yankai Lin, Yuntao Li, Jiahao Xu, Peng Li, Jie Zhou, Yan Zhang
Knowledge graph embedding (KGE), aiming to embed entities and relations into low-dimensional vectors, has attracted wide attention recently.
no code implementations • 23 Oct 2020 • Wenrui Zhang, Peng Li
Moreover, we propose a new backpropagation (BP) method called backpropagated intrinsic plasticity (BIP) to further boost the performance of ScSr-SNNs by training intrinsic model parameters.
no code implementations • 10 Oct 2020 • Yingxue Zhang, Fandong Meng, Peng Li, Ping Jian, Jie Zhou
As conventional answer selection (AS) methods generally match the question with each candidate answer independently, they suffer from the lack of matching information between the question and the candidate.
1 code implementation • EMNLP 2020 • Xiaoyu Kou, Yankai Lin, Shaobo Liu, Peng Li, Jie Zhou, Yan Zhang
Graph embedding (GE) methods embed nodes (and/or edges) of a graph into a low-dimensional semantic space, and have shown their effectiveness in modeling multi-relational data.
1 code implementation • EMNLP 2020 • Hao Peng, Tianyu Gao, Xu Han, Yankai Lin, Peng Li, Zhiyuan Liu, Maosong Sun, Jie Zhou
We find that (i) while context is the main source to support the predictions, RE models also heavily rely on the information from entity mentions, most of which is type information, and (ii) existing datasets may leak shallow heuristics via entity mentions and thus contribute to the high performance on RE benchmarks.
Ranked #24 on Relation Extraction on TACRED
no code implementations • WMT (EMNLP) 2020 • Fandong Meng, Jianhao Yan, Yijin Liu, Yuan Gao, Xianfeng Zeng, Qinsong Zeng, Peng Li, Ming Chen, Jie Zhou, Sifan Liu, Hao Zhou
We participate in the WMT 2020 shared news translation task on Chinese to English.
1 code implementation • 29 Sep 2020 • Yusheng Su, Xu Han, Zhengyan Zhang, Peng Li, Zhiyuan Liu, Yankai Lin, Jie Zhou, Maosong Sun
In this paper, we propose a novel framework named Coke to dynamically select contextual knowledge and embed knowledge context according to textual context for PLMs, which can avoid the effect of redundant and ambiguous knowledge in KGs that cannot match the input text.
1 code implementation • 4 Aug 2020 • Siddharth Maddali, Marc Allain, Peng Li, Virginie Chamard, Stephan O. Hruszkewycz
This paper addresses three-dimensional signal distortion and image reconstruction issues in x-ray Bragg coherent diffraction imaging (BCDI) in the event of a non-trivial, non-orthogonal orientation of the area detector with respect to the diffracted beam.
Instrumentation and Detectors Image and Video Processing
no code implementations • ACL 2020 • Xu Han, Yi Dai, Tianyu Gao, Yankai Lin, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
Continual relation learning aims to continually train a model on new data to learn incessantly emerging novel relations while avoiding catastrophically forgetting old relations.
1 code implementation • ACL 2020 • Qiu Ran, Yankai Lin, Peng Li, Jie Zhou
By dynamically determining segment length and deleting repetitive segments, RecoverSAT is capable of recovering from repetitive and missing token errors.
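The "deleting repetitive segments" behavior can be illustrated with a toy pass that drops immediately repeated segments. This is a deliberate simplification for illustration; RecoverSAT itself learns to do this during generation rather than as post-processing.

```python
def drop_repeated_segments(segments):
    # Keep each segment (a list of tokens) only if it does not repeat the
    # immediately preceding segment, a toy analogue of repetition recovery.
    out = []
    for seg in segments:
        if not out or seg != out[-1]:
            out.append(seg)
    return out
```

In non-autoregressive translation, segments generated in parallel may duplicate each other; detecting and discarding such repeats is one of the two recovery behaviors the abstract describes.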
3 code implementations • 20 May 2020 • Dehong Gao, Linbo Jin, Ben Chen, Minghui Qiu, Peng Li, Yi Wei, Yi Hu, Hao Wang
In this paper, we address the text and image matching in cross-modal retrieval of the fashion industry.
no code implementations • 12 May 2020 • Fei Gao, Jingjie Zhu, Zeyuan Yu, Peng Li, Tao Wang
The whole portrait drawing robotic system is named AiSketcher.
1 code implementation • 11 May 2020 • Bojan Karlaš, Peng Li, Renzhi Wu, Nezihe Merve Gürel, Xu Chu, Wentao Wu, Ce Zhang
Machine learning (ML) applications have been thriving recently, largely attributed to the increasing availability of data.
1 code implementation • EMNLP 2020 • Xiaozhi Wang, Ziqi Wang, Xu Han, Wangyi Jiang, Rong Han, Zhiyuan Liu, Juanzi Li, Peng Li, Yankai Lin, Jie Zhou
Most existing datasets exhibit the following issues that limit further development of ED: (1) Data scarcity.
2 code implementations • EMNLP 2020 • Deming Ye, Yankai Lin, Jiaju Du, Zheng-Hao Liu, Peng Li, Maosong Sun, Zhiyuan Liu
Language representation models such as BERT can effectively capture contextual semantic information from plain text, and have been shown to achieve promising results in many downstream NLP tasks with appropriate fine-tuning.
Ranked #30 on Relation Extraction on DocRED
no code implementations • Asian Chapter of the Association for Computational Linguistics 2020 • Xu Han, Tianyu Gao, Yankai Lin, Hao Peng, Yaoliang Yang, Chaojun Xiao, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
Relational facts are an important component of human knowledge, which are hidden in vast amounts of text.
1 code implementation • NeurIPS 2020 • Wenrui Zhang, Peng Li
Spiking neural networks (SNNs) are well suited for spatio-temporal learning and implementations on energy-efficient event-driven neuromorphic processors.
no code implementations • 1 Jan 2020 • Ling Liang, Xing Hu, Lei Deng, Yujie Wu, Guoqi Li, Yufei Ding, Peng Li, Yuan Xie
Recently, backpropagation through time inspired learning algorithms are widely introduced into SNNs to improve the performance, which brings the possibility to attack the models accurately given Spatio-temporal gradient maps.
1 code implementation • 18 Dec 2019 • Feilong Chen, Fandong Meng, Jiaming Xu, Peng Li, Bo Xu, Jie Zhou
Visual Dialog is a vision-language task that requires an AI agent to engage in a conversation with humans grounded in an image.
no code implementations • 27 Nov 2019 • Xiao-Yu Zhang, Changsheng Li, Haichao Shi, Xiaobin Zhu, Peng Li, Jing Dong
The point process is a solid framework to model sequential data, such as videos, by exploring the underlying relevance.
no code implementations • 20 Nov 2019 • Xuepeng Fan, Peng Li, Yulong Zeng, Xiaoping Zhou
We study the liquid democracy problem, where each voter can either vote directly for a candidate or delegate his voting power to a proxy.
Cryptography and Security
no code implementations • 10 Nov 2019 • Deli Chen, Xiaoqian Liu, Yankai Lin, Peng Li, Jie Zhou, Qi Su, Xu Sun
To address this issue, we propose to model long-distance node relations using only shallow GNN architectures, with two solutions: (1) implicit modelling, by learning to predict node-pair relations, and (2) explicit modelling, by adding edges between nodes that potentially have the same label.
no code implementations • 6 Nov 2019 • Qiu Ran, Yankai Lin, Peng Li, Jie Zhou
Non-autoregressive neural machine translation (NAT) generates each target word in parallel and has achieved promising inference acceleration.
no code implementations • 3 Nov 2019 • Lei Deng, Yujie Wu, Yifan Hu, Ling Liang, Guoqi Li, Xing Hu, Yufei Ding, Peng Li, Yuan Xie
As is well known, the huge memory and compute costs of both artificial neural networks (ANNs) and spiking neural networks (SNNs) greatly hinder their deployment on edge devices with high efficiency.
1 code implementation • IJCNLP 2019 • Xiaozhi Wang, Ziqi Wang, Xu Han, Zhiyuan Liu, Juanzi Li, Peng Li, Maosong Sun, Jie Zhou, Xiang Ren
Existing event extraction methods classify each argument role independently, ignoring the conceptual correlations between different argument roles.
1 code implementation • IJCNLP 2019 • Tianyu Gao, Xu Han, Hao Zhu, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
We present FewRel 2.0, a more challenging task to investigate two aspects of few-shot relation classification models: (1) Can they adapt to a new domain with only a handful of instances?
2 code implementations • IJCNLP 2019 • Qiu Ran, Yankai Lin, Peng Li, Jie Zhou, Zhiyuan Liu
Numerical reasoning, such as addition, subtraction, sorting, and counting, is a critical skill in human reading comprehension that has not been well considered in existing machine reading comprehension (MRC) systems.
Ranked #9 on Question Answering on DROP Test
no code implementations • 10 Sep 2019 • Changqing Xu, Wenrui Zhang, Yu Liu, Peng Li
Using spiking speech and image recognition datasets, we demonstrate the feasibility of supporting large time compression ratios of up to 16x, delivering up to 15.93x, 13.88x, and 86.21x improvements in throughput, energy dissipation, and the tradeoffs between hardware area, runtime, energy, and classification accuracy, respectively, based on different spike codes on a Xilinx Zynq-7000 FPGA.
no code implementations • 7 Sep 2019 • Deli Chen, Yankai Lin, Wei Li, Peng Li, Jie Zhou, Xu Sun
Graph Neural Networks (GNNs) have achieved promising performance on a wide range of graph-based tasks.
Ranked #49 on Node Classification on Cora
no code implementations • 3 Sep 2019 • Dafydd Gibbon, Peng Li
Consequently, only the LF LTS of the absolute speech signal is used in the empirical analysis.
1 code implementation • NeurIPS 2019 • Wenrui Zhang, Peng Li
However, the practical application of RSNNs is severely limited by challenges in training.
no code implementations • 15 Aug 2019 • Jiabin Zhang, Zheng Zhu, Wei Zou, Peng Li, Yanwei Li, Hu Su, Guan Huang
Given the results of MTN, we adopt an occlusion-aware Re-ID feature strategy in the pose tracking module, where pose information is utilized to infer the occlusion state to make better use of Re-ID feature.
1 code implementation • 15 Aug 2019 • Peng Li, Siddharth Maddali, Anastasios Pateras, Irene Calvo-Almazan, Stephan O. Hruszkewycz, Virginie Chamard, Marc Allain
To deal with this, the currently favored approach (detailed in Part I) is to perform the entire inversion in conjugate non-orthogonal real and Fourier space frames, and to transform the 3D sample image into an orthogonal frame as a post-processing step for result analysis.
Instrumentation and Detectors • Signal Processing
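The orthogonalization step described above is, at its core, a linear change of basis from the non-orthogonal sampling frame to lab-frame coordinates. A minimal sketch (the function name and example shear basis are hypothetical, not taken from the paper):

```python
def to_orthogonal_frame(idx, basis):
    """Map a voxel index expressed in a (possibly sheared) sampling
    basis into orthogonal lab-frame coordinates: x = B @ idx, where
    the columns of B are the sampling basis vectors."""
    return tuple(sum(basis[r][c] * idx[c] for c in range(3))
                 for r in range(3))

# Example: a shear along z that mixes into x (hypothetical geometry).
B = [[1.0, 0.0, 0.3],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
ortho = to_orthogonal_frame((2, 0, 10), B)  # -> (5.0, 0.0, 10.0)
```

In practice the full 3D image would be resampled with this map (e.g. an affine interpolation), either as a post-processing step or, as Part II proposes, inside the phase retrieval iterations themselves.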
1 code implementation • 15 Aug 2019 • Siddharth Maddali, Peng Li, Anastasios Pateras, Daniel Timbie, Nazar Delegan, Alex Crook, Hope Lee, Irene Calvo-Almazan, Dina Sheyfer, Wonsuk Cha, F. Joseph Heremans, David D. Awschalom, Virginie Chamard, Marc Allain, Stephan O. Hruszkewycz
Part II builds upon the geometric theory developed in Part I with the formalism to correct the shear distortions directly on an orthogonal grid within the phase retrieval algorithm itself, allowing more physically realistic constraints to be applied.
Instrumentation and Detectors
1 code implementation • ACL 2019 • Shuming Ma, Pengcheng Yang, Tianyu Liu, Peng Li, Jie Zhou, Xu Sun
We propose a novel model to separate the generation into two stages: key fact prediction and surface realization.
1 code implementation • ACL 2019 • Fuli Luo, Peng Li, Pengcheng Yang, Jie Zhou, Yutong Tan, Baobao Chang, Zhifang Sui, Xu Sun
In this paper, we focus on the task of fine-grained text sentiment transfer (FGST).
no code implementations • 19 Jun 2019 • Hanbin Hu, Mit Shah, Jianhua Z. Huang, Peng Li
It has been shown that deep neural networks (DNNs) may be vulnerable to adversarial attacks, raising the concern on their robustness particularly for safety-critical applications.
no code implementations • 18 Jun 2019 • Yao-Hui Chen, Peng Li, Jun Xu, Shengjian Guo, Rundong Zhou, Yulong Zhang, Tao Wei, Long Lu
Unlike existing hybrid testing tools, SAVIOR prioritizes the concolic execution of the seeds that are likely to uncover more vulnerabilities.
4 code implementations • ACL 2019 • Yuan Yao, Deming Ye, Peng Li, Xu Han, Yankai Lin, Zheng-Hao Liu, Zhiyuan Liu, Lixin Huang, Jie Zhou, Maosong Sun
Multiple entities in a document generally exhibit complex inter-sentence relations, and cannot be well handled by existing relation extraction (RE) methods that typically focus on extracting intra-sentence relations for single entity pairs.
Ranked #58 on Relation Extraction on DocRED
no code implementations • 4 Jun 2019 • Rui Zhang, Zheng Zhu, Peng Li, Rui Wu, Chaoxu Guo, Guan Huang, Hailun Xia
Human pose estimation has witnessed a significant advance thanks to the development of deep learning.
no code implementations • 4 Jun 2019 • Peng Li, Jiabin Zhang, Zheng Zhu, Yanwei Li, Lu Jiang, Guan Huang
Multi-target Multi-camera Tracking (MTMCT) aims to extract the trajectories from videos captured by a set of cameras.
1 code implementation • NAACL 2019 • Xiaozhi Wang, Xu Han, Zhiyuan Liu, Maosong Sun, Peng Li
Modern weakly supervised methods for event detection (ED) avoid time-consuming human annotation and achieve promising results by learning from auto-labeled data.
2 code implementations • 24 May 2019 • Fuli Luo, Peng Li, Jie Zhou, Pengcheng Yang, Baobao Chang, Zhifang Sui, Xu Sun
Therefore, in this paper, we propose a dual reinforcement learning framework to directly transfer the style of the text via a one-step mapping model, without any separation of content and style.
Ranked #1 on Unsupervised Text Style Transfer on GYAFC
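A dual reinforcement learning setup needs a scalar reward that balances style change against content preservation. One common combination (the exact reward design here is an illustrative assumption, not necessarily the paper's) is the harmonic mean of a style reward and a content reward, which forces the one-step mapping model to satisfy both at once:

```python
def style_transfer_reward(style_prob, content_overlap, eps=1e-8):
    """Harmonic mean of a style reward (target-style classifier
    confidence) and a content reward (e.g. lexical overlap with the
    source sentence). The harmonic mean is near zero if either
    component is near zero, so the policy cannot game just one."""
    return 2 * style_prob * content_overlap / (style_prob + content_overlap + eps)

r_balanced = style_transfer_reward(0.8, 0.8)  # both goals met
r_skewed = style_transfer_reward(1.0, 0.0)    # style flipped, content lost
```

Here `r_skewed` collapses to (almost exactly) zero, illustrating why a product-style combination discourages degenerate transfers that rewrite the sentence entirely.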
no code implementations • 20 Apr 2019 • Peng Li, Xi Rao, Jennifer Blase, Yue Zhang, Xu Chu, Ce Zhang
Data quality affects machine learning (ML) model performance, and data scientists spend a considerable amount of time on data cleaning before model training.
no code implementations • 7 Mar 2019 • Qiu Ran, Peng Li, Weiwei Hu, Jie Zhou
However, humans typically compare the options at multiple granularity levels before reading the article in detail to make reasoning more efficient.
Ranked #2 on Question Answering on RACE
no code implementations • 29 Jan 2019 • Myung Seok Shim, Chenye Zhao, Yang Li, Xuchong Zhang, Wenrui Zhang, Peng Li
Sensor fusion has wide applications in many domains including health care and autonomous systems.
no code implementations • 23 Jan 2019 • Bo Liu, Wenhao Chi, Xinran Li, Peng Li, Wenhua Liang, Haiping Liu, Wei Wang, Jianxing He
Lung cancer is the commonest cause of cancer deaths worldwide, and its mortality can be reduced significantly by performing early diagnosis and screening.
no code implementations • ICLR 2019 • Ting-Jui Chang, Yukun He, Peng Li
However, the computational cost of adversarial training with PGD and other multi-step adversarial examples is much higher than that of adversarial training with simpler attack techniques.
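The cost gap is easy to see from the attack itself: each PGD step requires a fresh gradient of the loss, so k-step PGD roughly multiplies per-example training cost by k compared with a single-step attack like FGSM. A minimal 1-D sketch using a finite-difference gradient (function names and the toy loss are assumptions for illustration):

```python
def pgd_attack(loss, x0, eps, alpha, steps, h=1e-5):
    """Projected gradient *ascent* on the loss within an L-infinity
    ball of radius eps around x0. Each iteration needs a gradient
    evaluation, which is exactly why multi-step PGD adversarial
    training costs far more than one-step attacks."""
    x = x0
    for _ in range(steps):
        g = (loss(x + h) - loss(x - h)) / (2 * h)  # finite-difference gradient
        x = x + alpha * (1 if g > 0 else -1)       # signed (FGSM-style) step
        x = min(max(x, x0 - eps), x0 + eps)        # project back into the ball
    return x

# Maximize the toy loss (x - 1)^2 near x0 = 0 within eps = 0.3:
x_adv = pgd_attack(lambda x: (x - 1) ** 2, 0.0, 0.3, 0.1, 5)  # -> -0.3
```

The attack walks to the ball boundary that increases the loss, then projection pins it there; in real adversarial training the finite difference is replaced by backpropagation over a full network, repeated `steps` times per minibatch.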
no code implementations • ICLR 2019 • Myung Seok Shim, Peng Li
Sensor fusion is a key technology that integrates various sensory inputs to allow for robust decision making in many applications such as autonomous driving and robot control.
1 code implementation • EMNLP 2018 • Xu Han, Pengfei Yu, Zhiyuan Liu, Maosong Sun, Peng Li
In this paper, we aim to incorporate the hierarchical information of relations for distantly supervised relation extraction and propose a novel hierarchical attention scheme.
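A hierarchical attention scheme scores the instances of an entity-pair bag once per level of the relation hierarchy (coarse to fine) and combines the results. The sketch below is an illustrative simplification (names and the averaging rule are assumptions, not the paper's exact formulation), operating on precomputed instance scores rather than learned query vectors:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def hierarchical_attention(scores_per_level):
    """Attend over a bag's instances once per hierarchy level
    (coarse -> fine) and average the attention weights, so an
    instance that matches *any* level of the relation is kept."""
    n = len(scores_per_level[0])
    per_level = [softmax(s) for s in scores_per_level]
    return [sum(w[i] for w in per_level) / len(per_level) for i in range(n)]

# Instance 0 matches the coarse relation, instance 1 the fine one:
weights = hierarchical_attention([[1.0, 0.0], [0.0, 1.0]])
```

Averaging the two levels gives both instances equal weight here, whereas single-level attention would have suppressed whichever instance its one query failed to match.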
no code implementations • 13 Sep 2018 • Haichao Shi, Peng Li, Bo Wang, Zhenyu Wang
However, in this paper, we propose a novel architecture for image captioning with deep reinforcement learning to optimize image captioning tasks.
1 code implementation • NeurIPS 2018 • Yingyezhe Jin, Wenrui Zhang, Peng Li
We evaluate the proposed HM2-BP algorithm by training deep fully connected and convolutional SNNs on the static MNIST and dynamic neuromorphic N-MNIST datasets.
no code implementations • COLING 2016 • Xiaotian Jiang, Quan Wang, Peng Li, Bin Wang
In this paper, we propose a multi-instance multi-label convolutional neural network for distantly supervised RE.
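In the multi-instance multi-label setting, a bag of sentences mentioning the same entity pair is scored jointly: a standard recipe (sketched below with hypothetical names; the real model does this over CNN features, not raw logits) is cross-instance max pooling per relation label followed by independent sigmoid decisions, so one bag can express several relations at once.

```python
import math

def predict_bag_labels(instance_logits, threshold=0.5):
    """Cross-instance max pooling: for each relation label take the
    highest logit over all sentences in the bag, then apply a sigmoid
    and threshold independently per label (multi-label decision)."""
    num_labels = len(instance_logits[0])
    pooled = [max(inst[j] for inst in instance_logits)
              for j in range(num_labels)]
    probs = [1.0 / (1.0 + math.exp(-z)) for z in pooled]
    return [j for j, p in enumerate(probs) if p >= threshold]

# Two sentences, three candidate relations; each sentence supports one:
bag = [[2.0, -1.0, -3.0],
       [-0.5, 1.5, -2.0]]
labels = predict_bag_labels(bag)  # -> [0, 1]
```

Max pooling lets the most confident sentence decide each label, which makes the bag prediction robust to noisy sentences that do not actually express the relation.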
3 code implementations • 21 Jul 2016 • Peng Li, Wei Li, Zhengyan He, Xuguang Wang, Ying Cao, Jie Zhou, Wei Xu
While question answering (QA) with neural networks, i.e. neural QA, has achieved promising results in recent years, the lack of a large-scale real-world QA dataset is still a challenge for developing and evaluating neural QA systems.
1 code implementation • TACL 2016 • Jie Zhou, Ying Cao, Xuguang Wang, Peng Li, Wei Xu
On the WMT'14 English-to-French task, we achieve BLEU = 37.7 with a single attention model, which outperforms the corresponding single shallow model by 6.2 BLEU points.
Ranked #37 on Machine Translation on WMT2014 English-French
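The attention mechanism at the heart of such a model can be sketched in a few lines (a minimal single-query dot-product version with toy vectors; the actual model stacks many LSTM layers around it):

```python
import math

def attention(query, keys, values):
    """Single-query dot-product attention: score the query against
    each key, softmax the scores, and return the weighted sum of the
    values -- how a decoder focuses on source positions."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(scores)
    es = [math.exp(s - m) for s in scores]
    z = sum(es)
    weights = [e / z for e in es]
    dim = len(values[0])
    return [sum(w * v[d] for w, v in zip(weights, values)) for d in range(dim)]

# The query aligns strongly with the first key, so the context vector
# is dominated by the first value:
ctx = attention([10.0, 0.0], [[1.0, 0.0], [0.0, 1.0]],
                [[1.0, 0.0], [0.0, 1.0]])
```

With a sharply aligned query the softmax weights concentrate on one source position, which is exactly the soft alignment behavior that lifted attention models over shallow non-attentional baselines.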
no code implementations • 30 Mar 2016 • Peng Li, Heng Huang
We report an implementation of a clinical information extraction tool that leverages deep neural network to annotate event spans and their attributes from raw clinical notes and pathology reports.
no code implementations • 30 Mar 2016 • Peng Li, Heng Huang
Neural network based approaches for sentence relation modeling automatically generate hidden matching features from raw sentence pairs.