Search Results for author: Zhengdong Lu

Found 43 papers, 12 papers with code

Eraser: Jailbreaking Defense in Large Language Models via Unlearning Harmful Knowledge

no code implementations • 8 Apr 2024 • Weikai Lu, Ziqian Zeng, Jianwei Wang, Zhengdong Lu, Zelin Chen, Huiping Zhuang, Cen Chen

Jailbreaking attacks can enable Large Language Models (LLMs) to bypass the safeguard and generate harmful content.

Tasks: General Knowledge

Weakly Supervised Reasoning by Neuro-Symbolic Approaches

no code implementations • 19 Sep 2023 • Xianggen Liu, Zhengdong Lu, Lili Mou

Deep learning has largely improved the performance of various natural language processing (NLP) tasks.

Self-Balanced Dropout

1 code implementation • 6 Aug 2019 • Shen Li, Chenhao Su, Renfen Hu, Zhengdong Lu

Dropout is known as an effective way to reduce overfitting via preventing co-adaptations of units.
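The standard mechanism can be sketched in a few lines of NumPy (a generic inverted-dropout illustration, not the paper's self-balanced variant; the `rate` value and array shapes are arbitrary):

```python
import numpy as np

def dropout(x, rate=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability `rate` and scale
    the survivors by 1/(1-rate) so the expected activation is unchanged."""
    if not training or rate == 0.0:
        return x
    if rng is None:
        rng = np.random.default_rng(0)
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = np.ones((4, 8))
y = dropout(x, rate=0.5)
```

At test time (`training=False`) the input passes through unchanged, so no rescaling is needed at inference.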

Zooming Network

no code implementations • 4 Oct 2018 • Yukun Yan, Daqi Zheng, Zhengdong Lu, Sen Song

Structural information is important in natural language understanding.

Tasks: Natural Language Understanding

Neural Entity Reasoner for Global Consistency in NER

no code implementations • 30 Sep 2018 • Xiaoxiao Yin, Daqi Zheng, Zhengdong Lu, Ruifang Liu

Given an input sentence, the NE-Reasoner layer can infer over multiple entities to increase the global consistency of the output labels, which are then transferred into entities as input to the next layer.

Tasks: Named Entity Recognition

Generalize Symbolic Knowledge With Neural Rule Engine

no code implementations • 30 Aug 2018 • Shen Li, Hengru Xu, Zhengdong Lu

As neural networks have come to dominate state-of-the-art results across a wide range of NLP tasks, considerable attention has turned to improving the performance of neural models by integrating symbolic knowledge.

JUMPER: Learning When to Make Classification Decisions in Reading

no code implementations • 6 Jul 2018 • Xianggen Liu, Lili Mou, Haotian Cui, Zhengdong Lu, Sen Song

Both the classification result and when to make the classification are part of the decision process, which is controlled by a policy network and trained with reinforcement learning.

Tasks: General Classification, Text Classification

Deep Neural Machine Translation with Linear Associative Unit

no code implementations • ACL 2017 • Mingxuan Wang, Zhengdong Lu, Jie Zhou, Qun Liu

Deep Neural Networks (DNNs) have provably enhanced the state-of-the-art Neural Machine Translation (NMT) with their capability in modeling complex functions and capturing complex linguistic structures.

Tasks: Decoder, Machine Translation

Coupling Distributed and Symbolic Execution for Natural Language Queries

no code implementations • ICML 2017 • Lili Mou, Zhengdong Lu, Hang Li, Zhi Jin

Building neural networks to query a knowledge base (a table) with natural language is an emerging research topic in deep learning.

Tasks: Natural Language Queries

Neural Machine Translation Advised by Statistical Machine Translation

no code implementations • 17 Oct 2016 • Xing Wang, Zhengdong Lu, Zhaopeng Tu, Hang Li, Deyi Xiong, Min Zhang

Neural Machine Translation (NMT) is a new approach to machine translation that has made great progress in recent years.

Tasks: Machine Translation, NMT

Interactive Attention for Neural Machine Translation

no code implementations • COLING 2016 • Fandong Meng, Zhengdong Lu, Hang Li, Qun Liu

Conventional attention-based Neural Machine Translation (NMT) conducts dynamic alignment in generating the target sentence.

Tasks: Decoder, Machine Translation
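The conventional attention described above can be sketched as generic dot-product attention (a minimal NumPy illustration; the paper's interactive variant additionally updates the source representations during decoding, which this sketch omits):

```python
import numpy as np

def attention(query, source_states):
    """Soft alignment: score each source state against the decoder query,
    normalize with softmax, and return the attention-weighted context."""
    scores = source_states @ query                 # (src_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over source positions
    context = weights @ source_states              # (hidden,)
    return context, weights

src = np.random.default_rng(0).standard_normal((5, 16))   # 5 source states
q = np.random.default_rng(1).standard_normal(16)          # decoder query
ctx, w = attention(q, src)
```

At each decoding step the weights form a fresh soft alignment over the source sentence.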

Context Gates for Neural Machine Translation

2 code implementations • TACL 2017 • Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, Hang Li

In neural machine translation (NMT), generation of a target word depends on both source and target contexts.

Tasks: Machine Translation, NMT
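The gating idea can be sketched as a sigmoid gate that interpolates the two contexts element-wise (a minimal NumPy sketch; the weight shapes and initialization are illustrative, not the paper's exact parameterization):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def context_gate(source_ctx, target_ctx, W_s, W_t, b):
    """Gate z in (0,1) decides, per dimension, how much the next-word
    prediction relies on the source context vs. the target context."""
    z = sigmoid(W_s @ source_ctx + W_t @ target_ctx + b)
    return z * source_ctx + (1.0 - z) * target_ctx

rng = np.random.default_rng(0)
d = 8
s, t = rng.standard_normal(d), rng.standard_normal(d)
W_s = 0.1 * rng.standard_normal((d, d))
W_t = 0.1 * rng.standard_normal((d, d))
gated = context_gate(s, t, W_s, W_t, np.zeros(d))
```

Because z lies strictly between 0 and 1, each gated dimension is a convex combination of the corresponding source and target values.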

Memory-enhanced Decoder for Neural Machine Translation

no code implementations • EMNLP 2016 • Mingxuan Wang, Zhengdong Lu, Hang Li, Qun Liu

We propose to enhance the RNN decoder in a neural machine translator (NMT) with external memory, as a natural but powerful extension to the state in the decoding RNN.

Tasks: Decoder, Machine Translation

Neural Machine Translation with External Phrase Memory

no code implementations • 6 Jun 2016 • Yaohua Tang, Fandong Meng, Zhengdong Lu, Hang Li, Philip L. H. Yu

In this paper, we propose phraseNet, a neural machine translator with a phrase memory which stores phrase pairs in symbolic form, mined from corpus or specified by human experts.

Tasks: Decoder, Machine Translation

Incorporating Copying Mechanism in Sequence-to-Sequence Learning

7 code implementations • ACL 2016 • Jiatao Gu, Zhengdong Lu, Hang Li, Victor O. K. Li

CopyNet can nicely integrate the regular way of word generation in the decoder with the new copying mechanism, which can choose sub-sequences in the input sequence and put them at proper places in the output sequence.

Tasks: Decoder, Text Summarization
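The mixture of generating and copying can be sketched by interpolating a vocabulary distribution with attention weights scattered onto source token ids (a minimal NumPy sketch; in CopyNet itself the mixing weight is computed by the network, whereas here `p_copy` is passed in for illustration):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def copy_mix(gen_logits, copy_logits, src_token_ids, vocab_size, p_copy):
    """Mix a generation distribution over the vocabulary with a copy
    distribution over source positions, scattered onto vocabulary ids."""
    p_gen = softmax(gen_logits)                    # (vocab,)
    copy_weights = softmax(copy_logits)            # (src_len,)
    p_copy_vocab = np.zeros(vocab_size)
    np.add.at(p_copy_vocab, src_token_ids, copy_weights)  # sums repeated ids
    return (1.0 - p_copy) * p_gen + p_copy * p_copy_vocab

vocab_size = 10
src_ids = np.array([3, 7, 3])                      # token 3 appears twice
p = copy_mix(np.zeros(vocab_size), np.zeros(3), src_ids, vocab_size, p_copy=0.5)
```

With uniform logits, token 3 receives the most mass because its two source occurrences accumulate copy probability.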

Modeling Coverage for Neural Machine Translation

3 code implementations • ACL 2016 • Zhaopeng Tu, Zhengdong Lu, Yang Liu, Xiaohua Liu, Hang Li

Attention mechanism has enhanced state-of-the-art Neural Machine Translation (NMT) by jointly learning to align and translate.

Tasks: Machine Translation, NMT
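The coverage idea can be sketched by accumulating past attention weights and penalizing already-covered source positions (a minimal NumPy sketch; the paper learns the coverage update, while here a fixed linear `penalty` stands in):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def coverage_attention(scores, coverage, penalty=1.0):
    """Subtract a penalty proportional to the accumulated coverage from the
    raw alignment scores, then update the coverage with the new weights."""
    weights = softmax(scores - penalty * coverage)
    return weights, coverage + weights

src_len = 4
coverage = np.zeros(src_len)
scores = np.array([2.0, 1.0, 0.0, 0.0])
w1, coverage = coverage_attention(scores, coverage)   # step 1
w2, coverage = coverage_attention(scores, coverage)   # step 2
```

Position 0 dominates at step 1, so its accumulated coverage lowers its weight at step 2, discouraging over-translation of the same source word.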

Neural Generative Question Answering

1 code implementation • WS 2016 • Jun Yin, Xin Jiang, Zhengdong Lu, Lifeng Shang, Hang Li, Xiaoming Li

Empirical study shows that the proposed model can effectively deal with variations in questions and answers, and generate correct and natural answers by referring to facts in the knowledge base.

Tasks: Decoder, Generative Question Answering

Neural Enquirer: Learning to Query Tables with Natural Language

no code implementations • 3 Dec 2015 • Pengcheng Yin, Zhengdong Lu, Hang Li, Ben Kao

Neural Enquirer can be trained with gradient descent, with which not only the parameters of the controlling and semantic-parsing components but also the embeddings of the tables and query words can be learned from scratch.

Tasks: Semantic Parsing

Towards Neural Network-based Reasoning

1 code implementation • 22 Aug 2015 • Baolin Peng, Zhengdong Lu, Hang Li, Kam-Fai Wong

For example, it improves the accuracy on Path Finding (10K) from 33.4% [6] to over 98%.

A Deep Memory-based Architecture for Sequence-to-Sequence Learning

no code implementations • 22 Jun 2015 • Fandong Meng, Zhengdong Lu, Zhaopeng Tu, Hang Li, Qun Liu

We propose DEEPMEMORY, a novel deep architecture for sequence-to-sequence learning, which performs the task through a series of nonlinear transformations from the representation of the input sequence (e.g., a Chinese sentence) to the final output sequence (e.g., translation to English).

Tasks: Machine Translation, Sentence

Learning to Answer Questions From Image Using Convolutional Neural Network

no code implementations • 1 Jun 2015 • Lin Ma, Zhengdong Lu, Hang Li

We demonstrate the efficacy of our proposed model on the DAQUAR and COCO-QA datasets, two benchmark datasets for image QA, with performance significantly outperforming the state of the art.

Tasks: General Classification, Question Answering

Self-Adaptive Hierarchical Sentence Model

1 code implementation • 20 Apr 2015 • Han Zhao, Zhengdong Lu, Pascal Poupart

The ability to accurately model a sentence at varying stages (e.g., word-phrase-sentence) plays a central role in natural language processing.

Tasks: General Classification, Sentence

genCNN: A Convolutional Architecture for Word Sequence Prediction

no code implementations • 17 Mar 2015 • Mingxuan Wang, Zhengdong Lu, Hang Li, Wenbin Jiang, Qun Liu

Different from previous work on neural network-based language modeling and generation (e.g., RNN or LSTM), we choose not to greedily summarize the history of words as a fixed-length vector.

Tasks: Language Modelling, Machine Translation

Context-Dependent Translation Selection Using Convolutional Neural Network

no code implementations • IJCNLP 2015 • Zhaopeng Tu, Baotian Hu, Zhengdong Lu, Hang Li

We propose a novel method for translation selection in statistical machine translation, in which a convolutional neural network is employed to judge the similarity between a phrase pair in two languages.

Tasks: Machine Translation, Semantic Similarity

Neural Responding Machine for Short-Text Conversation

4 code implementations • IJCNLP 2015 • Lifeng Shang, Zhengdong Lu, Hang Li

We propose Neural Responding Machine (NRM), a neural network-based response generator for Short-Text Conversation.

Tasks: Decoder, Retrieval

Syntax-based Deep Matching of Short Texts

no code implementations • 9 Mar 2015 • Mingxuan Wang, Zhengdong Lu, Hang Li, Qun Liu

Many tasks in natural language processing, ranging from machine translation to question answering, can be reduced to the problem of matching two sentences or more generally two short texts.

Tasks: Machine Translation, Question Answering

Encoding Source Language with Convolutional Neural Network for Machine Translation

no code implementations • IJCNLP 2015 • Fandong Meng, Zhengdong Lu, Mingxuan Wang, Hang Li, Wenbin Jiang, Qun Liu

The recently proposed neural network joint model (NNJM) (Devlin et al., 2014) augments the n-gram target language model with a heuristically chosen source context window, achieving state-of-the-art performance in SMT.

Tasks: Language Modelling, Machine Translation

A Parallel and Efficient Algorithm for Learning to Match

no code implementations • 22 Oct 2014 • Jingbo Shang, Tianqi Chen, Hang Li, Zhengdong Lu, Yong Yu

In this paper, we tackle this challenge with a novel parallel and efficient algorithm for feature-based matrix factorization.

Tasks: Collaborative Filtering, Link Prediction
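A plain (serial, feature-free) SGD matrix factorization baseline can be sketched as follows; the paper's contribution is a parallel, feature-based version of this kind of update, which the sketch does not reproduce (hyperparameters are arbitrary):

```python
import numpy as np

def sgd_mf(ratings, n_users, n_items, rank=4, lr=0.05, reg=0.02,
           epochs=200, seed=0):
    """Plain SGD matrix factorization: for each observed (user, item, value)
    triple, nudge the user and item factor rows to reduce squared error."""
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n_users, rank))
    V = 0.1 * rng.standard_normal((n_items, rank))
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - U[u] @ V[i]
            u_row = U[u].copy()                    # use pre-update row for V
            U[u] += lr * (err * V[i] - reg * U[u])
            V[i] += lr * (err * u_row - reg * V[i])
    return U, V

# Toy ratings: both users strongly prefer item 0 over item 1.
ratings = [(0, 0, 5.0), (0, 1, 1.0), (1, 0, 4.0), (1, 1, 1.0)]
U, V = sgd_mf(ratings, n_users=2, n_items=2)
```

After training, the inner products `U[u] @ V[i]` approximate the observed values, recovering the preference ordering.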

An Information Retrieval Approach to Short Text Conversation

1 code implementation • 29 Aug 2014 • Zongcheng Ji, Zhengdong Lu, Hang Li

Human computer conversation is regarded as one of the most difficult problems in artificial intelligence.

Tasks: Information Retrieval, Retrieval

A Deep Architecture for Matching Short Texts

no code implementations • NeurIPS 2013 • Zhengdong Lu, Hang Li

Many machine learning problems can be interpreted as learning for matching two types of objects (e.g., images and captions, users and products, queries and documents).

A Denoising View of Matrix Completion

no code implementations • NeurIPS 2011 • Weiran Wang, Miguel Á. Carreira-Perpiñán, Zhengdong Lu

In matrix completion, we are given a matrix where the values of only some of the entries are present, and we want to reconstruct the missing ones.

Tasks: Denoising, Matrix Completion
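A standard iterative low-rank completion baseline can be sketched as follows (hard-impute style: alternate an SVD truncation with re-imposing the observed entries; the paper's denoising algorithm differs, and `rank`/`iters` are arbitrary):

```python
import numpy as np

def complete_lowrank(M, observed, rank=1, iters=500):
    """Hard-impute: repeatedly replace the missing entries with the values
    of the best rank-`rank` approximation of the current estimate."""
    X = np.where(observed, M, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        X = np.where(observed, M, low_rank)        # keep observed entries fixed
    return X

# Rank-1 ground truth with one hidden entry.
truth = np.outer([1.0, 2.0, 3.0], [1.0, 2.0])      # [[1,2],[2,4],[3,6]]
observed = np.ones_like(truth, dtype=bool)
observed[2, 1] = False                             # hide the value 6
filled = complete_lowrank(truth, observed, rank=1)
```

The hidden entry is recovered because the iteration drives the estimate toward the rank-1 structure implied by the observed entries.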

Hierarchical Fisher Kernels for Longitudinal Data

no code implementations • NeurIPS 2008 • Zhengdong Lu, Jeffrey Kaye, Todd K. Leen

We develop new techniques for time series classification based on hierarchical Bayesian generative models (called mixed-effect models) and the Fisher kernel derived from them.

Tasks: General Classification, Time Series

People Tracking with the Laplacian Eigenmaps Latent Variable Model

no code implementations • NeurIPS 2007 • Zhengdong Lu, Cristian Sminchisescu, Miguel Á. Carreira-Perpiñán

Reliably recovering 3D human pose from monocular video requires constraints that bias the estimates towards typical human poses and motions.

Tasks: Dimensionality Reduction
