Search Results for author: Yi Chang

Found 56 papers, 15 papers with code

HiTRANS: A Hierarchical Transformer Network for Nested Named Entity Recognition

no code implementations Findings (EMNLP) 2021 Zhiwei Yang, Jing Ma, Hechang Chen, Yunke Zhang, Yi Chang

Specifically, we first utilize a two-phase module to generate span representations by aggregating context information based on a bottom-up and top-down transformer network.

Nested Named Entity Recognition Representation Learning

A Unified Collaborative Representation Learning for Neural-Network based Recommender Systems

no code implementations 19 May 2022 Yuanbo Xu, En Wang, Yongjian Yang, Yi Chang

On the other hand, ME models directly employ inner products as the default metric in the loss function, which cannot project users and items into a proper latent space; this is a methodological disadvantage.

Metric Learning Recommendation Systems +1
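
As a concrete illustration of the contrast drawn above, the sketch below compares an inner-product score with a distance-based (metric) score. The function names and scoring choices are illustrative assumptions, not the paper's model.

```python
# Illustrative contrast between an inner-product score (as in matrix
# factorization) and a metric/distance-based score; not the paper's model.
import numpy as np

def inner_product_score(user_vec, item_vec):
    # Not a metric: it can violate the triangle inequality.
    return float(user_vec @ item_vec)

def metric_score(user_vec, item_vec):
    # Negative Euclidean distance: larger means "closer" in the latent space.
    return -float(np.linalg.norm(user_vec - item_vec))

u, v = np.array([0.2, 0.9]), np.array([0.1, 0.8])
print(inner_product_score(u, v), metric_score(u, v))
```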

Example-based Explanations with Adversarial Attacks for Respiratory Sound Analysis

1 code implementation 30 Mar 2022 Yi Chang, Zhao Ren, Thanh Tam Nguyen, Wolfgang Nejdl, Björn W. Schuller

Respiratory sound classification is an important tool for remote screening of respiratory-related diseases such as pneumonia, asthma, and COVID-19.

Unsupervised Image Deraining: Optimization Model Driven Deep CNN

no code implementations 25 Mar 2022 Changfeng Yu, Yi Chang, Yi Li, Xi-Le Zhao, Luxin Yan

Consequently, we design an optimization model-driven deep CNN in which the unsupervised loss function of the optimization model is enforced on the proposed network for better generalization.

Rain Removal

Climate Change & Computer Audition: A Call to Action and Overview on Audio Intelligence to Help Save the Planet

no code implementations 10 Mar 2022 Björn W. Schuller, Alican Akman, Yi Chang, Harry Coppock, Alexander Gebhard, Alexander Kathan, Esther Rituerto-González, Andreas Triantafyllopoulos, Florian B. Pokorny

We categorise potential computer audition applications according to the five elements of earth, water, air, fire, and aether, proposed by the ancient Greeks in their five element theory; this categorisation serves as a framework to discuss computer audition in relation to different ecological aspects.

Robust Federated Learning Against Adversarial Attacks for Speech Emotion Recognition

no code implementations 9 Mar 2022 Yi Chang, Sofiane Laridi, Zhao Ren, Gregory Palmer, Björn W. Schuller, Marco Fisichella

The proposed framework consists of i) federated learning for data privacy, and ii) adversarial training at the training stage and randomisation at the testing stage for model robustness.

Federated Learning Speech Emotion Recognition
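
For context on the federated-learning component, here is a minimal FedAvg-style aggregation step (a standard weighted average of client parameters). The adversarial training and test-time randomisation described in the paper are not shown; this is a generic sketch, not the authors' implementation.

```python
# Generic FedAvg aggregation: weighted average of per-client parameter lists.
import numpy as np

def fedavg(client_params, client_sizes):
    """client_params: list of lists of numpy arrays (one list per client);
    client_sizes: number of local training samples per client."""
    total = float(sum(client_sizes))
    num_layers = len(client_params[0])
    return [sum(params[i] * (n / total)
                for params, n in zip(client_params, client_sizes))
            for i in range(num_layers)]
```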

Event-based Video Reconstruction via Potential-assisted Spiking Neural Network

no code implementations 25 Jan 2022 Lin Zhu, Xiao Wang, Yi Chang, Jianing Li, Tiejun Huang, Yonghong Tian

We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN), which utilizes Leaky Integrate-and-Fire (LIF) neurons and Membrane Potential (MP) neurons.

Image Reconstruction Video Reconstruction
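
Since the framework is built on LIF neurons, a minimal LIF update is sketched below to illustrate that neuron model. The decay and threshold constants are illustrative assumptions, not the paper's settings.

```python
# Minimal leaky integrate-and-fire (LIF) update: leak, integrate, spike, reset.
import numpy as np

def lif_step(v, x, decay=0.9, v_threshold=1.0):
    v = decay * v + x                           # leaky integration of the input
    spike = (v >= v_threshold).astype(x.dtype)  # fire where the threshold is crossed
    v = v * (1.0 - spike)                       # hard reset of spiking neurons
    return v, spike

v = np.zeros(4)
for x in [np.array([0.3, 0.6, 1.2, 0.0])] * 3:
    v, s = lif_step(v, x)
```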

IMBENS: Ensemble Class-imbalanced Learning in Python

1 code implementation 24 Nov 2021 Zhining Liu, Zhepei Wei, Erxin Yu, Qiang Huang, Kai Guo, Boyang Yu, Zhaonian Cai, Hangting Ye, Wei Cao, Jiang Bian, Pengfei Wei, Jing Jiang, Yi Chang

imbalanced-ensemble, abbreviated as imbens, is an open-source Python toolbox for quickly implementing and deploying ensemble learning algorithms on class-imbalanced data.

Ensemble Learning
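
A minimal usage sketch of the toolbox is given below, assuming its scikit-learn style fit/predict interface; the import path and estimator name (imbens.ensemble.SelfPacedEnsembleClassifier) should be checked against the package documentation.

```python
# Minimal usage sketch for imbens (API assumed to follow scikit-learn conventions).
from sklearn.datasets import make_classification
from imbens.ensemble import SelfPacedEnsembleClassifier  # name assumed; see the docs

X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=42)
clf = SelfPacedEnsembleClassifier(n_estimators=10, random_state=42)
clf.fit(X, y)
print(clf.predict(X[:5]))
```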

Towards Inter-class and Intra-class Imbalance in Class-imbalanced Learning

1 code implementation 24 Nov 2021 Zhining Liu, Pengfei Wei, Zhepei Wei, Boyang Yu, Jing Jiang, Wei Cao, Jiang Bian, Yi Chang

We also present a detailed discussion and analysis of the pros and cons of different inter/intra-class balancing strategies based on DUBE.

Ensemble Learning

CAP: Co-Adversarial Perturbation on Weights and Features for Improving Generalization of Graph Neural Networks

no code implementations 28 Oct 2021 Haotian Xue, Kaixiong Zhou, Tianlong Chen, Kai Guo, Xia Hu, Yi Chang, Xin Wang

In this paper, we investigate GNNs through the lens of weight and feature loss landscapes, i.e., how the loss changes with respect to model weights and node features, respectively.

Orthogonal Graph Neural Networks

no code implementations 23 Sep 2021 Kai Guo, Kaixiong Zhou, Xia Hu, Yu Li, Yi Chang, Xin Wang

Graph neural networks (GNNs) have received tremendous attention due to their superiority in learning node representations.

Graph Classification

Part2Word: Learning Joint Embedding of Point Clouds and Text by Matching Parts to Words

no code implementations 5 Jul 2021 Chuan Tang, Xi Yang, Bojian Wu, Zhizhong Han, Yi Chang

To resolve this issue, we propose a method to learn joint embedding of point clouds and text by matching parts from shapes to words from sentences in a common space.

Text Matching

Image Restoration for Remote Sensing: Overview and Toolbox

no code implementations 1 Jul 2021 Behnood Rasti, Yi Chang, Emanuele Dalsasso, Loïc Denis, Pedram Ghamisi

Additionally, this review paper is accompanied by a toolbox that provides a platform to encourage interested students and researchers in the field to further explore the restoration techniques and to fast-forward the community.

Image Restoration

Self-Supervised Nonlinear Transform-Based Tensor Nuclear Norm for Multi-Dimensional Image Recovery

no code implementations 29 May 2021 Yi-Si Luo, Xi-Le Zhao, Tai-Xiang Jiang, Yi Chang, Michael K. Ng, Chao Li

Recently, transform-based tensor nuclear norm minimization methods have been considered for capturing low-rank tensor structures to recover third-order tensors in multi-dimensional image processing applications.

Enhanced Doubly Robust Learning for Debiasing Post-click Conversion Rate Estimation

1 code implementation 28 May 2021 Siyuan Guo, Lixin Zou, Yiding Liu, Wenwen Ye, Suqi Cheng, Shuaiqiang Wang, Hechang Chen, Dawei Yin, Yi Chang

Based on it, a more robust doubly robust (MRDR) estimator has been proposed to further reduce its variance while retaining its double robustness.

Imputation Recommendation Systems +1
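
For background, the vanilla doubly robust estimator that MRDR refines combines an imputed error with an inverse-propensity-weighted correction on observed pairs. The sketch below implements that standard estimator (not the MRDR variance reduction itself), with array names chosen for illustration.

```python
# Standard doubly robust (DR) estimator of the average prediction error over
# all (user, item) pairs; MRDR further reduces its variance (not shown here).
import numpy as np

def dr_estimate(pred, label, observed, propensity, imputed_error):
    """observed: 1 where the conversion label is revealed (clicked pairs);
    propensity: estimated probability that a pair is observed."""
    true_error = (pred - label) ** 2          # only reliable where observed == 1
    correction = observed / propensity * (true_error - imputed_error)
    return np.mean(imputed_error + correction)
```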

Closing the Loop: Joint Rain Generation and Removal via Disentangled Image Translation

no code implementations CVPR 2021 Yuntong Ye, Yi Chang, Hanyu Zhou, Luxin Yan

Existing deep learning-based image deraining methods have achieved promising performance on synthetic rainy images, but they typically rely on pairs of sharp images and their simulated rainy counterparts.

Disentanglement Rain Removal +1

Using Prior Knowledge to Guide BERT's Attention in Semantic Textual Matching Tasks

1 code implementation 22 Feb 2021 Tingyu Xia, Yue Wang, Yuan Tian, Yi Chang

We study the problem of incorporating prior knowledge into a deep Transformer-based model, i.e., Bidirectional Encoder Representations from Transformers (BERT), to enhance its performance on semantic textual matching tasks.

Adversarial Active Learning based Heterogeneous Graph Neural Network for Fake News Detection

no code implementations 27 Jan 2021 Yuxiang Ren, Bo Wang, Jiawei Zhang, Yi Chang

AA-HGNN utilizes an active learning framework to enhance learning performance, especially when facing the paucity of labeled data.

Active Learning Fake News Detection +2

ToHRE: A Top-Down Classification Strategy with Hierarchical Bag Representation for Distantly Supervised Relation Extraction

no code implementations COLING 2020 Erxin Yu, Wenjuan Han, Yuan Tian, Yi Chang

Distantly Supervised Relation Extraction (DSRE) has proven to be effective to find relational facts from texts, but it still suffers from two main problems: the wrong labeling problem and the long-tail problem.

Classification Relation Extraction

MESA: Boost Ensemble Imbalanced Learning with MEta-SAmpler

2 code implementations NeurIPS 2020 Zhining Liu, Pengfei Wei, Jing Jiang, Wei Cao, Jiang Bian, Yi Chang

This makes MESA generally applicable to most of the existing learning models and the meta-sampler can be efficiently applied to new tasks.

imbalanced classification Meta-Learning

Unsupervised Hyperspectral Mixed Noise Removal Via Spatial-Spectral Constrained Deep Image Prior

no code implementations 22 Aug 2020 Yi-Si Luo, Xi-Le Zhao, Tai-Xiang Jiang, Yu-Bang Zheng, Yi Chang

Recently, convolutional neural network (CNN)-based methods have been proposed for hyperspectral image (HSI) denoising.

Denoising

Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion

1 code implementation 30 Apr 2020 Bo Wang, Tao Shen, Guodong Long, Tianyi Zhou, Yi Chang

In experiments, we achieve state-of-the-art performance on three benchmarks and a zero-shot dataset for link prediction, with highlights of inference costs reduced by 1-2 orders of magnitude compared to a textual encoding method.

Graph Embedding Knowledge Graph Completion +1

GraphLIME: Local Interpretable Model Explanations for Graph Neural Networks

2 code implementations 17 Jan 2020 Qiang Huang, Makoto Yamada, Yuan Tian, Dinesh Singh, Dawei Yin, Yi Chang

In this paper, we propose GraphLIME, a local interpretable model explanation for graphs using the Hilbert-Schmidt Independence Criterion (HSIC) Lasso, which is a nonlinear feature selection method.
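
The core of the method is HSIC Lasso feature selection. A compact sketch of that standard formulation with Gaussian kernels is given below; it is written from the textbook formulation, not taken from the GraphLIME code.

```python
# HSIC Lasso sketch: regress the centered output kernel on centered per-feature
# kernels with a nonnegative L1 penalty; nonzero weights mark selected features.
import numpy as np
from sklearn.linear_model import Lasso

def gaussian_kernel(v, sigma):
    diff = v[:, None] - v[None, :]
    return np.exp(-diff ** 2 / (2 * sigma ** 2 + 1e-12))

def hsic_lasso(X, y, alpha=0.01):
    n, d = X.shape
    H = np.eye(n) - np.ones((n, n)) / n                      # centering matrix
    L_bar = (H @ gaussian_kernel(y, y.std() + 1e-12) @ H).ravel()
    K_bars = np.stack([(H @ gaussian_kernel(X[:, j], X[:, j].std() + 1e-12) @ H).ravel()
                       for j in range(d)], axis=1)
    model = Lasso(alpha=alpha, positive=True, fit_intercept=False)
    model.fit(K_bars, L_bar)
    return model.coef_
```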

Self-paced Ensemble for Highly Imbalanced Massive Data Classification

1 code implementation 8 Sep 2019 Zhining Liu, Wei Cao, Zhifeng Gao, Jiang Bian, Hechang Chen, Yi Chang, Tie-Yan Liu

To tackle this problem, we conduct deep investigations into the nature of class imbalance, which reveal that not only the disproportion between classes but also other difficulties embedded in the data, especially noise and class overlap, prevent us from learning effective classifiers.

Classification General Classification +1

Classical Chinese Sentence Segmentation for Tomb Biographies of Tang Dynasty

no code implementations 28 Aug 2019 Chao-Lin Liu, Yi Chang

Chinese characters that are and are not followed by a punctuation mark are classified into two categories.

Sentence segmentation
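
The labelling scheme described above can be made concrete with a small helper that tags each character by whether a punctuation mark follows it in the punctuated reference text; the punctuation set below is an illustrative assumption.

```python
# Tag each non-punctuation character with 1 if a punctuation mark follows it,
# else 0 -- the two categories used for sentence segmentation.
PUNCT = set("，。、；：？！")  # illustrative set of classical Chinese punctuation

def label_characters(punctuated_text):
    chars, labels = [], []
    for i, ch in enumerate(punctuated_text):
        if ch in PUNCT:
            continue
        nxt = punctuated_text[i + 1] if i + 1 < len(punctuated_text) else ""
        chars.append(ch)
        labels.append(1 if nxt in PUNCT else 0)
    return chars, labels
```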

Jointly Modeling Hierarchical and Horizontal Features for Relational Triple Extraction

no code implementations 23 Aug 2019 Zhepei Wei, Yantao Jia, Yuan Tian, Mohammad Javad Hosseini, Sujian Li, Mark Steedman, Yi Chang

In this work, we first introduce the hierarchical dependency and horizontal commonality between the two levels, and then propose an entity-enhanced dual tagging framework that enables the triple extraction (TE) task to utilize such interactions with self-learned entity features through an auxiliary entity extraction (EE) task, without breaking the joint decoding of relational triples.

Entity Extraction using GAN graph construction +2

Generative Question Refinement with Deep Reinforcement Learning in Retrieval-based QA System

1 code implementation 13 Aug 2019 Ye Liu, Chenwei Zhang, Xiaohui Yan, Yi Chang, Philip S. Yu

To improve the quality and retrieval performance of the generated questions, we make two major improvements: 1) to better encode the semantics of ill-formed questions, we enrich the representation of questions with character embeddings and recently proposed contextual word embeddings such as BERT, in addition to the traditional context-free word embeddings; 2) to make the model capable of generating the desired questions, we train it with deep reinforcement learning techniques that treat an appropriate wording of the generation as an immediate reward and the correlation between the generated question and answer as a time-delayed long-term reward.

Question Answering reinforcement-learning +1

JIM: Joint Influence Modeling for Collective Search Behavior

no code implementations 1 Mar 2019 Shubhra Kanti Karmaker Santu, Liangda Li, Yi Chang, ChengXiang Zhai

This assumption is unrealistic as there are many correlated events in the real world which influence each other and thus, would pose a joint influence on the user search behavior rather than posing influence independently.

Gradient-Coherent Strong Regularization for Deep Neural Networks

no code implementations 20 Nov 2018 Dae Hoon Park, Chiu Man Ho, Yi Chang, Huaqing Zhang

However, we observe that imposing strong L1 or L2 regularization with stochastic gradient descent on deep neural networks easily fails, which limits the generalization ability of the underlying neural networks.

L2 Regularization
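
To make the setting concrete, the sketch below shows how L1 and L2 penalty gradients enter a plain SGD step; the paper's observation concerns what happens when these penalty coefficients are large, and its proposed gradient-coherent variant is not reproduced here.

```python
# Plain SGD step with L1/L2 penalty gradients added to the data gradient.
import numpy as np

def sgd_step(w, grad_loss, lr=0.1, l1=0.0, l2=0.0):
    grad = grad_loss + 2.0 * l2 * w + l1 * np.sign(w)  # data + penalty gradients
    return w - lr * grad
```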

Adversarial Sampling and Training for Semi-Supervised Information Retrieval

no code implementations 9 Nov 2018 Dae Hoon Park, Yi Chang

To solve the problems at the same time, we propose an adversarial sampling and training framework to learn ad-hoc retrieval models with implicit feedback.

Information Retrieval Question Answering

Sequenced-Replacement Sampling for Deep Learning

no code implementations ICLR 2019 Chiu Man Ho, Dae Hoon Park, Wei Yang, Yi Chang

We propose sequenced-replacement sampling (SRS) for training deep neural networks.

Rain Streak Removal for Single Image via Kernel Guided CNN

no code implementations 26 Aug 2018 Ye-Tao Wang, Xi-Le Zhao, Tai-Xiang Jiang, Liang-Jian Deng, Yi Chang, Ting-Zhu Huang

Then, our framework starts by learning the motion blur kernel, which is determined by two factors, angle and length, with a plain neural network (denoted the parameter net) from a patch of the texture component.
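
Since the parameter net is said to predict the kernel's angle and length, the sketch below shows how a linear motion-blur kernel can be constructed from those two factors; the kernel size and sampling density are illustrative choices, not the paper's.

```python
# Build a normalized linear motion-blur kernel from its length and angle.
import numpy as np

def motion_blur_kernel(length, angle_deg, size=15):
    kernel = np.zeros((size, size), dtype=np.float32)
    center = size // 2
    theta = np.deg2rad(angle_deg)
    for t in np.linspace(-length / 2.0, length / 2.0, num=8 * size):
        row = int(round(center + t * np.sin(theta)))
        col = int(round(center + t * np.cos(theta)))
        if 0 <= row < size and 0 <= col < size:
            kernel[row, col] = 1.0
    return kernel / kernel.sum()
```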

Abstract Meaning Representation for Paraphrase Detection

no code implementations NAACL 2018 Fuad Issa, Marco Damonte, Shay B. Cohen, Xiaohui Yan, Yi Chang

Abstract Meaning Representation (AMR) parsing aims at abstracting away from the syntactic realization of a sentence and denoting only its meaning in a canonical form.

AMR Parsing

Contextual and Position-Aware Factorization Machines for Sentiment Classification

no code implementations 18 Jan 2018 Shuai Wang, Mianwei Zhou, Geli Fei, Yi Chang, Bing Liu

While existing machine learning models have achieved great success for sentiment classification, they typically do not explicitly capture sentiment-oriented word interaction, which can lead to poor results for fine-grained analysis at the snippet level (a phrase or sentence).

Classification General Classification +3

Achieving Strong Regularization for Deep Neural Networks

no code implementations ICLR 2018 Dae Hoon Park, Chiu Man Ho, Yi Chang

L1 and L2 regularizers are critical tools in machine learning due to their ability to simplify solutions.

L2 Regularization

Transformed Low-Rank Model for Line Pattern Noise Removal

no code implementations ICCV 2017 Yi Chang, Luxin Yan, Sheng Zhong

This paper addresses the problem of removing line-pattern noise, such as rain streaks and hyperspectral stripes, from a single image.

Weighted Low-rank Tensor Recovery for Hyperspectral Image Restoration

no code implementations 1 Sep 2017 Yi Chang, Luxin Yan, Houzhang Fang, Sheng Zhong, Zhijun Zhang

To overcome these limitations, in this work we propose a unified low-rank tensor recovery model for comprehensive HSI restoration tasks, in which non-local similarity between spectral-spatial cubes and spectral correlation are simultaneously captured by third-order tensors.

Deblurring Denoising +2

Hyper-Laplacian Regularized Unidirectional Low-Rank Tensor Recovery for Multispectral Image Denoising

no code implementations CVPR 2017 Yi Chang, Luxin Yan, Sheng Zhong

Recent low-rank based matrix/tensor recovery methods have been widely explored in multispectral images (MSI) denoising.

Image Denoising

Attributed Network Embedding for Learning in a Dynamic Environment

no code implementations 6 Jun 2017 Jundong Li, Harsh Dani, Xia Hu, Jiliang Tang, Yi Chang, Huan Liu

To the best of our knowledge, we are the first to tackle this problem with the following two challenges: (1) the inherently correlated network and node attributes could be noisy and incomplete, which necessitates a robust consensus representation to capture their individual properties and correlations; (2) the embedding learning needs to be performed in an online fashion to adapt to the changes accordingly.

Link Prediction Network Embedding +1

Streaming Recommender Systems

no code implementations 21 Jul 2016 Shiyu Chang, Yang Zhang, Jiliang Tang, Dawei Yin, Yi Chang, Mark A. Hasegawa-Johnson, Thomas S. Huang

The increasing popularity of real-world recommender systems produces data continuously and rapidly, and it becomes more realistic to study recommender systems under streaming scenarios.

Recommendation Systems

Scaling Submodular Maximization via Pruned Submodularity Graphs

no code implementations 1 Jun 2016 Tianyi Zhou, Hua Ouyang, Yi Chang, Jeff Bilmes, Carlos Guestrin

We propose a new random pruning method (called "submodular sparsification (SS)") to reduce the cost of submodular maximization.

Video Summarization

A Survey of Signed Network Mining in Social Media

no code implementations 24 Nov 2015 Jiliang Tang, Yi Chang, Charu Aggarwal, Huan Liu

Many real-world relations can be represented by signed networks with positive and negative links, as a result of which signed network analysis has attracted increasing attention from multiple disciplines.

Convex Factorization Machine for Regression

1 code implementation 4 Jul 2015 Makoto Yamada, Wenzhao Lian, Amit Goyal, Jianhui Chen, Kishan Wimalawarne, Suleiman A. Khan, Samuel Kaski, Hiroshi Mamitsuka, Yi Chang

We propose the convex factorization machine (CFM), which is a convex variant of the widely used Factorization Machines (FMs).
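
For reference, the standard second-order factorization machine score that the CFM convexifies is sketched below; in the CFM, the rank-k factor product V V^T is replaced by a full matrix with a nuclear-norm penalty, which this sketch does not include.

```python
# Second-order factorization machine score using the O(dk) pairwise identity.
import numpy as np

def fm_score(x, w0, w, V):
    """x: (d,) features, w0: bias, w: (d,) linear weights, V: (d, k) factors."""
    linear = w0 + w @ x
    pairwise = 0.5 * (np.sum((V.T @ x) ** 2) - np.sum((V ** 2).T @ (x ** 2)))
    return linear + pairwise
```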

Consistent Collective Matrix Completion under Joint Low Rank Structure

no code implementations 5 Dec 2014 Suriya Gunasekar, Makoto Yamada, Dawei Yin, Yi Chang

We address the collective matrix completion problem of jointly recovering a collection of matrices with shared structure from partial (and potentially noisy) observations.

Matrix Completion

N³LARS: Minimum Redundancy Maximum Relevance Feature Selection for Large and High-dimensional Data

no code implementations 10 Nov 2014 Makoto Yamada, Avishek Saha, Hua Ouyang, Dawei Yin, Yi Chang

We propose a feature selection method that finds non-redundant features from large and high-dimensional data in a nonlinear way.

Distributed Computing

Optimal Stochastic Strongly Convex Optimization with a Logarithmic Number of Projections

no code implementations 19 Apr 2013 Jianhui Chen, Tianbao Yang, Qihang Lin, Lijun Zhang, Yi Chang

We consider stochastic strongly convex optimization with a complex inequality constraint.
