no code implementations • Findings (EMNLP) 2021 • Zhiwei Yang, Jing Ma, Hechang Chen, Yunke Zhang, Yi Chang
Specifically, we first utilize a two-phase module to generate span representations by aggregating context information based on a bottom-up and top-down transformer network.
no code implementations • 19 May 2022 • Yuanbo Xu, En Wang, Yongjian Yang, Yi Chang
On the other hand, ME models directly employ the inner product as the default similarity metric, which cannot project users and items into a proper latent space; this is a methodological disadvantage.
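The metric-learning objection to inner products can be made concrete with a toy example: dot-product similarity is not transitive and fails the triangle inequality, while Euclidean distance over the same (hypothetical) vectors behaves properly. A minimal sketch:

```python
import numpy as np

# x is similar to y, y is similar to z, yet x and z are orthogonal:
# dot-product "similarity" does not induce a proper metric space.
x, y, z = np.array([1.0, 0.0]), np.array([1.0, 1.0]), np.array([0.0, 1.0])
print(x @ y, y @ z, x @ z)  # 1.0 1.0 0.0 -- similarity is not transitive

# Euclidean distances over the same vectors obey the triangle inequality.
print(np.linalg.norm(x - y), np.linalg.norm(y - z), np.linalg.norm(x - z))
# 1.0 1.0 1.414... <= 1.0 + 1.0
```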
1 code implementation • 30 Mar 2022 • Yi Chang, Zhao Ren, Thanh Tam Nguyen, Wolfgang Nejdl, Björn W. Schuller
Respiratory sound classification is an important tool for remote screening of respiratory-related diseases such as pneumonia, asthma, and COVID-19.
no code implementations • 25 Mar 2022 • Changfeng Yu, Yi Chang, Yi Li, XiLe Zhao, Luxin Yan
Consequently, we design an optimization model-driven deep CNN in which the unsupervised loss function of the optimization model is enforced on the proposed network for better generalization.
no code implementations • 10 Mar 2022 • Björn W. Schuller, Alican Akman, Yi Chang, Harry Coppock, Alexander Gebhard, Alexander Kathan, Esther Rituerto-González, Andreas Triantafyllopoulos, Florian B. Pokorny
We categorise potential computer audition applications according to the five elements of earth, water, air, fire, and aether, proposed by the ancient Greeks in their five element theory; this categorisation serves as a framework to discuss computer audition in relation to different ecological aspects.
no code implementations • 9 Mar 2022 • Yi Chang, Sofiane Laridi, Zhao Ren, Gregory Palmer, Björn W. Schuller, Marco Fisichella
The proposed framework consists of i) federated learning for data privacy, and ii) adversarial training at the training stage and randomisation at the testing stage for model robustness.
no code implementations • 25 Jan 2022 • Lin Zhu, Xiao Wang, Yi Chang, Jianing Li, Tiejun Huang, Yonghong Tian
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN), which utilizes Leaky-Integrate-and-Fire (LIF) neurons and Membrane Potential (MP) neurons.
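A minimal sketch of the Leaky-Integrate-and-Fire dynamics the entry refers to; the decay, threshold, and reset behaviour below are illustrative defaults, not the paper's exact parameters:

```python
def lif_neuron(currents, decay=0.9, threshold=1.0):
    """Simulate one Leaky-Integrate-and-Fire neuron over a sequence of
    input currents: the membrane potential leaks by `decay`, integrates
    the input, and emits a spike (then resets) on crossing `threshold`."""
    v, spikes = 0.0, []
    for i_t in currents:
        v = decay * v + i_t      # leaky integration
        if v >= threshold:
            spikes.append(1)
            v = 0.0              # hard reset after firing
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.3]))  # [0, 0, 1, 0, 0, 1]
```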
1 code implementation • 24 Nov 2021 • Zhining Liu, Zhepei Wei, Erxin Yu, Qiang Huang, Kai Guo, Boyang Yu, Zhaonian Cai, Hangting Ye, Wei Cao, Jiang Bian, Pengfei Wei, Jing Jiang, Yi Chang
imbalanced-ensemble, abbreviated as imbens, is an open-source Python toolbox for quickly implementing and deploying ensemble learning algorithms on class-imbalanced data.
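A minimal usage sketch, assuming the scikit-learn-style fit/predict interface the toolbox advertises; the class name below is taken from the project's documentation and may differ across imbens versions:

```python
from sklearn.datasets import make_classification
from imbens.ensemble import SelfPacedEnsembleClassifier  # pip install imbalanced-ensemble

# A 9:1 imbalanced binary problem.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
clf = SelfPacedEnsembleClassifier(random_state=0).fit(X, y)
print(clf.predict(X[:5]))
```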
1 code implementation • 24 Nov 2021 • Zhining Liu, Pengfei Wei, Zhepei Wei, Boyang Yu, Jing Jiang, Wei Cao, Jiang Bian, Yi Chang
We also present a detailed discussion and analysis of the pros and cons of different inter/intra-class balancing strategies based on DUBE.
no code implementations • 28 Oct 2021 • Haotian Xue, Kaixiong Zhou, Tianlong Chen, Kai Guo, Xia Hu, Yi Chang, Xin Wang
In this paper, we investigate GNNs from the lens of weight and feature loss landscapes, i.e., the loss changes with respect to model weights and node features, respectively.
no code implementations • 23 Sep 2021 • Kai Guo, Kaixiong Zhou, Xia Hu, Yu Li, Yi Chang, Xin Wang
Graph neural networks (GNNs) have received tremendous attention due to their superiority in learning node representations.
1 code implementation • Findings (EMNLP) 2021 • Bo Wang, Tao Shen, Guodong Long, Tianyi Zhou, Yi Chang
Aspect-level sentiment classification (ALSC) aims at identifying the sentiment polarity of a specified aspect in a sentence.
no code implementations • 5 Jul 2021 • Chuan Tang, Xi Yang, Bojian Wu, Zhizhong Han, Yi Chang
To resolve this issue, we propose a method to learn joint embedding of point clouds and text by matching parts from shapes to words from sentences in a common space.
no code implementations • 1 Jul 2021 • Behnood Rasti, Yi Chang, Emanuele Dalsasso, Loïc Denis, Pedram Ghamisi
Additionally, this review paper is accompanied by a toolbox that provides a platform to encourage interested students and researchers in the field to further explore the restoration techniques and fast-forward the community.
no code implementations • 29 May 2021 • Yi-Si Luo, Xi-Le Zhao, Tai-Xiang Jiang, Yi Chang, Michael K. Ng, Chao Li
Recently, transform-based tensor nuclear norm minimization methods have been considered for capturing low-rank tensor structures when recovering third-order tensors in multi-dimensional image processing applications.
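With the DFT as the transform, the tensor nuclear norm in question reduces to averaging the matrix nuclear norms of the frontal slices in the Fourier domain. A compact numpy sketch of that special case (the paper's interest is in more general transforms):

```python
import numpy as np

def dft_tensor_nuclear_norm(T):
    """Tensor nuclear norm of a third-order tensor under the DFT:
    transform along mode 3, then average the nuclear norms of the
    frontal slices in the transformed domain (t-SVD-based TNN)."""
    T_hat = np.fft.fft(T, axis=2)
    n3 = T.shape[2]
    return sum(np.linalg.norm(T_hat[:, :, k], 'nuc') for k in range(n3)) / n3

T = np.random.rand(4, 4, 3)
print(dft_tensor_nuclear_norm(T))
```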
1 code implementation • 28 May 2021 • Siyuan Guo, Lixin Zou, Yiding Liu, Wenwen Ye, Suqi Cheng, Shuaiqiang Wang, Hechang Chen, Dawei Yin, Yi Chang
Based on it, a more robust doubly robust (MRDR) estimator has been proposed to further reduce its variance while retaining its double robustness.
no code implementations • CVPR 2021 • Yuntong Ye, Yi Chang, Hanyu Zhou, Luxin Yan
Existing deep learning-based image deraining methods have achieved promising performance on synthetic rainy images, but they typically rely on pairs of sharp images and their simulated rainy counterparts.
1 code implementation • 22 Feb 2021 • Tingyu Xia, Yue Wang, Yuan Tian, Yi Chang
We study the problem of incorporating prior knowledge into a deep Transformer-based model, i.e., Bidirectional Encoder Representations from Transformers (BERT), to enhance its performance on semantic textual matching tasks.
no code implementations • 27 Jan 2021 • Yuxiang Ren, Bo Wang, Jiawei Zhang, Yi Chang
AA-HGNN utilizes an active learning framework to enhance learning performance, especially when facing the paucity of labeled data.
no code implementations • ICLR 2021 • Xiaobo Xia, Tongliang Liu, Bo Han, Chen Gong, Nannan Wang, ZongYuan Ge, Yi Chang
The \textit{early stopping} method therefore can be exploited for learning with noisy labels.
Ranked #26 on Image Classification on mini WebVision 1.0 (ImageNet Top-1 Accuracy metric)
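A generic early-stopping loop of the kind this finding licenses: keep the parameters with the best held-out loss and stop once a patience budget is exhausted. A sketch with user-supplied train_step/val_loss_fn callables, not the paper's exact stopping criterion:

```python
import copy

def train_with_early_stopping(model, train_step, val_loss_fn,
                              max_epochs=100, patience=5):
    """Stop once the validation loss has not improved for `patience`
    epochs, returning the best model seen so far (before the network
    starts memorising noisy labels)."""
    best_loss, bad_epochs = float('inf'), 0
    best_model = copy.deepcopy(model)
    for _ in range(max_epochs):
        train_step(model)
        loss = val_loss_fn(model)
        if loss < best_loss:
            best_loss, bad_epochs = loss, 0
            best_model = copy.deepcopy(model)
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break
    return best_model
```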
no code implementations • COLING 2020 • Erxin Yu, Wenjuan Han, Yuan Tian, Yi Chang
Distantly Supervised Relation Extraction (DSRE) has proven to be effective to find relational facts from texts, but it still suffers from two main problems: the wrong labeling problem and the long-tail problem.
2 code implementations • NeurIPS 2020 • Zhining Liu, Pengfei Wei, Jing Jiang, Wei Cao, Jiang Bian, Yi Chang
This makes MESA generally applicable to most of the existing learning models and the meta-sampler can be efficiently applied to new tasks.
no code implementations • 22 Aug 2020 • Yi-Si Luo, Xi-Le Zhao, Tai-Xiang Jiang, Yu-Bang Zheng, Yi Chang
Recently, convolutional neural network (CNN)-based methods have been proposed for hyperspectral image (HSI) denoising.
1 code implementation • 30 Apr 2020 • Bo Wang, Tao Shen, Guodong Long, Tianyi Zhou, Yi Chang
In experiments, we achieve state-of-the-art performance on three benchmarks and a zero-shot dataset for link prediction, with highlights of inference costs reduced by 1-2 orders of magnitude compared to a textual encoding method.
Ranked #2 on Link Prediction on UMLS
2 code implementations • 17 Jan 2020 • Qiang Huang, Makoto Yamada, Yuan Tian, Dinesh Singh, Dawei Yin, Yi Chang
In this paper, we propose GraphLIME, a local interpretable model explanation for graphs using the Hilbert-Schmidt Independence Criterion (HSIC) Lasso, which is a nonlinear feature selection method.
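The HSIC Lasso at GraphLIME's core can be sketched compactly: regress the centered kernel matrix of the output on the per-feature centered kernel matrices under a non-negative L1 penalty; non-zero coefficients mark relevant features. A toy sketch (Gram-matrix normalisation omitted for brevity; GraphLIME applies this within a node's sampled neighbourhood):

```python
import numpy as np
from sklearn.linear_model import Lasso

def centered_rbf_gram(x, gamma=1.0):
    """Centered RBF Gram matrix of one variable of shape (n,)."""
    K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def hsic_lasso(X, y, alpha=0.01):
    """Select features whose kernels best explain the output kernel."""
    n, d = X.shape
    L = centered_rbf_gram(y).ravel()
    K = np.column_stack([centered_rbf_gram(X[:, j]).ravel() for j in range(d)])
    return Lasso(alpha=alpha, positive=True, fit_intercept=False).fit(K, L).coef_

rng = np.random.default_rng(0)
X = rng.random((50, 5))
y = X[:, 0] ** 2                     # only feature 0 is relevant
print(hsic_lasso(X, y).round(3))     # expect a non-zero weight on feature 0
```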
1 code implementation • 8 Sep 2019 • Zhining Liu, Wei Cao, Zhifeng Gao, Jiang Bian, Hechang Chen, Yi Chang, Tie-Yan Liu
To tackle this problem, we conduct deep investigations into the nature of class imbalance, which reveal that not only the disproportion between classes but also other difficulties embedded in the nature of the data, especially noise and class overlap, prevent us from learning effective classifiers.
4 code implementations • ACL 2020 • Zhepei Wei, Jianlin Su, Yue Wang, Yuan Tian, Yi Chang
Extracting relational triples from unstructured text is crucial for large-scale knowledge graph construction.
Ranked #5 on Relation Extraction on NYT10-HRL
no code implementations • 28 Aug 2019 • Chao-Lin Liu, Yi Chang
Chinese characters are classified into two categories according to whether or not they are followed by a punctuation mark.
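That labeling scheme is easy to make concrete: drop the punctuation and label each remaining character by whether a punctuation mark originally followed it, turning sentence segmentation into per-character binary classification. A sketch with an illustrative punctuation set:

```python
PUNCT = set("，。、；：？！")

def char_labels(text):
    """Return (characters, labels): label 1 if the character was
    followed by a punctuation mark in the original text, else 0."""
    chars, labels = [], []
    for i, ch in enumerate(text):
        if ch in PUNCT:
            continue
        nxt = text[i + 1] if i + 1 < len(text) else ""
        chars.append(ch)
        labels.append(1 if nxt in PUNCT else 0)
    return chars, labels

print(char_labels("學而時習之，不亦說乎。"))
```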
no code implementations • 23 Aug 2019 • Zhepei Wei, Yantao Jia, Yuan Tian, Mohammad Javad Hosseini, Sujian Li, Mark Steedman, Yi Chang
In this work, we first introduce the hierarchical dependency and horizontal commonality between the two levels, and then propose an entity-enhanced dual tagging framework that enables the triple extraction (TE) task to utilize such interactions with self-learned entity features through an auxiliary entity extraction (EE) task, without breaking the joint decoding of relational triples.
1 code implementation • 13 Aug 2019 • Ye Liu, Chenwei Zhang, Xiaohui Yan, Yi Chang, Philip S. Yu
To improve the quality and retrieval performance of the generated questions, we make two major improvements: 1) to better encode the semantics of ill-formed questions, we enrich the representation of questions with character embeddings and recently proposed contextual word embeddings such as BERT, in addition to traditional context-free word embeddings; 2) to make the model capable of generating the desired questions, we train it with deep reinforcement learning techniques that consider an appropriate wording of the generation as an immediate reward and the correlation between the generated question and the answer as a time-delayed long-term reward.
no code implementations • 1 Mar 2019 • Shubhra Kanti Karmaker Santu, Liangda Li, Yi Chang, ChengXiang Zhai
This assumption is unrealistic, as there are many correlated events in the real world which influence each other and thus pose a joint influence on user search behavior rather than influencing it independently.
no code implementations • 20 Nov 2018 • Dae Hoon Park, Chiu Man Ho, Yi Chang, Huaqing Zhang
However, we observe that imposing strong L1 or L2 regularization with stochastic gradient descent on deep neural networks easily fails, which limits the generalization ability of the underlying neural networks.
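The failure mode, and the textbook remedy, are easy to show side by side: a naive subgradient step keeps weights oscillating around zero and rarely produces exact sparsity, whereas a proximal (soft-thresholding) step does. A sketch for contrast only; the paper proposes its own fix:

```python
import numpy as np

def l1_step_naive(w, grad, lr, lam):
    """Subgradient step for L1: gradient noise keeps weights
    hovering near (but almost never exactly at) zero."""
    return w - lr * (grad + lam * np.sign(w))

def l1_step_proximal(w, grad, lr, lam):
    """Proximal step: plain gradient update, then shrink toward
    zero, which yields exact zeros (true sparsity)."""
    w = w - lr * grad
    return np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
```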
no code implementations • 9 Nov 2018 • Dae Hoon Park, Yi Chang
To solve the problems at the same time, we propose an adversarial sampling and training framework to learn ad-hoc retrieval models with implicit feedback.
no code implementations • ICLR 2019 • Chiu Man Ho, Dae Hoon Park, Wei Yang, Yi Chang
We propose sequenced-replacement sampling (SRS) for training deep neural networks.
6 code implementations • EMNLP 2018 • Congying Xia, Chenwei Zhang, Xiaohui Yan, Yi Chang, Philip S. Yu
User intent detection plays a critical role in question-answering and dialog systems.
no code implementations • 26 Aug 2018 • Ye-Tao Wang, Xi-Le Zhao, Tai-Xiang Jiang, Liang-Jian Deng, Yi Chang, Ting-Zhu Huang
Then, our framework starts with learning the motion blur kernel, which is determined by two factors, angle and length, via a plain neural network, denoted as the parameter net, from a patch of the texture component.
1 code implementation • ACL 2018 • Shashi Narayan, Ronald Cardenas, Nikos Papasarantopoulos, Shay B. Cohen, Mirella Lapata, Jiangsheng Yu, Yi Chang
Document modeling is essential to a variety of natural language understanding tasks.
no code implementations • ACL 2018 • Shuai Wang, Sahisnu Mazumder, Bing Liu, Mianwei Zhou, Yi Chang
In MNs, attention mechanism plays a crucial role in detecting the sentiment context for the given target.
no code implementations • NAACL 2018 • Fuad Issa, Marco Damonte, Shay B. Cohen, Xiaohui Yan, Yi Chang
Abstract Meaning Representation (AMR) parsing aims to abstract away from the syntactic realization of a sentence and denote only its meaning in a canonical form.
no code implementations • 16 Feb 2018 • Shuai Wang, Mianwei Zhou, Sahisnu Mazumder, Bing Liu, Yi Chang
Stage one extracts/groups the target-related words (called t-words) for a given target.
no code implementations • 18 Jan 2018 • Shuai Wang, Mianwei Zhou, Geli Fei, Yi Chang, Bing Liu
While existing machine learning models have achieved great success for sentiment classification, they typically do not explicitly capture sentiment-oriented word interaction, which can lead to poor results for fine-grained analysis at the snippet level (a phrase or sentence).
no code implementations • ICLR 2018 • Dae Hoon Park, Chiu Man Ho, Yi Chang
L1 and L2 regularizers are critical tools in machine learning due to their ability to simplify solutions.
no code implementations • ICCV 2017 • Yi Chang, Luxin Yan, Sheng Zhong
This paper addresses the problem of line pattern noise removal from a single image, such as rain streak, hyperspectral stripe and so on.
no code implementations • 1 Sep 2017 • Yi Chang, Luxin Yan, Houzhang Fang, Sheng Zhong, Zhijun Zhang
To overcome these limitations, in this work we propose a unified low-rank tensor recovery model for comprehensive HSI restoration tasks, in which the non-local similarity between spectral-spatial cubes and the spectral correlation are simultaneously captured by third-order tensors.
no code implementations • CVPR 2017 • Yi Chang, Luxin Yan, Sheng Zhong
Recent low-rank based matrix/tensor recovery methods have been widely explored in multispectral images (MSI) denoising.
no code implementations • 6 Jun 2017 • Jundong Li, Harsh Dani, Xia Hu, Jiliang Tang, Yi Chang, Huan Liu
To our best knowledge, we are the first to tackle this problem with the following two challenges: (1) the inherently correlated network and node attributes could be noisy and incomplete, which necessitates a robust consensus representation to capture their individual properties and correlations; (2) the embedding learning needs to be performed in an online fashion to adapt to the changes accordingly.
no code implementations • 14 Aug 2016 • Makoto Yamada, Jiliang Tang, Jose Lugo-Martinez, Ermin Hodzic, Raunak Shrestha, Avishek Saha, Hua Ouyang, Dawei Yin, Hiroshi Mamitsuka, Cenk Sahinalp, Predrag Radivojac, Filippo Menczer, Yi Chang
However, sophisticated learning models are computationally infeasible for data with millions of features.
no code implementations • 21 Jul 2016 • Shiyu Chang, Yang Zhang, Jiliang Tang, Dawei Yin, Yi Chang, Mark A. Hasegawa-Johnson, Thomas S. Huang
The increasing popularity of real-world recommender systems produces data continuously and rapidly, and it becomes more realistic to study recommender systems under streaming scenarios.
no code implementations • 21 Jul 2016 • Yilin Wang, Suhang Wang, Jiliang Tang, Neil O'Hare, Yi Chang, Baoxin Li
Understanding human actions in wild videos is an important task with a broad range of applications.
no code implementations • 1 Jun 2016 • Tianyi Zhou, Hua Ouyang, Yi Chang, Jeff Bilmes, Carlos Guestrin
We propose a new random pruning method (called "submodular sparsification (SS)") to reduce the cost of submodular maximization.
no code implementations • 24 Nov 2015 • Jiliang Tang, Yi Chang, Charu Aggarwal, Huan Liu
Many real-world relations can be represented by signed networks with positive and negative links, as a result of which signed network analysis has attracted increasing attention from multiple disciplines.
1 code implementation • 4 Jul 2015 • Makoto Yamada, Wenzhao Lian, Amit Goyal, Jianhui Chen, Kishan Wimalawarne, Suleiman A. Khan, Samuel Kaski, Hiroshi Mamitsuka, Yi Chang
We propose the convex factorization machine (CFM), which is a convex variant of the widely used Factorization Machines (FMs).
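The convexification can be stated in one line: replace the factorized interaction term of the FM with a full symmetric matrix regularized by its nuclear norm, which makes the objective jointly convex. A sketch of the reformulation as commonly presented (the loss ℓ and the exact treatment of diagonal terms follow the paper):

```latex
\hat{y}_{\mathrm{FM}}(x) = w_0 + w^{\top}x + \sum_{i<j} \langle v_i, v_j \rangle x_i x_j
\;\longrightarrow\;
\hat{y}_{\mathrm{CFM}}(x) = w_0 + w^{\top}x + \tfrac{1}{2}\, x^{\top} Z x,
\qquad
\min_{w,\,Z}\ \sum_{n} \ell\big(y_n, \hat{y}_{\mathrm{CFM}}(x_n)\big) + \lambda \lVert Z \rVert_{*}
```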
no code implementations • 5 Dec 2014 • Suriya Gunasekar, Makoto Yamada, Dawei Yin, Yi Chang
We address the collective matrix completion problem of jointly recovering a collection of matrices with shared structure from partial (and potentially noisy) observations.
no code implementations • 10 Nov 2014 • Makoto Yamada, Avishek Saha, Hua Ouyang, Dawei Yin, Yi Chang
We propose a feature selection method that finds non-redundant features in large, high-dimensional data in a nonlinear way.
no code implementations • 19 Apr 2013 • Jianhui Chen, Tianbao Yang, Qihang Lin, Lijun Zhang, Yi Chang
We consider stochastic strongly convex optimization with a complex inequality constraint.