no code implementations • ECCV 2020 • Tan Yu, Yunfeng Cai, Ping Li
To boost efficiency on the GPU platform, recent methods rely on the Newton-Schulz (NS) iteration to approximate the matrix square root.
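For reference, a minimal NumPy sketch of the coupled Newton-Schulz iteration for a symmetric positive-definite matrix; the function name, iteration count, and pre-scaling choice are illustrative and not taken from the paper.

```python
import numpy as np

def newton_schulz_sqrt(A, num_iters=10):
    """Approximate the square root of an SPD matrix A via coupled Newton-Schulz iteration."""
    n = A.shape[0]
    norm = np.linalg.norm(A)          # pre-scale so the iteration converges
    Y, Z, I = A / norm, np.eye(n), np.eye(n)
    for _ in range(num_iters):
        T = 0.5 * (3.0 * I - Z @ Y)   # coupled NS update
        Y = Y @ T                     # Y -> (A / norm)^{1/2}
        Z = T @ Z                     # Z -> (A / norm)^{-1/2}
    return Y * np.sqrt(norm)          # undo the pre-scaling
```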
no code implementations • ACL 2022 • Xin Wang, Minlong Peng, Mingming Sun, Ping Li
OIE@OIA follows the methodology of Open Information eXpression (OIX): parsing a sentence into an Open Information Annotation (OIA) Graph and then adapting the OIA graph to different OIE tasks with simple rules.
no code implementations • Findings (EMNLP) 2021 • Dingcheng Li, Hongliang Fei, Shaogang Ren, Ping Li
Recently, disentanglement based on a generative adversarial network or a variational autoencoder has significantly advanced the performance of diverse applications in CV and NLP domains.
no code implementations • EMNLP 2021 • Haoliang Liu, Tan Yu, Ping Li
Through an inflating operation followed by a shrinking operation, both efficiency and accuracy of a late-interaction model are boosted.
no code implementations • EMNLP 2020 • Mingming Sun, Wenyue Hua, Zoey Liu, Xin Wang, Kangjie Zheng, Ping Li
Built on the same OIX platform, the OIE strategies are reusable, and one can select a set of strategies to assemble an algorithm for a specific task, so that adaptability may be significantly increased.
no code implementations • ICML 2020 • Hang Zhang, Ping Li
Unlabeled linear regression, or ``linear regression with an unknown permutation'', has attracted increasing attention due to its applications in record linkage and de-anonymization.
no code implementations • Findings (NAACL) 2022 • Yue Zhang, Hongliang Fei, Dingcheng Li, Ping Li
Recently, prompt learning has received significant attention, where the downstream tasks are reformulated to the mask-filling task with the help of a textual prompt.
no code implementations • Findings (NAACL) 2022 • Yue Feng, Zhen Han, Mingming Sun, Ping Li
DEHG employs a graph constructor to integrate structured and unstructured information, a context encoder to represent nodes and the question, a heterogeneous information reasoning layer to conduct multi-hop reasoning over both information sources, and an answer decoder to generate answers to the question.
no code implementations • Findings (NAACL) 2022 • Jiaheng Liu, Tan Yu, Hanyu Peng, Mingming Sun, Ping Li
Existing multilingual video corpus moment retrieval (mVCMR) methods are mainly based on a two-stream structure.
no code implementations • NAACL 2022 • Haiyan Yin, Dingcheng Li, Ping Li
In this paper, we propose a new weakly supervised paraphrase generation approach that extends the success of a recent work that leverages reinforcement learning for effective model training with data selection.
no code implementations • 21 Mar 2023 • Yang Zhao, Jianwen Xie, Ping Li
The proposed algorithm consists of two learning stages: (i) Cooperative initialization stage: The discriminator of GAN is treated as an energy-based model (EBM) and is optimized via maximum likelihood estimation (MLE), with the help of the GAN's generator to provide synthetic data to approximate the learning gradients.
no code implementations • 20 Mar 2023 • Hang Zhang, Ping Li
From the statistical aspect, we first establish the minimax lower bounds on the sample size $n$ and the \emph{signal-to-noise ratio} ($\mathrm{SNR}$) for the correct recovery of the permutation matrix $\Pi^{*}$ and the support set $\mathrm{supp}(\beta^{*})$; to be more specific, $n \gtrsim k\log p$ and $\log \mathrm{SNR} \gtrsim \log n + \frac{k\log p}{n}$.
no code implementations • 20 Mar 2023 • Yu Cao, Xiangqiao Meng, P. Y. Mok, Xueting Liu, Tong-Yee Lee, Ping Li
Through multiple quantitative metrics evaluated on our dataset and a user study, we demonstrate that AnimeDiffusion outperforms state-of-the-art GAN-based models for anime face line drawing colorization.
no code implementations • 2 Mar 2023 • Acong Zhang, Ping Li, Guanrong Chen
In the semi-supervised setting, where labeled data are severely limited, it remains a major challenge for message-passing graph neural networks (GNNs) to learn feature representations for nodes whose class label is distributed discontinuously over the graph.
no code implementations • 17 Feb 2023 • Acong Zhang, Jincheng Huang, Ping Li, Kai Zhang
Multiple recent studies show a paradox in graph convolutional networks (GCNs), that is, shallow architectures limit the capability of learning information from high-order neighbors, while deep architectures suffer from over-smoothing or over-squashing.
no code implementations • 7 Feb 2023 • Ping Li, Xiaoyun Li
We show that the estimation variance is essentially: $(s-1)A + \frac{D-k}{D-1}\frac{1}{k}\left[ (1-\rho^2)^2 -2A\right]$, where $A\geq 0$ is a function of the data ($u, v$).
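The quoted variance can be evaluated directly; the sketch below is a plain transcription of the formula above, assuming $A$ has already been computed from the data $(u, v)$.

```python
def estimation_variance(s, A, D, k, rho):
    """Evaluate (s-1)*A + (D-k)/(D-1) * (1/k) * [(1-rho^2)^2 - 2A]."""
    return (s - 1) * A + (D - k) / (D - 1) * (1.0 / k) * ((1.0 - rho ** 2) ** 2 - 2.0 * A)
```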
no code implementations • 23 Jan 2023 • Jianwen Xie, Yaxuan Zhu, Yifei Xu, Dingcheng Li, Ping Li
We study a normalizing flow in the latent space of a top-down generator model, in which the normalizing flow model plays the role of the informative prior model of the generator.
no code implementations • 9 Jan 2023 • Xiao-Tong Yuan, Ping Li
The stochastic proximal point (SPP) methods have gained recent attention for stochastic optimization, offering strong convergence guarantees and robustness superior to the classic stochastic gradient descent (SGD) methods at little to no additional computational cost.
no code implementations • 9 Jan 2023 • Lingxiao Wang, Ping Li
We further extend our theory to generalized function approximation and identify conditions for reward randomization to attain provably efficient exploration.
no code implementations • 5 Dec 2022 • Zeke Xie, Qian-Yuan Tang, Zheng He, Mingming Sun, Ping Li
Unfortunately, formal statistical tests for analyzing the structure and heavy tails of stochastic gradients in deep learning are still under-explored.
no code implementations • 25 Nov 2022 • Xiaoyun Li, Ping Li
Moreover, we develop a new analysis of the EF under partial client participation, which is an important scenario in FL.
no code implementations • 25 Nov 2022 • Tan Yu, Ping Li
To bring back the global receptive field, window-based Vision Transformers have devoted considerable effort to achieving cross-window communication by developing several sophisticated operations.
no code implementations • 20 Nov 2022 • Shuo Chen, Tan Yu, Ping Li
Recently, vision architectures based exclusively on multi-layer perceptrons (MLPs) have gained much attention in the computer vision community.
no code implementations • 15 Nov 2022 • Guanhua Fang, Ping Li, Gennady Samorodnitsky
We study an important variant of the stochastic multi-armed bandit (MAB) problem, which takes penalization into consideration.
1 code implementation • 10 Nov 2022 • Liansheng Wang, Jiacheng Wang, Lei Zhu, Huazhu Fu, Ping Li, Gary Cheng, Zhipeng Feng, Shuo Li, Pheng-Ann Heng
Automated detection of lung infections from computed tomography (CT) data plays an important role in combating COVID-19.
no code implementations • 1 Nov 2022 • Khoa Doan, Shulong Tan, Weijie Zhao, Ping Li
Previous learning-to-hash approaches are also not suitable to solve the fast item ranking problem since they can take a significant amount of time and computation to train the hash functions.
1 code implementation • 29 Oct 2022 • Jiayi Yao, Ping Li, Xiatao Kang, Yuzhe Wang
First, we train a sparse model with a GL penalty and impose an angle-dissimilarity constraint on the channels and filters of the convolutional network to obtain a sparser structure.
no code implementations • 28 Oct 2022 • Fengfan Zhou, Hefei Ling, Yuxuan Shi, Jiazhong Chen, Zongyi Li, Ping Li
Though generating hard samples has shown its effectiveness in improving the generalization of models in training tasks, the effectiveness of utilizing this idea to improve the transferability of adversarial face examples remains unexplored.
no code implementations • 27 Oct 2022 • Jun Zhang, Ping Li, Wei Wang
Recent advances in neural networks have been successfully applied to many tasks in online recommendation applications.
no code implementations • 26 Oct 2022 • Weijie Zhao, Shulong Tan, Ping Li
Typically a three-stage mechanism is employed in those systems: (i) a small collection of items is first retrieved by, e.g., approximate near-neighbor search algorithms; (ii) a collection of constraints is then applied to the retrieved items; (iii) a fine-grained ranking neural network is employed to determine the final recommendation.
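A hedged sketch of such a three-stage pipeline; the `index`, `constraints`, and `ranker` interfaces are hypothetical placeholders rather than the system described in the paper.

```python
def recommend(query, index, constraints, ranker, k_retrieve=1000, k_final=10):
    """Three-stage recommendation: (i) approximate retrieval, (ii) constraint
    filtering, (iii) fine-grained neural re-ranking."""
    candidates = index.search(query, k_retrieve)                   # stage (i): ANN retrieval
    feasible = [item for item in candidates
                if all(check(item) for check in constraints)]      # stage (ii): apply constraints
    feasible.sort(key=lambda item: ranker(query, item), reverse=True)  # stage (iii): re-rank
    return feasible[:k_final]
```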
no code implementations • 19 Oct 2022 • Yue Zhang, Hongliang Fei, Dingcheng Li, Tan Yu, Ping Li
In particular, we focus on few-shot image recognition tasks on pretrained vision-language models (PVLMs) and develop a method of prompting through prototype (PTP), where we define $K$ image prototypes and $K$ prompt prototypes.
no code implementations • 18 Oct 2022 • Yue Zhang, Hongliang Fei, Ping Li
Specifically, we build a noise model to estimate the unknown labeling noise distribution over input contexts and noisy type labels.
1 code implementation • Proceedings of the 31st ACM International Conference on Information & Knowledge Management (CIKM) 2022 • Shaogang Ren, Ping Li
A new causal discovery method is introduced to solve the bivariate causal discovery problem.
no code implementations • 17 Oct 2022 • Khoa D. Doan, Yingjie Lao, Ping Li
To achieve this goal, we propose to represent the trigger function as a class-conditional generative model and to inject the backdoor in a constrained optimization framework, where the trigger function learns to generate an optimal trigger pattern to attack any target class at will while simultaneously embedding this generative backdoor into the trained model.
no code implementations • 13 Oct 2022 • Tan Yu, Jun Zhi, Yufei Zhang, Jian Li, Hongliang Fei, Ping Li
In this paper, we formulate APP-installation user embedding learning as a bipartite graph embedding problem.
no code implementations • 9 Oct 2022 • Khoa D. Doan, Jianwen Xie, Yaxuan Zhu, Yang Zhao, Ping Li
Leveraging supervised information can lead to superior retrieval performance in the image hashing domain, but the performance degrades significantly without enough labeled data.
1 code implementation • 27 Sep 2022 • Xiatao Kang, Ping Li, Jiayi Yao, Chengxi Li
Pruning on neural networks before training not only compresses the original models, but also accelerates the network training phase, which has substantial application value.
no code implementations • 26 Sep 2022 • Weijie Zhao, Xuewu Jiao, Xinsheng Luo, Jingxue Li, Belhal Karimi, Ping Li
In this paper, we propose FeatureBox, a novel end-to-end training framework that pipelines the feature extraction and the training on GPU servers to save the intermediate I/O of the feature extraction.
no code implementations • 23 Sep 2022 • Tan Yu, Zhipeng Jin, Jie Liu, Yi Yang, Hongliang Fei, Ping Li
To overcome the limitations of behavior ID features in modeling new ads, we exploit the visual content in ads to boost the performance of CTR prediction models.
no code implementations • 19 Sep 2022 • Tan Yu, Jie Liu, Yi Yang, Yi Li, Hongliang Fei, Ping Li
How to pair the video ads with the user search is the core task of Baidu video advertising.
no code implementations • 29 Aug 2022 • Faysal Hossain Shezan, Yingjie Lao, Minlong Peng, Xin Wang, Mingming Sun, Ping Li
At the core, NL2GDPR is a privacy-centric information extraction model, appended with a GDPR policy finder and a policy generator.
no code implementations • 5 Aug 2022 • Sujay Bhatt, Guanhua Fang, Ping Li, Gennady Samorodnitsky
In this paper, we provide an extension of confidence sequences for settings where the variance of the data-generating distribution does not exist or is infinite.
1 code implementation • 18 Jul 2022 • Ping Li, Weijie Zhao
Although the gain formula in Li (2010) was derived for logistic regression loss, it is a generic formula for loss functions with second-derivatives.
1 code implementation • 18 Jul 2022 • Ping Li, Weijie Zhao
In recent prior studies, the pGMM kernel has been extensively evaluated for classification tasks, for logistic regression, support vector machines, as well as deep neural networks.
no code implementations • 6 Jul 2022 • Shaogang Ren, Belhal Karimi, Dingcheng Li, Ping Li
VFGs learn the representation of high dimensional data via a message-passing scheme by integrating flow-based functions through variational inference.
no code implementations • 5 Jul 2022 • Shaogang Ren, Guanhua Fang, Ping Li
Best subset selection is considered the `gold standard' for many sparse learning problems.
no code implementations • 26 Jun 2022 • Chenglin Fan, Ping Li, Xiaoyun Li
When designing clustering algorithms, the choice of initial centers is crucial for the quality of the learned clusters.
no code implementations • 24 Jun 2022 • Khoa D. Doan, Yingjie Lao, Peng Yang, Ping Li
We first examine the vulnerability of ViTs against various backdoor attacks and find that ViTs are also quite vulnerable to existing attacks.
1 code implementation • 23 Jun 2022 • Tairan Huang, Xu Li, Hao Li, Mingming Sun, Ping Li
Under the setting of off-policy actor-critic algorithms, we demonstrate that the critic can bring expected discounted rewards greater than, or at least equal to, those of the actor.
no code implementations • 22 Jun 2022 • Zhaozhuo Xu, Weijie Zhao, Shulong Tan, Zhixin Zhou, Ping Li
Given a vertex deletion request, we thoroughly investigate solutions to update the connections of the vertex.
no code implementations • 22 Jun 2022 • Yingzhen Yang, Ping Li
Our results provide theoretical guarantee on the correctness of noisy $\ell^{0}$-SSC in terms of SDP on noisy data for the first time, which reveals the advantage of noisy $\ell^{0}$-SSC in terms of much less restrictive condition on subspace affinity.
no code implementations • 13 Jun 2022 • Trung-Kien Le, Ping Li
This article proposes a new method to estimate the world points and projection matrices from their correspondences.
no code implementations • 10 Jun 2022 • Xiao-Tong Yuan, Ping Li
The FedProx algorithm is a simple yet powerful distributed proximal point optimization method widely used for federated learning (FL) over heterogeneous data.
no code implementations • 8 Jun 2022 • Xiao-Tong Yuan, Ping Li
We further substantialize these generic results to stochastic gradient descent (SGD) to derive improved high-probability generalization bounds for convex or non-convex optimization problems with natural time decaying learning rates, which have not been possible to prove with the existing hypothesis stability or uniform stability based results.
1 code implementation • CVPR 2022 • Khoa D. Doan, Peng Yang, Ping Li
However, in the existing deep supervised hashing methods, coding balance and low-quantization error are difficult to achieve and involve several losses.
no code implementations • 23 May 2022 • Jincheng Huang, Ping Li, Rui Huang, Chen Na, Acong Zhang
Alternatively, it is possible to exploit information about the presence of heterophilous neighbors for feature learning, so a hybrid message-passing approach is devised to aggregate homophilous neighbors and diversify heterophilous neighbors based on edge classification.
1 code implementation • 22 May 2022 • Ping Li, Weijie Zhao
Our framework has parameters $(s, g, w)$.
no code implementations • 19 May 2022 • Shuo Yang, Zeke Xie, Hanyu Peng, Min Xu, Mingming Sun, Ping Li
To answer these, we propose dataset pruning, an optimization-based sample selection method that can (1) examine the influence of removing a particular set of training samples on the model's generalization ability with theoretical guarantees, and (2) construct the smallest subset of training data that yields a strictly constrained generalization gap.
no code implementations • ICLR 2022 • Jianwen Xie, Yaxuan Zhu, Jun Li, Ping Li
Under the short-run non-mixing MCMC scenario, the estimation of the energy-based model is shown to follow the perturbation of maximum likelihood, and the short-run Langevin flow and the normalizing flow form a two-flow generator that we call CoopFlow.
no code implementations • ICLR 2022 • Xiaoyun Li, Belhal Karimi, Ping Li
We study COMP-AMS, a distributed optimization framework based on gradient averaging and adaptive AMSGrad algorithm.
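For context, the local adaptive update that AMSGrad performs (and that gradient-averaging frameworks such as COMP-AMS build on) is sketched below; this is the standard single-machine AMSGrad step, not the COMP-AMS algorithm itself, and all names and defaults are illustrative.

```python
import numpy as np

def amsgrad_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad update. `state` holds m, v, v_hat, each initialized to zeros_like(theta)."""
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad        # first-moment estimate
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2   # second-moment estimate
    state["v_hat"] = np.maximum(state["v_hat"], state["v"])     # AMSGrad's running max correction
    return theta - lr * state["m"] / (np.sqrt(state["v_hat"]) + eps)
```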
no code implementations • 3 May 2022 • Trung-Kien Le, Ping Li
To our knowledge, there are no theoretical results for multi-view correspondences prior to this paper.
no code implementations • 21 Apr 2022 • Jinxing Yu, Yunfeng Cai, Mingming Sun, Ping Li
Translation distance based knowledge graph embedding (KGE) methods, such as TransE and RotatE, model the relation in knowledge graphs as translation or rotation in the vector space.
1 code implementation • 19 Apr 2022 • Jing Zhang, Jianwen Xie, Nick Barnes, Ping Li
With the generative saliency model, we can obtain a pixel-wise uncertainty map from an image, indicating model confidence in the saliency prediction.
no code implementations • 18 Mar 2022 • Belhal Karimi, Ping Li
We motivate the choice of a double dynamic by invoking the variance reduction virtue of each stage of the method on both sources of noise: the index sampling for the incremental update and the MC approximation.
no code implementations • 17 Mar 2022 • Xiao-Tong Yuan, Ping Li
In this paper, we analyze the generalization performance of the Iterative Hard Thresholding (IHT) algorithm widely used for sparse recovery problems.
1 code implementation • CVPR 2022 • Xiaoguang Li, Qing Guo, Di Lin, Ping Li, Wei Feng, Song Wang
As a result, the final method takes advantage of effective semantic & image-level filling for high-fidelity inpainting.
no code implementations • 19 Feb 2022 • Minlong Peng, Zidi Xiong, Mingming Sun, Ping Li
In order to achieve a high attack success rate using as few poisoned training samples as possible, most existing attack methods change the labels of the poisoned samples to the target class.
1 code implementation • 31 Jan 2022 • Tan Yu, Gangming Zhao, Ping Li, Yizhou Yu
To improve efficiency, recent Vision Transformers adopt local self-attention mechanisms, where self-attention is computed within local windows.
no code implementations • 31 Jan 2022 • Zeke Xie, Qian-Yuan Tang, Yunfeng Cai, Mingming Sun, Ping Li
It is well known that the Hessian of the deep loss landscape matters to optimization, generalization, and even robustness of deep learning.
no code implementations • 7 Jan 2022 • Ping Li, Weijie Zhao
For example, one can apply GCWS on the outputs of the last layer to boost the accuracy of trained deep neural networks.
no code implementations • 5 Jan 2022 • Weijie Zhao, Xuewu Jiao, Mingqing Hu, Xiaoyun Li, Xiangyu Zhang, Ping Li
In this paper, we propose a hardware-aware training workflow that couples the hardware topology into the algorithm design.
1 code implementation • 1 Jan 2022 • Xiaoqiang Wang, Lei Zhu, Siliang Tang, Huazhu Fu, Ping Li, Fei Wu, Yi Yang, Yueting Zhuang
The depth estimation branch is trained with RGB-D images and then used to estimate the pseudo depth maps for all unlabeled RGB images to form the paired data.
no code implementations • NeurIPS 2021 • Jing Zhang, Jianwen Xie, Nick Barnes, Ping Li
In this paper, we take a step further by proposing a novel generative vision transformer with latent variables following an informative energy-based prior for salient object detection.
1 code implementation • ICDM 21 2021 • Shaogang Ren, Haiyan Yin, Mingming Sun, Ping Li
Then we formulate a novel evaluation metric to infer the scores for each potential causal direction based on the variance of the conditional density estimation.
no code implementations • NeurIPS 2021 • Haiyan Yin, Peng Yang, Ping Li
Though recent studies have achieved remarkable progress in improving online continual learning performance with deep neural network-based models, many of today's approaches still suffer considerably from catastrophic forgetting, a persistent challenge for continual learning.
no code implementations • NeurIPS 2021 • Zhixin Zhou, Fan Zhou, Ping Li, Cun-Hui Zhang
We show that the performance of estimating the connectivity matrix $M$ depends on the sparsity of the graph.
no code implementations • NeurIPS 2021 • Zhiqiang Xu, Ping Li
We further give the first worst-case analysis that achieves a rate of convergence at $O(\frac{1}{\epsilon}\log\frac{1}{\epsilon})$.
no code implementations • NeurIPS 2021 • Yunfeng Cai, Guanhua Fang, Ping Li
The sparse generalized eigenvalue problem (SGEP) aims to find the leading eigenvector with sparsity structure.
no code implementations • NeurIPS 2021 • Khoa Doan, Yingjie Lao, Ping Li
Many existing countermeasures found that backdoors tend to leave tangible footprints in the latent or feature space, which can be utilized to mitigate backdoor attacks. In this paper, we extend the concept of the imperceptible backdoor from the input space to the latent representation, which significantly improves effectiveness against existing defense mechanisms, especially those relying on the distinguishability between clean and backdoor inputs in the latent space.
no code implementations • 18 Nov 2021 • Xiaoyun Li, Ping Li
Note that C-MinHash is different from the well-known work on "One Permutation Hashing (OPH)" published in NIPS'12.
no code implementations • 25 Oct 2021 • Shuo Chen, Tan Yu, Ping Li
Nevertheless, multi-view CNN models cannot model the communication between patches from different views, limiting their effectiveness in 3D object recognition.
no code implementations • 1 Oct 2021 • Belhal Karimi, Ping Li, Xiaoyun Li
In the emerging paradigm of Federated Learning (FL), a large number of clients such as mobile devices are used to train possibly high-dimensional models on their respective data.
no code implementations • 29 Sep 2021 • Xu Li, Yunfeng Cai, Mingming Sun, Ping Li
Discovering the causal relationship via recovering the directed acyclic graph (DAG) structure from the observed data is a challenging combinatorial problem.
no code implementations • 29 Sep 2021 • Xiaotong Yuan, Ping Li
We further substantialize these generic results to SGD to derive improved high probability generalization bounds for convex or non-convex optimization with natural time decaying learning rates, which have not been possible to prove with the existing uniform stability results.
no code implementations • 29 Sep 2021 • Guanhua Fang, Ping Li, Gennady Samorodnitsky
Under such a framework, we propose a hard-threshold UCB-like algorithm, which enjoys many merits including asymptotic fairness, nearly optimal regret, and a better tradeoff between reward and fairness.
no code implementations • 29 Sep 2021 • Chenglin Fan, Ping Li, Xiaoyun Li
Our method, named the HST initialization, can also be easily extended to the setting of differential privacy (DP) to generate private initial centers.
no code implementations • 29 Sep 2021 • Xiaoyun Li, Ping Li
We show the locality-sensitivity of SignRFF, and propose a new measure, called ranking efficiency, to theoretically compare different Locality-Sensitive Hashing (LSH) methods with practical implications.
no code implementations • ICLR 2022 • Tan Yu, Jun Li, Yunfeng Cai, Ping Li
A convolution layer with an orthogonal Jacobian matrix is 1-Lipschitz in the 2-norm, making the output robust to the perturbation in input.
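A one-line justification of the 1-Lipschitz claim (a standard argument, not quoted from the paper): for an orthogonal Jacobian $J$,

$$
\|f(x_1)-f(x_2)\|_2 \;\le\; \sup_{x}\|J(x)\|_2\,\|x_1-x_2\|_2 \;=\; \|x_1-x_2\|_2,
\qquad \text{since } J^\top J = I \;\Rightarrow\; \|J\|_2 = 1 .
$$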
no code implementations • 29 Sep 2021 • Weiguo Pian, Hanyu Peng, Mingming Sun, Ping Li
In this paper, we work on a seamless marriage of imbalanced regression and self-supervised learning.
no code implementations • 29 Sep 2021 • Zhiqi Bu, Ping Li, Weijie Zhao
In this work, we propose the practical adversarial training with differential privacy (DP-Adv), to combine the backbones from both communities and deliver robust and private models with high accuracy.
no code implementations • ICLR 2022 • Hanyu Peng, Mingming Sun, Ping Li
The long-tailed recognition problem is a burning issue that has attracted considerable attention recently.
Ranked #32 on Long-tail Learning on CIFAR-100-LT (ρ=100)
no code implementations • 29 Sep 2021 • Jun Li, Ping Li
In this paper, we propose an $f$-divergence Thermodynamic Variational Objective ($f$-TVO).
no code implementations • 29 Sep 2021 • Nanqing Dong, Jianwen Xie, Ping Li
We present a simple yet robust noise synthesis framework based on unsupervised contrastive learning.
no code implementations • 29 Sep 2021 • Zhuozhuo Tu, Zhiqiang Xu, Tairan Huang, DaCheng Tao, Ping Li
Federated Learning is a machine learning technique where a network of clients collaborates with a server to learn a centralized model while keeping data localized.
no code implementations • 29 Sep 2021 • Xiaoyun Li, Ping Li
Minwise hashing (MinHash) is an important and practical algorithm for generating random hashes to approximate the Jaccard (resemblance) similarity in massive binary (0/1) data.
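For readers unfamiliar with MinHash, a minimal sketch of the classical estimator with $K$ independent hash functions standing in for random permutations; the collision rate of two signatures is an unbiased estimate of the Jaccard similarity. Names and constants here are illustrative, not from the paper.

```python
import random

def minhash_signature(items, num_hashes=128, prime=2_147_483_647, seed=0):
    """K independent random hash functions h(x) = (a*x + b) mod prime over a set of
    non-negative integer ids; the minimum plays the role of the first element under
    a random permutation."""
    rng = random.Random(seed)
    params = [(rng.randrange(1, prime), rng.randrange(0, prime)) for _ in range(num_hashes)]
    return [min((a * x + b) % prime for x in items) for a, b in params]

def jaccard_estimate(sig_u, sig_v):
    """The fraction of matching positions estimates the Jaccard similarity."""
    return sum(a == b for a, b in zip(sig_u, sig_v)) / len(sig_u)
```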
no code implementations • ICLR 2022 • Yingzhen Yang, Ping Li
Similarity-based clustering methods separate data into clusters according to the pairwise similarity between the data, and the pairwise similarity is crucial for their performance.
no code implementations • 10 Sep 2021 • Xiangyi Chen, Xiaoyun Li, Ping Li
While adaptive gradient methods have been proven effective for training neural nets, the study of adaptive gradient methods in federated learning is scarce.
no code implementations • 10 Sep 2021 • Xiaoyun Li, Ping Li
That is, one single permutation is used for both the initial pre-processing step to break the structures in the data and the circulant hashing step to generate $K$ hashes.
no code implementations • 9 Sep 2021 • Sujay Bhatt, Ping Li, Gennady Samorodnitsky
We consider a multi-armed bandit problem motivated by situations where only the extreme values, as opposed to expected values in the classical bandit setting, are of interest.
no code implementations • 7 Sep 2021 • Xiangyi Chen, Belhal Karimi, Weijie Zhao, Ping Li
Adaptive gradient methods including Adam, AdaGrad, and their variants have been very successful for training deep learning models, such as neural networks.
no code implementations • 7 Sep 2021 • Xiaoyun Li, Ping Li
Unlike classical MinHash, these $K$ hashes are obviously correlated, but we are able to provide rigorous proofs that we still obtain an unbiased estimate of the Jaccard similarity and the theoretical variance is uniformly smaller than that of the classical MinHash with $K$ independent permutations.
no code implementations • 16 Aug 2021 • Mostafa Rahmani, Rasoul Shafipour, Ping Li
The proposed approach is used to design several novel global feature aggregation methods based on the choice of the LFDS.
no code implementations • 16 Aug 2021 • Weiwei Li, Mostafa Rahmani, Ping Li
It is shown that in contrast to most of the existing methods which require the subspaces to be sufficiently incoherent with each other, Innovation Pursuit only requires the innovative components of the subspaces to be sufficiently incoherent with each other.
2 code implementations • 2 Aug 2021 • Tan Yu, Xu Li, Yunfeng Cai, Mingming Sun, Ping Li
More recently, using smaller patches with a pyramid structure, Vision Permutator (ViP) and Global Filter Network (GFNet) achieve better performance than S$^2$-MLP.
no code implementations • 28 Jun 2021 • Tan Yu, Xu Li, Yunfeng Cai, Mingming Sun, Ping Li
By introducing the inductive bias from image processing, the convolutional neural network (CNN) has achieved excellent performance in numerous computer vision tasks and has been established as the \emph{de facto} backbone.
1 code implementation • 24 Jun 2021 • Johann Li, Guangming Zhu, Cong Hua, Mingtao Feng, Basheer Bennamoun, Ping Li, Xiaoyuan Lu, Juan Song, Peiyi Shen, Xu Xu, Lin Mei, Liang Zhang, Syed Afaq Ali Shah, Mohammed Bennamoun
Thus, as comprehensive as possible, this paper provides a collection of medical image datasets with their associated challenges for deep learning research.
no code implementations • 23 Jun 2021 • Mostafa Rahmani, Ping Li
In the application of Innovation Search for outlier detection, the directions of innovation were utilized to measure the innovation of the data points.
no code implementations • CVPR 2021 • Dongsheng An, Jianwen Xie, Ping Li
Learning latent variable models with deep top-down architectures typically requires inferring the latent variables for each training example based on the posterior distribution of these latent variables.
no code implementations • CVPR 2021 • Zilong Zheng, Jianwen Xie, Ping Li
Exploiting internal statistics of a single natural image has long been recognized as a significant research paradigm where the goal is to learn the distribution of patches within the image without relying on external training data.
1 code implementation • CVPR 2021 • Gang Fu, Qing Zhang, Lei Zhu, Ping Li, Chunxia Xiao
Specular highlight detection and removal are fundamental and challenging tasks.
1 code implementation • 14 Jun 2021 • Tan Yu, Xu Li, Yunfeng Cai, Mingming Sun, Ping Li
We discover that the token-mixing MLP is a variant of depthwise convolution with a global receptive field and spatial-specific configuration.
no code implementations • NAACL 2021 • Hongliang Fei, Tan Yu, Ping Li
Recent pretrained vision-language models have achieved impressive performance on cross-modal retrieval tasks in English.
no code implementations • 1 Apr 2021 • Jingfeng Lu, Shuo Wang, Ping Li, Dong Ye
Low-dose computed tomography (CT) allows the reduction of radiation risk in clinical applications at the expense of image quality, which deteriorates the diagnosis accuracy of radiologists.
no code implementations • 25 Feb 2021 • Xiaoyun Li, Ping Li
Closely related to RP, the method of random Fourier features (RFF) has also become popular, for approximating the Gaussian kernel.
no code implementations • 16 Feb 2021 • Mengzhi Wu, Qian Liu, Ping Li, Shi Chen, Binlong Wang, Wenhan Shen, Shiping Chen, Yangheng Zheng, Yigang Xie, Jin Li
The IBF and the electron transparency rate are two essential indicators of a TPC, which affect the energy resolution and the counting rate, respectively.
no code implementations • 1 Jan 2021 • Xiangyi Chen, Belhal Karimi, Weijie Zhao, Ping Li
Specifically, we propose a general algorithmic framework that can convert existing adaptive gradient methods to their decentralized counterparts.
no code implementations • ICCV 2021 • Qinghao Ye, Xiyue Shen, Yuan Gao, ZiRui Wang, Qi Bi, Ping Li, Guang Yang
Video highlight detection plays an increasingly important role in social media content filtering; however, it remains highly challenging to develop automated video highlight detection methods because of the lack of temporal annotations (i.e., where the highlight moments are in long videos) for supervised learning.
2 code implementations • ICCV 2021 • Khoa Doan, Yingjie Lao, Weijie Zhao, Ping Li
Under this optimization framework, the trigger generator function will learn to manipulate the input with imperceptible noise to preserve the model performance on the clean data and maximize the attack success rate on the poisoned data.
no code implementations • ICCV 2021 • Peng Yang, Yingjie Lao, Ping Li
Deep neural networks (DNNs) have become state-of-the-art in many application domains.
no code implementations • ICLR 2021 • Yang Zhao, Jianwen Xie, Ping Li
Energy-based models (EBMs) for generative modeling parametrize a single net and can be directly trained by maximum likelihood estimation.
no code implementations • 1 Jan 2021 • Tan Yu, Hongliang Fei, Ping Li
Inspired by the great success of BERT in NLP tasks, many text-vision BERT models emerged recently.
no code implementations • 1 Jan 2021 • Belhal Karimi, Hoi To Wai, Eric Moulines, Ping Li
Many constrained, nonconvex and nonsmooth optimization problems can be tackled using the majorization-minimization (MM) method which alternates between constructing a surrogate function which upper bounds the objective function, and then minimizing this surrogate.
no code implementations • 29 Dec 2020 • Jianwen Xie, Zilong Zheng, Ping Li
In this paper, we propose to learn a variational auto-encoder (VAE) to initialize the finite-step MCMC, such as Langevin dynamics that is derived from the energy function, for efficient amortized sampling of the EBM.
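A minimal PyTorch-style sketch of the finite-step Langevin sampler referred to above; the initial point would come from the learned VAE (here it is simply an argument), and the step size and step count are illustrative.

```python
import torch

def short_run_langevin(energy_fn, x_init, n_steps=20, step_size=0.01):
    """Finite-step Langevin dynamics on an energy function, starting from x_init."""
    x = x_init.detach().clone().requires_grad_(True)
    for _ in range(n_steps):
        energy = energy_fn(x).sum()
        grad = torch.autograd.grad(energy, x)[0]
        with torch.no_grad():
            # Langevin update: gradient step on the energy plus Gaussian noise
            x = x - 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(x)
        x.requires_grad_(True)
    return x.detach()
```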
no code implementations • NeurIPS 2020 • Yingxue Zhou, Belhal Karimi, Jinxing Yu, Zhiqiang Xu, Ping Li
Adaptive gradient methods such as AdaGrad, RMSprop and Adam have been optimizers of choice for deep learning due to their fast training speed.
1 code implementation • NeurIPS 2020 • Dingguo Shen, Yuanfeng Ji, Ping Li, Yi Wang, Di Lin
In contrast to the previous methods, RANet configures the information pathways between the pixels in different regions, enabling the region interaction to exchange the regional context for enhancing all of the pixels in the image.
no code implementations • NeurIPS 2020 • Yi Hao, Ping Li
Based on a sample of size $n$, we consider estimating the number of symbols that appear at least $\mu$ times in an independent sample of size $a \cdot n$, where $a$ is a given parameter.
1 code implementation • NeurIPS 2020 • Shaogang Ren, Weijie Zhao, Ping Li
L1 regularization has been broadly employed to pursue model sparsity.
no code implementations • NeurIPS 2020 • Hexuan Liu, Yunfeng Cai, You-Lin Chen, Ping Li
We reformulate the Wasserstein Discriminant Analysis (WDA) as a ratio trace problem and present an eigensolver-based algorithm to compute the discriminative subspace of WDA.
no code implementations • AABI Symposium 2021 • Belhal Karimi, Ping Li
Bayesian neural networks attempt to combine the strong predictive performance of neural networks with formal quantification of uncertainty of the predicted output in the Bayesian framework.
no code implementations • 2 Nov 2020 • Yunfeng Cai, Ping Li
This paper considers the identification problem for BJBDP, that is, under what conditions and by what means, we can identify the diagonalizer $A$ and the block diagonal structure of $\Sigma_i$, especially when there is noise in $C_i$'s.
no code implementations • 29 Oct 2020 • Yunfeng Cai, Ping Li
In this paper, we propose to solve LRTC via tensor networks with a Tucker wrapper.
no code implementations • 23 Sep 2020 • Ping Li, Qinghao Ye, Luming Zhang, Li Yuan, Xianghua Xu, Ling Shao
In this paper, we propose an efficient convolutional neural network architecture for video SUMmarization via Global Diverse Attention, called SUM-GDA, which adapts the attention mechanism to a global perspective to consider pairwise temporal relations of video frames.
no code implementations • 18 Sep 2020 • Ruoxin Chen, Jie Li, Chentao Wu, Bin Sheng, Ping Li
Random selection based defenses can achieve certified robustness by averaging the classifiers' predictions on the sub-datasets sampled from the training set.
no code implementations • 27 Aug 2020 • Jerry Chee, Ping Li
We construct a statistical diagnostic test for convergence to the stationary phase using the inner product between successive gradients and demonstrate that the proposed diagnostic works well.
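In that spirit, a hedged sketch of such a diagnostic: average the inner products of successive stochastic gradients and flag the stationary phase once the average is no longer positive. The paper's actual test statistic and threshold may differ.

```python
import numpy as np

def stationarity_diagnostic(grads, window=100):
    """Average <g_t, g_{t+1}> over the last `window` pairs of stochastic gradients;
    a value near or below zero suggests the iterates have reached the stationary phase."""
    inner = [float(np.dot(g, g_next)) for g, g_next in zip(grads[:-1], grads[1:])]
    recent = inner[-window:]
    avg = float(np.mean(recent))
    return avg, avg <= 0.0
```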
no code implementations • 11 Aug 2020 • Farzin Haddadpour, Belhal Karimi, Ping Li, Xiaoyun Li
Communication complexity and privacy are the two key challenges in Federated Learning, where the goal is to perform distributed learning across a large number of devices.
no code implementations • 12 Jul 2020 • Liang Zhang, Johann Li, Ping Li, Xiaoyuan Lu, Peiyi Shen, Guangming Zhu, Syed Afaq Shah, Mohammed Bennamoun, Kun Qian, Björn W. Schuller
To the best of our knowledge, MeDaS is the first open-source platform providing a collaborative and interactive service for researchers from a medical background to easily use DL-related toolkits, and at the same time for scientists or engineers from the information sciences to understand the medical knowledge side.
no code implementations • ACL 2020 • Jingyuan Zhang, Mingming Sun, Yue Feng, Ping Li
Compared to the state-of-the-art methods, the learned network structures help improve the identification of concepts for entities based on the relations of entities in both datasets.
no code implementations • ACL 2020 • Hongliang Fei, Ping Li
Recent neural network models have achieved impressive performance on sentiment classification in English as well as other languages.
no code implementations • 3 Jun 2020 • Ping Li, Ming Chen, Yijie Mao, Zhaohui Yang, Bruno Clerckx, Mohammad Shikh-Bahaei
In this paper, we employ Cooperative Rate-Splitting (CRS) technique to enhance the Secrecy Sum Rate (SSR) for the Multiple Input Single Output (MISO) Broadcast Channel (BC), consisting of two legitimate users and one eavesdropper, with perfect Channel State Information (CSI) available at all nodes.
1 code implementation • 20 Apr 2020 • Shaogang Ren, Dingcheng Li, Zhixin Zhou, Ping Li
The thriving of deep models and generative models provides approaches to model high dimensional distributions.
no code implementations • 2 Apr 2020 • Xiaoyun Li, Jie Gui, Ping Li
In this paper, we propose the kernel version of multi-view discriminant analysis, called kernel multi-view discriminant analysis (KMvDA).
no code implementations • 2 Apr 2020 • Peng Yang, Ping Li
Conventional online multi-task learning algorithms suffer from two critical limitations: 1) heavy communication caused by delivering high-velocity sequential data to a central machine; 2) expensive runtime complexity for building task relatedness.
no code implementations • 2 Apr 2020 • Xiaoyun Li, Chengxi Wu, Ping Li
Feature selection is an important tool to deal with high dimensional data.
no code implementations • 24 Mar 2020 • Yunfeng Cai, Ping Li
We consider the problem of robust matrix completion, which aims to recover a low rank matrix $L_*$ and a sparse matrix $S_*$ from incomplete observations of their sum $M=L_*+S_*\in\mathbb{R}^{m\times n}$.
no code implementations • 24 Mar 2020 • Yunfeng Cai, Ping Li
Particularly, a new truncation strategy is proposed, which is able to find the support set of the leading eigenvector effectively.
1 code implementation • 12 Mar 2020 • Weijie Zhao, Deping Xie, Ronglai Jia, Yulei Qian, Ruiquan Ding, Mingming Sun, Ping Li
For example, a sponsored online advertising system can contain more than $10^{11}$ sparse features, making the neural network a massive model with around 10 TB parameters.
no code implementations • 12 Mar 2020 • Haiyan Yin, Dingcheng Li, Xu Li, Ping Li
To this end, we introduce a cooperative training paradigm, where a language model is cooperatively trained with the generator and we utilize the language model to efficiently shape the data distribution of the generator against mode collapse.
no code implementations • 4 Feb 2020 • Hefei Ling, Yangyang Qin, Li Zhang, Yuxuan Shi, Ping Li
It is well known that attention mechanisms can effectively improve the performance of many CNNs including object detectors.
1 code implementation • 30 Jan 2020 • Liang Zhang, Xudong Wang, Hongsheng Li, Guangming Zhu, Peiyi Shen, Ping Li, Xiaoyuan Lu, Syed Afaq Ali Shah, Mohammed Bennamoun
To solve these problems mentioned above, we propose a novel graph self-adaptive pooling method with the following objectives: (1) to construct a reasonable pooled graph topology, structure and feature information of the graph are considered simultaneously, which provide additional veracity and objectivity in node selection; and (2) to make the pooled nodes contain sufficiently effective graph information, node feature information is aggregated before discarding the unimportant nodes; thus, the selected nodes contain information from neighbor nodes, which can enhance the use of features of the unselected nodes.
no code implementations • 30 Dec 2019 • Mostafa Rahmani, Ping Li
In this paper, we present a new discovery that the directions of innovation can be used to design a provable and strong robust (to outlier) PCA method.
no code implementations • NeurIPS 2019 • Xiaoyun Li, Ping Li
In this paper, we consider the learning problem where the projected data is further compressed by scalar quantization, which is called quantized compressive learning.
no code implementations • NeurIPS 2019 • Mostafa Rahmani, Ping Li
In other words, an outlier carries some innovation with respect to most of the other data points.
no code implementations • NeurIPS 2019 • Ping Li, Xiaoyun Li, Cun-Hui Zhang
Jaccard similarity is widely used as a distance measure in many machine learning and search applications.
no code implementations • NeurIPS 2019 • Zhiqiang Xu, Ping Li
To promote the practical use of ALS for CCA, we propose truly alternating least-squares.
no code implementations • NeurIPS 2019 • Xiaoyun Li, Ping Li
The method of random projection has been a popular tool for data compression, similarity search, and machine learning.
no code implementations • NeurIPS 2019 • Zhixin Zhou, Shulong Tan, Zhaozhuo Xu, Ping Li
We present a fast search-on-graph algorithm for Maximum Inner Product Search (MIPS).
no code implementations • 5 Nov 2019 • Fan Zhou, Ping Li
Let $\mathbf{x}_j = \boldsymbol{\theta} + \boldsymbol{\epsilon}_j$, $j=1,\dots, n$, be i.i.d.
no code implementations • IJCNLP 2019 • Miao Fan, Chao Feng, Mingming Sun, Ping Li
Given a product, a selector (agent) learns from both the keys in the product metadata and one of its reviews to take an action that selects the correct value, and a successive predictor (network) makes the free-text review attend to this value to obtain better neural representations for helpfulness assessment.
no code implementations • IJCNLP 2019 • Shulong Tan, Zhixin Zhou, Zhaozhuo Xu, Ping Li
Retrieval of relevant vectors produced by representation learning critically influences the efficiency in natural language processing (NLP) tasks.
no code implementations • 3 Oct 2019 • Mostafa Rahmani, Ping Li
The proposed approach leverages a spatial representation of the graph which makes the neural network aware of the differences between the nodes and also their locations in the graph.
no code implementations • 25 Sep 2019 • Jun-Kun Wang, Xiaoyun Li, Ping Li
Perhaps the only methods that enjoy convergence guarantees are the ones that sample the perturbed points uniformly from a unit sphere or from a multivariate Gaussian distribution with an isotropic covariance.
no code implementations • 5 Sep 2019 • Hang Zhang, Martin Slawski, Ping Li
For the case in which both the signal and permutation are unknown, the problem is reformulated as a bi-convex optimization problem with an auxiliary variable, which can be solved by the Alternating Direction Method of Multipliers (ADMM).
no code implementations • 23 Aug 2019 • Weichen Dai, Yu Zhang, Donglei Sun, Naira Hovakimyan, Ping Li
Moreover, the proposed method can also provide a metric 3D reconstruction in semi-dense density with multi-spectral information, which is not available from existing multi-spectral methods.
no code implementations • 6 Aug 2019 • Xiao-Tong Yuan, Ping Li
We first introduce a simple variant of DANE equipped with backtracking line search, for which global asymptotic convergence and sharper local non-asymptotic convergence rate guarantees can be proved for both quadratic and non-quadratic strongly convex functions.
no code implementations • 16 Jul 2019 • Martin Slawski, Emanuel Ben-David, Ping Li
A tacit assumption in linear regression is that (response, predictor)-pairs correspond to identical observational units.
no code implementations • ACL 2019 • Hongliang Fei, Xu Li, Dingcheng Li, Ping Li
Recent neural network models have significantly advanced the task of coreference resolution.
Ranked #13 on Coreference Resolution on CoNLL 2012
no code implementations • NAACL 2019 • Dingcheng Li, Siamak Zamani, Jingyuan Zhang, Ping Li
Leveraging domain knowledge is an effective strategy for enhancing the quality of inferred low-dimensional representations of documents by topic models.
no code implementations • ICLR 2019 • Mostafa Rahmani, Ping Li
In the second step, the GNN is applied to the point-cloud representation of the graph provided by the embedding method.
no code implementations • ICLR 2019 • Ping Li, Phan-Minh Nguyen
We study the behavior of weight-tied multilayer vanilla autoencoders under the assumption of random weights.
no code implementations • ICLR 2019 • Jun-Kun Wang, Xiaoyun Li, Ping Li
We consider new variants of optimization algorithms.
no code implementations • 29 Apr 2019 • Mingming Sun, Xu Li, Xin Wang, Miao Fan, Yue Feng, Ping Li
In this paper, we consider the problem of open information extraction (OIE) for extracting entity and relation level intermediate structures from sentences in open-domain.
no code implementations • 17 Apr 2019 • Li Yuan, Francis EH Tay, Ping Li, Li Zhou, Jiashi Feng
The evaluator defines a learnable information preserving metric between original video and summary video and "supervises" the selector to identify the most informative frames to form the summary video.
Ranked #6 on Unsupervised Video Summarization on TvSum
no code implementations • ICLR 2020 • Jun-Kun Wang, Xiaoyun Li, Belhal Karimi, Ping Li
We propose a new variant of AMSGrad, a popular adaptive gradient based optimization algorithm widely used for training deep neural networks.
no code implementations • 8 Nov 2018 • Weichen Dai, Yu Zhang, Ping Li, Zheng Fang, Sebastian Scherer
This method utilizes the correlation between map points to separate points that are part of the static scene and points that are part of different moving objects into different groups.
no code implementations • EMNLP 2018 • Mingming Sun, Xu Li, Ping Li
We propose the task of Open-Domain Information Narration (OIN) as the reverse task of Open Information Extraction (OIE), to implement the dual structure between language and knowledge in the open domain.
no code implementations • 27 Sep 2018 • Shulong Tan, Zhixin Zhou, Zhaozhuo Xu, Ping Li
As Approximate Nearest Neighbor Search (ANNS) techniques have specifications on metric distances, efficient searching by advanced measures is still an open question.
no code implementations • ICML 2018 • Jing Wang, Jie Shen, Ping Li
As a remedy, online feature selection has attracted increasing attention in recent years.
no code implementations • 8 May 2018 • Ping Li
In this study, we propose a series of "tunable GMM kernels" which are simple and perform largely comparably to tree methods on the same datasets.
no code implementations • 26 Apr 2018 • Ping Li
At high similarity ($\rho\rightarrow1$), the asymptotic variance of the recommended estimator is only $\frac{4}{3\pi} \approx 0.4$ of that of the estimator for sign-sign projections.
no code implementations • NeurIPS 2017 • Jie Shen, Ping Li
In machine learning and compressed sensing, it is of central importance to understand when a tractable algorithm recovers the support of a sparse signal from its compressed measurements.
no code implementations • NeurIPS 2017 • Ping Li, Martin Slawski
Random projections have been increasingly adopted for a diverse set of tasks in machine learning involving dimensionality reduction.
no code implementations • 27 Oct 2017 • Ping Li, Tingyan Duan, Yongfeng Cao
Image matting is an important vision problem.
no code implementations • ICML 2017 • Jie Shen, Ping Li
Recovering the support of a sparse signal from its compressed samples has been one of the most important problems in high dimensional statistics.
no code implementations • 5 Mar 2017 • Yongwei Nie, Xu Cao, Chengjiang Long, Ping Li, Guiqing Li
Current face alignment algorithms can robustly find a set of landmarks along face contour.
no code implementations • 9 Jan 2017 • Ping Li
The linearized GMM kernel was extensively compared with the linearized radial basis function (RBF) kernel.
no code implementations • 29 Dec 2016 • Ping Li
Following the very recent line of work on the ``generalized min-max'' (GMM) kernel, this study proposes the ``generalized intersection'' (GInt) kernel and the related ``normalized generalized min-max'' (NGMM) kernel.
no code implementations • NeurIPS 2016 • Xiaotong Yuan, Ping Li, Tong Zhang
In this paper, we bridge this gap by showing, for the first time, that exact recovery of the global sparse minimizer is possible for HTP-style methods under restricted strong condition number bounding conditions.
no code implementations • NeurIPS 2016 • Xiaotong Yuan, Ping Li, Tong Zhang, Qingshan Liu, Guangcan Liu
We investigate a subclass of exponential family graphical models of which the sufficient statistics are defined by arbitrary additive forms.
no code implementations • NeurIPS 2016 • Ping Li, Michael Mitzenmacher, Martin Slawski
Random projections constitute a simple, yet effective technique for dimensionality reduction with applications in learning and search problems.
no code implementations • 16 Nov 2016 • Ping Li, Jiajun Bu, Chun Chen, Zhanying He, Deng Cai
In this study, we focus on improving the co-clustering performance via manifold ensemble learning, which is able to maximally approximate the intrinsic manifolds of both the sample and feature spaces.
no code implementations • 15 Nov 2016 • Ping Li, Jun Yu, Meng Wang, Luming Zhang, Deng Cai, Xuelong Li
To achieve this goal, we cast the problem into a constrained rank minimization framework by adopting the least squares regularization.
no code implementations • 1 Aug 2016 • Ping Li, Cun-Hui Zhang