1 code implementation • ACL 2022 • Rongzhi Zhang, Yue Yu, Pranav Shetty, Le Song, Chao Zhang
Weakly-supervised learning (WSL) has shown promising results in addressing label scarcity on many NLP tasks, but manually designing a comprehensive, high-quality labeling rule set is tedious and difficult.
no code implementations • ICML 2020 • Shuang Li, Lu Wang, Ruizhi Zhang, Xiaofu Chang, Xuqin Liu, Yao Xie, Yuan Qi, Le Song
We propose a modeling framework for event data, which excels in small data regime with the ability to incorporate domain knowledge.
1 code implementation • 8 Jun 2024 • Bo Chen, Zhilei Bei, Xingyi Cheng, Pan Li, Jie Tang, Le Song
Multiple Sequence Alignment (MSA) plays a pivotal role in unveiling the evolutionary trajectories of protein families.
no code implementations • 6 Feb 2024 • Qing Li, Zhihang Hu, YiXuan Wang, Lei Li, Yimin Fan, Irwin King, Le Song, Yu Li
Central to our focus is the application of FMs to specific biological problems, aiming to guide the research community in choosing appropriate FMs for their research needs.
no code implementations • 11 Jan 2024 • Bo Chen, Xingyi Cheng, Pan Li, Yangli-ao Geng, Jing Gong, Shen Li, Zhilei Bei, Xu Tan, Boyan Wang, Xin Zeng, Chiming Liu, Aohan Zeng, Yuxiao Dong, Jie Tang, Le Song
We propose a unified protein language model, xTrimoPGLM, to address these two types of tasks simultaneously through an innovative pre-training framework.
no code implementations • 2 Dec 2023 • Shuxian Zou, Hui Li, Shentong Mo, Xingyi Cheng, Eric Xing, Le Song
Predicting the structure of interacting chains is crucial for understanding biological systems and developing new drugs.
no code implementations • NeurIPS 2023 • Jing Gong, Minsheng Hao, Xingyi Cheng, Xin Zeng, Chiming Liu, Jianzhu Ma, Xuegong Zhang, Taifeng Wang, Le Song
Advances in high-throughput sequencing technology have led to significant progress in measuring gene expressions at the single-cell level.
no code implementations • 14 Jan 2023 • Zhihang Hu, Qinze Yu, Yucheng Guo, Taifeng Wang, Irwin King, Xin Gao, Le Song, Yu Li
While previous methods have reported fair performance, their models usually do not take advantage of multi-modal data and cannot handle new drugs or cell lines.
1 code implementation • ICCV 2023 • Yanfeng Zhou, Jiaxing Huang, Chenlong Wang, Le Song, Ge Yang
Perturbations in consistency-based semi-supervised models are often artificially designed.
no code implementations • 30 Nov 2022 • Yining Wang, Xumeng Gong, Shaochuan Li, Bing Yang, YiWu Sun, Chuan Shi, Yangang Wang, Cheng Yang, Hui Li, Le Song
Its improvements in both accuracy and efficiency make it a valuable tool for de novo antibody design and could enable further advances in immunological theory.
1 code implementation • 26 Oct 2022 • Yuchen Zhuang, Yinghao Li, Jerry Junyang Cheung, Yue Yu, Yingjun Mou, Xiang Chen, Le Song, Chao Zhang
We study the problem of extracting N-ary relation tuples from scientific articles.
1 code implementation • 6 Oct 2022 • Ruijia Wang, Xiao Wang, Chuan Shi, Le Song
Recent studies show that graph convolutional network (GCN) often performs worse for low-degree nodes, exhibiting the so-called structural unfairness for graphs with long-tailed degree distributions prevalent in the real world.
1 code implementation • 7 Aug 2022 • Mengyang Liu, Haozheng Luo, Leonard Thong, Yinghao Li, Chao Zhang, Le Song
Compared to frequently used text annotation tools, our annotation tool allows for the development of weak labels in addition to providing a manual annotation experience.
1 code implementation • 28 Jul 2022 • Xiaomin Fang, Fan Wang, Lihang Liu, Jingzhou He, Dayong Lin, Yingfei Xiang, Xiaonan Zhang, Hua Wu, Hui Li, Le Song
Our proposed method, HelixFold-Single, first pre-trains a large-scale protein language model (PLM) on billions of primary sequences using the self-supervised learning paradigm; the PLM is then used as an alternative to MSAs for learning co-evolution information.
no code implementations • 28 Jun 2022 • Mengyang Liu, Shanchuan Li, Xinshi Chen, Le Song
Thus, we propose Graph Condensation via Receptive Field Distribution Matching (GCDM), which optimizes the synthetic graph through a distribution matching loss quantified by maximum mean discrepancy (MMD).
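Since GCDM's objective is built on an MMD loss between receptive-field feature distributions, here is a minimal numpy sketch of a biased RBF-kernel MMD estimate between two sets of feature vectors; the feature matrices and bandwidth are illustrative placeholders, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(X, Y, bandwidth=1.0):
    """Pairwise RBF kernel: k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))."""
    sq_dists = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2(X, Y, bandwidth=1.0):
    """Biased estimate of the squared maximum mean discrepancy between samples X and Y."""
    return (rbf_kernel(X, X, bandwidth).mean()
            + rbf_kernel(Y, Y, bandwidth).mean()
            - 2 * rbf_kernel(X, Y, bandwidth).mean())

# Toy usage: receptive-field features of the original graph vs. a much smaller synthetic graph.
rng = np.random.default_rng(0)
orig_feats = rng.normal(size=(500, 16))    # aggregated neighborhood features, original graph
synth_feats = rng.normal(size=(50, 16))    # features of the condensed graph
print(mmd2(orig_feats, synth_feats))
```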
1 code implementation • 27 May 2022 • Yinghao Li, Le Song, Chao Zhang
Weakly supervised named entity recognition methods train label models to aggregate the token annotations of multiple noisy labeling functions (LFs) without seeing any manually annotated labels.
no code implementations • 11 Feb 2022 • Karan Samel, Zelin Zhao, Binghong Chen, Shuang Li, Dharmashankar Subramanian, Irfan Essa, Le Song
Events across a timeline are a common data representation, seen in different temporal modalities.
no code implementations • NeurIPS 2021 • Zhaozhuo Xu, Beidi Chen, Chaojian Li, Weiyang Liu, Le Song, Yingyan Lin, Anshumali Shrivastava
However, as one of the most influential and practical MT paradigms, iterative machine teaching (IMT) is infeasible on IoT devices due to its inefficient and unscalable algorithms.
no code implementations • NeurIPS 2021 • Jiani Huang, Ziyang Li, Binghong Chen, Karan Samel, Mayur Naik, Le Song, Xujie Si
Deep learning and symbolic reasoning are complementary techniques for an intelligent system.
no code implementations • 30 Nov 2021 • Shuangjia Zheng, Ying Song, Zhang Pan, Chengtao Li, Le Song, Yuedong Yang
Optimizing chemical molecules for desired properties lies at the core of drug development.
no code implementations • NeurIPS 2021 • Xinshi Chen, Haoran Sun, Caleb Ellington, Eric Xing, Le Song
We consider the problem of discovering $K$ related Gaussian directed acyclic graphs (DAGs), where the involved graph structures share a consistent causal order and sparse unions of supports.
no code implementations • NeurIPS 2021 • Sihyun Yu, Sungsoo Ahn, Le Song, Jinwoo Shin
We consider the problem of searching an input maximizing a black-box objective function given a static dataset of input-output queries.
1 code implementation • 26 Oct 2021 • Yu-Ying Liu, Alexander Moreno, Maxwell A. Xu, Shuang Li, Jena C. McDaniel, Nancy C. Brady, Agata Rozga, Fuxin Li, Le Song, James M. Rehg
We solve the first challenge by reformulating the estimation problem as an equivalent discrete time-inhomogeneous hidden Markov model.
no code implementations • ICLR 2022 • Kuan Wang, Yuyu Zhang, Diyi Yang, Le Song, Tao Qin
To open the black box of GNN and investigate these problems, we dissect state-of-the-art GNN modules for QA and analyze their reasoning capability.
Ranked #12 on Question Answering on OpenBookQA
1 code implementation • NeurIPS 2021 • Zelin Zhao, Karan Samel, Binghong Chen, Le Song
Furthermore, we propose the Program-guided Transformer (ProTo), which integrates both semantic and structural guidance of a program by leveraging cross-attention and masked self-attention to pass messages between the specification and routines in the program.
Ranked #1 on Visual Question Answering (VQA) on GQA test-std
no code implementations • ICLR 2022 • Shuang Li, Mingquan Feng, Lu Wang, Abdelmajid Essofi, Yufeng Cao, Junchi Yan, Le Song
We propose a principled method to learn a set of human-readable logic rules to explain temporal point processes.
no code implementations • ICLR 2022 • Sungsoo Ahn, Binghong Chen, Tianzhe Wang, Le Song
In this paper, we explore the problem of generating molecules using deep neural networks, which has recently gained much interest in chemistry.
no code implementations • ICLR 2022 • Xinshi Chen, Haoran Sun, Le Song
In this work, we propose PLISA (Provable Learning-based Iterative Sparse recovery Algorithm) to learn algorithms automatically from data.
2 code implementations • ACL 2021 • Yinghao Li, Pranav Shetty, Lucas Liu, Chao Zhang, Le Song
To address this challenge, we propose a conditional hidden Markov model (CHMM), which can effectively infer true labels from multi-source noisy labels in an unsupervised way.
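As a rough illustration of aggregating multi-source noisy labels with an HMM, the sketch below runs a plain forward-backward pass where each token's emission probability is the product of the labeling functions' likelihoods (conditional independence assumed). It is a simplification: the paper's CHMM additionally conditions its transition and emission probabilities on contextual embeddings, and all parameter names here are made up.

```python
import numpy as np

def noisy_label_marginals(start, transition, emission, lf_obs):
    """Posterior marginals over true labels for one sentence.

    start:      (K,)      prior over the first token's true label
    transition: (K, K)    transition[i, j] = P(z_t = j | z_{t-1} = i)
    emission:   (S, K, L) emission[s, k, l] = P(LF s outputs label l | true label k)
    lf_obs:     (S, T)    observed label index from each labeling function at each token
    """
    S, T = lf_obs.shape
    K = start.shape[0]
    # Likelihood of the LF outputs at each token under each true label,
    # assuming the labeling functions are conditionally independent.
    obs_lik = np.ones((T, K))
    for s in range(S):
        obs_lik *= emission[s][:, lf_obs[s]].T                  # (T, K)

    alpha = np.zeros((T, K))
    beta = np.zeros((T, K))
    alpha[0] = start * obs_lik[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ transition) * obs_lik[t]     # forward pass
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = transition @ (obs_lik[t + 1] * beta[t + 1])   # backward pass
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)             # (T, K) posterior marginals

# Toy usage: 3 labeling functions, 2 true classes, 4 tokens.
start = np.array([0.5, 0.5])
transition = np.array([[0.8, 0.2], [0.2, 0.8]])
emission = np.full((3, 2, 2), 0.2) + 0.6 * np.eye(2)            # each LF is right 80% of the time
lf_obs = np.array([[0, 0, 1, 1], [0, 1, 1, 1], [0, 0, 1, 0]])
print(noisy_label_marginals(start, transition, emission, lf_obs))
```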
3 code implementations • 17 May 2021 • Lu Wang, Xiaofu Chang, Shuang Li, Yunfei Chu, Hui Li, Wei Zhang, Xiaofeng He, Le Song, Jingren Zhou, Hongxia Yang
Secondly, on top of the proposed graph transformer, we introduce a two-stream encoder that separately extracts representations from temporal neighborhoods associated with the two interaction nodes and then utilizes a co-attentional transformer to model inter-dependencies at a semantic level.
no code implementations • 25 Apr 2021 • Yuyu Zhang, Heng Chi, Binghong Chen, Tsz Ling Elaine Tang, Lucia Mirabella, Le Song, Glaucio H. Paulino
We successfully apply our ONSG framework to computational morphogenesis, a representative and challenging class of PDE-constrained optimization problems.
no code implementations • 22 Mar 2021 • Karan Samel, Zelin Zhao, Binghong Chen, Kuan Wang, Robin Luo, Le Song
In multi-modal reasoning tasks, such as visual question answering (VQA), there have been many modeling and training paradigms tested.
no code implementations • 18 Mar 2021 • James Fox, Bo Zhao, Sivasankaran Rajamanickam, Rampi Ramprasad, Le Song
Learning 3D representations that generalize well to arbitrarily oriented inputs is a challenge of practical importance in applications varying from computer vision to physics and chemistry.
1 code implementation • NeurIPS 2021 • Qingru Zhang, David Wipf, Quan Gan, Le Song
Graph neural networks (GNN) have recently emerged as a vehicle for applying deep network architectures to graph and relational data.
no code implementations • 1 Jan 2021 • Karan Samel, Zelin Zhao, Kuan Wang, Robin Luo, Binghong Chen, Le Song
We present a differentiable end-to-end program executor (DePe), which addresses Visual Question Answering (VQA) in a sample and computationally efficient manner.
no code implementations • ICLR 2021 • Binghong Chen, Tianzhe Wang, Chengtao Li, Hanjun Dai, Le Song
Optimizing molecules for desired properties is a fundamental yet challenging task in chemistry, material science and drug discovery.
no code implementations • 1 Jan 2021 • Binghong Chen, Chengtao Li, Hanjun Dai, Rampi Ramprasad, Le Song
We demonstrate that our method is able to propose high-quality polymerization plans for a dataset of 52 real-world polymers, a significant portion of which successfully recover the polymerization processes currently used in the real world.
no code implementations • 1 Jan 2021 • Xinshi Chen, Yan Zhu, Haowen Xu, Muhan Zhang, Liang Xiong, Le Song
We propose a surprisingly simple but effective two-time-scale (2TS) model for learning user representations for recommendation.
no code implementations • NeurIPS Workshop LMCA 2020 • Haoran Sun, Wenbo Chen, Hui Li, Le Song
Branch-and-Bound (B&B) is a general and widely used algorithmic paradigm for solving Mixed Integer Programming (MIP) problems.
no code implementations • NeurIPS Workshop LMCA 2020 • Hanjun Dai, Xinshi Chen, Yu Li, Xin Gao, Le Song
Recently there has been a surge of interest in using graph neural networks (GNNs) to learn algorithms.
no code implementations • NeurIPS 2020 • Yingxiang Yang, Negar Kiyavash, Le Song, Niao He
Macroscopic data aggregated from microscopic events are pervasive in machine learning, such as country-level COVID-19 infection statistics based on city-level data.
1 code implementation • NeurIPS 2020 • Xinshi Chen, Yufei Zhang, Christoph Reisinger, Le Song
Recently, there has been a surge of interest in combining deep learning models with reasoning in order to handle more sophisticated learning tasks.
no code implementations • 4 Nov 2020 • Rohit Batra, Hanjun Dai, Tran Doan Huan, Lihua Chen, Chiho Kim, Will R. Gutekunst, Le Song, Rampi Ramprasad
The design and discovery of new materials is highly non-trivial owing to the near-infinite space of candidate materials and the multiple property/performance objectives that must be met.
no code implementations • EMNLP 2020 • Kunlong Chen, Weidi Xu, Xingyi Cheng, Zou Xiaochuan, Yuyu Zhang, Le Song, Taifeng Wang, Yuan Qi, Wei Chu
Numerical reasoning over texts, such as addition, subtraction, sorting and counting, is a challenging machine reading comprehension task, since it requires both natural language understanding and arithmetic computation.
Ranked #1 on Question Answering on DROP Test
no code implementations • 16 Sep 2020 • Ping Nie, Yuyu Zhang, Arun Ramamurthy, Le Song
Existing approaches for open-domain question answering (QA) are typically designed for questions that require either single-hop or multi-hop reasoning, which makes strong assumptions about the complexity of the questions to be answered.
Ranked #16 on Question Answering on HotpotQA
1 code implementation • ICML 2020 • Binghong Chen, Chengtao Li, Hanjun Dai, Le Song
Retrosynthetic planning is a critical task in organic chemistry which identifies a series of reactions that can lead to the synthesis of a target product.
Ranked #5 on Multi-step retrosynthesis on USPTO-190
1 code implementation • 24 Jun 2020 • Xinshi Chen, Yufei Zhang, Christoph Reisinger, Le Song
Recently, there has been a surge of interest in combining deep learning models with reasoning in order to handle more sophisticated learning tasks.
2 code implementations • NeurIPS 2020 • Ziqi Liu, Zhengwei Wu, Zhiqiang Zhang, Jun Zhou, Shuang Yang, Le Song, Yuan Qi
However, due to the intractable computation of optimal sampling distribution, these sampling algorithms are suboptimal for GCNs and are not applicable to more general graph neural networks (GNNs) where the message aggregator contains learned weights rather than fixed weights, such as Graph Attention Networks (GAT).
Ranked #1 on Node Property Prediction on ogbn-proteins
1 code implementation • ICML 2020 • Xinshi Chen, Hanjun Dai, Yu Li, Xin Gao, Le Song
Similar to algorithms, the optimal depth of a deep architecture may differ across input instances, either to avoid "over-thinking" or because we want to compute less for operations that have already converged.
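The sketch below shows the generic early-exit pattern this entry alludes to: apply layers iteratively and stop once the hidden state stops changing. The convergence test stands in for the learned stopping policy in the paper, and the layer definitions are toy placeholders.

```python
import numpy as np

def run_adaptive_depth(x, layers, tol=1e-4, max_depth=50):
    """Apply layers one at a time and exit as soon as the hidden state stops changing.

    The convergence test below stands in for a learned stopping policy;
    `layers` can be weight-tied (as here) or distinct modules.
    """
    h = x
    depth_used = 0
    for depth, layer in enumerate(layers[:max_depth], start=1):
        h_next = layer(h)
        depth_used = depth
        if np.linalg.norm(h_next - h) < tol:   # "easy" inputs exit early
            return h_next, depth
        h = h_next
    return h, depth_used

# Toy usage: a contracting layer converges quickly, so the loop exits well before max_depth.
W = 0.5 * np.eye(4)
layers = [lambda h: np.tanh(W @ h)] * 50
out, depth_used = run_adaptive_depth(np.ones(4), layers)
print(depth_used)
```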
1 code implementation • ICLR 2020 • Elizabeth Dinella, Hanjun Dai, Ziyang Li, Mayur Naik, Le Song, Ke Wang
We present a learning-based approach to detect and fix a broad range of bugs in Javascript programs.
no code implementations • 19 Apr 2020 • Chao Qu, Hui Li, Chang Liu, Junwu Xiong, James Zhang, Wei Chu, Weiqiang Wang, Yuan Qi, Le Song
We propose a collaborative multi-agent reinforcement learning algorithm named variational policy propagation (VPP) to learn a joint policy through interactions among agents.
1 code implementation • CVPR 2021 • Weiyang Liu, Rongmei Lin, Zhen Liu, James M. Rehg, Liam Paull, Li Xiong, Le Song, Adrian Weller
The inductive bias of a neural network is largely determined by the architecture and the training algorithm.
no code implementations • 28 Feb 2020 • Yuyu Zhang, Ping Nie, Xiubo Geng, Arun Ramamurthy, Le Song, Daxin Jiang
Recent studies on open-domain question answering have achieved prominent performance improvement using pre-trained language models such as BERT.
1 code implementation • 27 Feb 2020 • Ziqi Liu, Chaochao Chen, Xinxing Yang, Jun Zhou, Xiaolong Li, Le Song
We present GEM, the first heterogeneous graph neural network approach for detecting malicious accounts at Alipay, one of the world's leading mobile cashless payment platforms.
1 code implementation • ICLR 2020 • Xinshi Chen, Yu Li, Ramzan Umarov, Xin Gao, Le Song
The key idea of E2Efold is to directly predict the RNA base-pairing matrix, and use an unrolled algorithm for constrained programming as the template for deep architectures to enforce constraints.
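To illustrate the idea of unrolling a constrained-optimization procedure into the architecture, here is a toy loop that repeatedly pushes a raw score matrix toward base-pairing constraints (symmetry, entries in [0, 1], at most one partner per position). This is only a schematic stand-in; E2Efold unrolls a different, learnable algorithm with its own relaxation.

```python
import numpy as np

def unrolled_pairing(scores, n_iters=20):
    """Toy unrolled constraint loop: turn raw pair scores into a matrix that roughly
    satisfies base-pairing constraints. Illustrative only; E2Efold unrolls a
    different, learnable constrained-optimization algorithm."""
    A = scores.copy()
    for _ in range(n_iters):
        A = 0.5 * (A + A.T)                                  # enforce symmetry
        A = np.clip(A, 0.0, 1.0)                             # keep entries in [0, 1]
        row_sums = A.sum(axis=1, keepdims=True)
        A = np.where(row_sums > 1.0, A / row_sums, A)        # at most one partner per base
    return A

rng = np.random.default_rng(0)
pairing = unrolled_pairing(rng.normal(size=(8, 8)))
print(pairing.sum(axis=1).max())                             # close to (or below) 1
```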
1 code implementation • ICLR 2020 • Yuyu Zhang, Xinshi Chen, Yuan Yang, Arun Ramamurthy, Bo Li, Yuan Qi, Le Song
In this paper, we explore the combination of MLNs and GNNs, and use graph neural networks for variational inference in MLN.
1 code implementation • NeurIPS 2019 • Hanjun Dai, Chengtao Li, Connor W. Coley, Bo Dai, Le Song
Retrosynthesis is one of the fundamental problems in organic chemistry.
Ranked #25 on Single-step retrosynthesis on USPTO-50k
no code implementations • 13 Nov 2019 • Xuesong Shi, Dongjiang Li, Pengpeng Zhao, Qinbin Tian, Yuxin Tian, Qiwei Long, Chunhao Zhu, Jingwei Song, Fei Qiao, Le Song, Yangquan Guo, Zhigang Wang, Yimin Zhang, Baoxing Qin, Wei Yang, Fangshi Wang, Rosa H. M. Chan, Qi She
We also design benchmarking metrics for lifelong SLAM, with which the robustness and accuracy of pose estimation are evaluated separately.
1 code implementation • NeurIPS 2019 • Weiyang Liu, Zhen Liu, James M. Rehg, Le Song
By generalizing inner product with a bilinear matrix, we propose the neural similarity which serves as a learnable parametric similarity measure for CNNs.
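The generalized inner product mentioned above is easy to state in code: a bilinear form x^T M y with a learnable matrix M, which reduces to the usual inner product when M is the identity. The sketch below is illustrative only, not the paper's layer.

```python
import numpy as np

def bilinear_similarity(x, y, M):
    """Learnable similarity s(x, y) = x^T M y; with M = I it reduces to the
    ordinary inner product used inside standard convolution layers."""
    return x @ M @ y

rng = np.random.default_rng(0)
x, y = rng.normal(size=8), rng.normal(size=8)
print(np.isclose(bilinear_similarity(x, y, np.eye(8)), x @ y))   # True: inner product is a special case
print(bilinear_similarity(x, y, rng.normal(size=(8, 8))))        # a learned M gives a different measure
```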
1 code implementation • ICLR 2020 • Yuan Yang, Le Song
The capability of making interpretable and self-explanatory decisions is essential for developing responsible machine learning systems.
no code implementations • 25 Sep 2019 • Xiaofu Chang, Jianfeng Wen, Xuqin Liu, Yanming Fang, Le Song, Yuan Qi
To model the dependency between latent dynamic representations of each node, we define a mixture of temporal cascades in which a node's neural representation depends not only on that node's previous representations but also on the previous representations of related nodes that have interacted with it.
no code implementations • ACL 2019 • Yuyu Zhang, Le Song
Sequential recurrent neural networks have achieved superior performance on language modeling, but overlook the structure information in natural language.
no code implementations • 23 Jun 2019 • Jian-Ya Ding, Chao Zhang, Lei Shen, Shengyin Li, Bing Wang, Yinghui Xu, Le Song
In many applications, a similar MIP model is solved on a regular basis, maintaining remarkable similarities in model structures and solution appearances but differing in formulation coefficients.
1 code implementation • CVPR 2020 • Rongmei Lin, Weiyang Liu, Zhen Liu, Chen Feng, Zhiding Yu, James M. Rehg, Li Xiong, Le Song
Inspired by the Thomson problem in physics where the distribution of multiple propelling electrons on a unit sphere can be modeled via minimizing some potential energy, hyperspherical energy minimization has demonstrated its potential in regularizing neural networks and improving their generalization power.
no code implementations • 5 Jun 2019 • Yuyu Zhang, Xinshi Chen, Yuan Yang, Arun Ramamurthy, Bo Li, Yuan Qi, Le Song
Effectively combining logic reasoning and probabilistic inference has been a long-standing goal of machine learning: the former has the ability to generalize with small training data, while the latter provides a principled framework for dealing with noisy data.
1 code implementation • ICLR 2020 • Harsh Shrivastava, Xinshi Chen, Binghong Chen, Guanghui Lan, Srinivas Aluru, Han Liu, Le Song
Recently, there has been a surge of interest in learning algorithms directly from data; in this case, the goal is to learn to map an empirical covariance matrix to the sparse precision matrix.
no code implementations • ICLR 2019 • Xujie Si, Yuan Yang, Hanjun Dai, Mayur Naik, Le Song
Our framework consists of three components: 1) an encoder, which embeds both the logical specification and grammar at the same time using a graph neural network; 2) a grammar adaptive policy network which enables learning a transferable policy; and 3) a reinforcement learning algorithm that jointly trains the specification and grammar embedding and adaptive policy.
1 code implementation • NeurIPS 2019 • Bo Dai, Zhen Liu, Hanjun Dai, Niao He, Arthur Gretton, Le Song, Dale Schuurmans
We present an efficient algorithm for maximum likelihood estimation (MLE) of exponential family models, with a general parametrization of the energy function that includes neural networks.
1 code implementation • ICLR 2020 • Binghong Chen, Bo Dai, Qinjie Lin, Guo Ye, Han Liu, Le Song
We propose a meta path planning algorithm named Neural Exploration-Exploitation Trees (NEXT) that learns from prior experience to solve new path planning problems in high-dimensional continuous state and action spaces.
no code implementations • 7 Feb 2019 • Romain Lopez, Chenchen Li, Xiang Yan, Junwu Xiong, Michael I. Jordan, Yuan Qi, Le Song
We address a practical problem ubiquitous in modern marketing campaigns, in which a central agent tries to learn a policy for allocating strategic financial incentives to customers and observes only bandit feedback.
2 code implementations • 2 Feb 2019 • Xinshi Chen, Hanjun Dai, Le Song
We present a particle flow realization of Bayes' rule, where an ODE-based neural operator is used to transport particles from a prior to its posterior after a new observation.
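A bare-bones version of the transport step described above: particles drawn from the prior are moved by Euler-integrating an ODE whose velocity field depends on the new observation. The `velocity` function here is a hand-written placeholder for the learned neural operator, not the paper's model.

```python
import numpy as np

def flow_particles(particles, observation, velocity, n_steps=20, t_end=1.0):
    """Transport prior particles toward the posterior by Euler-integrating
    dx/dt = velocity(x, observation, t). `velocity` is a placeholder for the
    learned ODE-based neural operator."""
    dt = t_end / n_steps
    x = particles.copy()
    for k in range(n_steps):
        x = x + dt * velocity(x, observation, k * dt)
    return x

# Toy usage with a hand-written velocity field that pulls particles toward the observation.
velocity = lambda x, obs, t: obs - x                      # hypothetical, not a learned operator
prior = np.random.default_rng(0).normal(size=(1000, 2))   # particles from the prior
posterior = flow_particles(prior, np.array([3.0, -1.0]), velocity)
print(posterior.mean(axis=0))
```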
no code implementations • NeurIPS 2019 • Chao Qu, Shie Mannor, Huan Xu, Yuan Qi, Le Song, Junwu Xiong
To the best of our knowledge, it is the first MARL algorithm with convergence guarantee in the control, off-policy and non-linear function approximation setting.
1 code implementation • 27 Dec 2018 • Xinshi Chen, Shuang Li, Hui Li, Shaohua Jiang, Yuan Qi, Le Song
There is great interest, as well as many challenges, in applying reinforcement learning (RL) to recommendation systems.
no code implementations • ICLR 2020 • Hui Li, Kailiang Hu, Zhibang Ge, Tao Jiang, Yuan Qi, Le Song
Counterfactual Regret Minimization (CFR) is a fundamental and effective technique for solving Imperfect Information Games (IIGs).
1 code implementation • NeurIPS 2019 • Albert Shaw, Wei Wei, Weiyang Liu, Le Song, Bo Dai
Neural Architecture Search (NAS) has been quite successful in constructing state-of-the-art models on a variety of tasks.
1 code implementation • NeurIPS 2018 • Xujie Si, Hanjun Dai, Mukund Raghothaman, Mayur Naik, Le Song
A fundamental problem in program verification concerns inferring loop invariants.
1 code implementation • NeurIPS 2018 • Bo Dai, Hanjun Dai, Niao He, Weiyang Liu, Zhen Liu, Jianshu Chen, Lin Xiao, Le Song
This flexible function class couples the variational distribution with the original parameters in the graphical models, allowing end-to-end learning of the graphical models by back-propagation through the variational distribution.
1 code implementation • 26 Nov 2018 • Chenchen Li, Xiang Yan, Xiaotie Deng, Yuan Qi, Wei Chu, Le Song, Junlong Qiao, Jianshan He, Junwu Xiong
Uplift modeling aims to directly model the incremental impact of a treatment on an individual response.
no code implementations • NeurIPS 2018 • Shuang Li, Shuai Xiao, Shixiang Zhu, Nan Du, Yao Xie, Le Song
Social goods, such as healthcare, smart city, and information networks, often produce ordered event data in continuous time.
1 code implementation • 6 Nov 2018 • Bo Dai, Hanjun Dai, Arthur Gretton, Le Song, Dale Schuurmans, Niao He
We investigate penalized maximum log-likelihood estimation for exponential family distributions whose natural parameter resides in a reproducing kernel Hilbert space.
no code implementations • 27 Sep 2018 • Xiaojun Xu, Yue Yu, Bo Li, Le Song, Chengfeng Liu, Carl Gunter
Extensive experiments are conducted to show that the proposed detection mechanism can achieve AUC above 90% against the two attack strategies on both the Cora and Citeseer datasets.
no code implementations • 23 Aug 2018 • Chenchen Li, Xiang Yan, Xiaotie Deng, Yuan Qi, Wei Chu, Le Song, Junlong Qiao, Jianshan He, Junwu Xiong
Then we develop a variant of Latent Dirichlet Allocation (LDA) to infer latent variables under the current market environment, which represents the preferences of customers and strategies of competitors.
1 code implementation • ICLR 2019 • Jianbo Chen, Le Song, Martin J. Wainwright, Michael I. Jordan
We study instancewise feature importance scoring as a method for model interpretation.
no code implementations • ICML 2018 • Hanjun Dai, Zornitsa Kozareva, Bo Dai, Alex Smola, Le Song
Many graph analytics problems can be solved via iterative algorithms where the solutions are often characterized by a set of steady-state conditions.
1 code implementation • ICML 2018 • Hanjun Dai, Hui Li, Tian Tian, Xin Huang, Lin Wang, Jun Zhu, Le Song
Deep learning on graph structures has shown exciting results in various applications.
no code implementations • 31 May 2018 • Yuyu Zhang, Hanjun Dai, Kamil Toraman, Le Song
Our model learns to reason with neural embeddings of both knowledge graphs.
4 code implementations • NeurIPS 2018 • Weiyang Liu, Rongmei Lin, Zhen Liu, Lixin Liu, Zhiding Yu, Bo Dai, Le Song
In light of this intuition, we reduce the redundancy regularization problem to generic energy minimization, and propose a minimum hyperspherical energy (MHE) objective as generic regularization for neural networks.
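A minimal forward-only computation of a hyperspherical-energy regularizer in the spirit of MHE: normalize each neuron's weight vector to the unit sphere and sum a Riesz (s = 1) pairwise potential. Treat it as a sketch, not the paper's exact objective or training recipe.

```python
import numpy as np

def hyperspherical_energy(W, eps=1e-8):
    """Riesz (s = 1) potential energy among the normalized rows of a weight
    matrix W (one neuron per row); smaller energy means the directions are
    more spread out on the unit sphere. Forward computation only."""
    V = W / (np.linalg.norm(W, axis=1, keepdims=True) + eps)   # project rows to the sphere
    dists = np.linalg.norm(V[:, None, :] - V[None, :, :], axis=-1)
    iu = np.triu_indices(len(V), k=1)                          # each pair counted once
    return np.sum(1.0 / (dists[iu] + eps))

rng = np.random.default_rng(0)
print(hyperspherical_energy(rng.normal(size=(16, 64))))        # add this (scaled) to the task loss
```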
no code implementations • 22 May 2018 • Yichen Wang, Le Song, Hongyuan Zha
We first propose a unified KL framework that generalizes existing maximum entropy inverse optimal control methods.
1 code implementation • CVPR 2018 • Weiyang Liu, Zhen Liu, Zhiding Yu, Bo Dai, Rongmei Lin, Yisen Wang, James M. Rehg, Le Song
Inner product-based convolution has been a central component of convolutional neural networks (CNNs) and the key to learning visual representations.
1 code implementation • CVPR 2018 • Yisen Wang, Weiyang Liu, Xingjun Ma, James Bailey, Hongyuan Zha, Le Song, Shu-Tao Xia
We refer to this more complex scenario as the open-set noisy label problem and show that making accurate predictions in this setting is nontrivial.
1 code implementation • ICLR 2018 • Hanjun Dai, Yingtao Tian, Bo Dai, Steven Skiena, Le Song
Deep generative models have been enjoying success in modeling continuous data.
3 code implementations • ICML 2018 • Jianbo Chen, Le Song, Martin J. Wainwright, Michael I. Jordan
We introduce instancewise feature selection as a methodology for model interpretation.
3 code implementations • 3 Feb 2018 • Ziqi Liu, Chaochao Chen, Longfei Li, Jun Zhou, Xiaolong Li, Le Song, Yuan Qi
We present, GeniePath, a scalable approach for learning adaptive receptive fields of neural networks defined on permutation invariant graph data.
no code implementations • ICLR 2018 • Bo Dai, Albert Shaw, Niao He, Lihong Li, Le Song
This paper proposes a new actor-critic-style algorithm called Dual Actor-Critic or Dual-AC.
no code implementations • ICML 2018 • Bo Dai, Albert Shaw, Lihong Li, Lin Xiao, Niao He, Zhen Liu, Jianshu Chen, Le Song
When function approximation is used, solving the Bellman optimality equation with stability guarantees has remained a major open problem in reinforcement learning for decades.
no code implementations • NeurIPS 2017 • Yichen Wang, Xiaojing Ye, Hongyuan Zha, Le Song
Point processes are powerful tools to model user activities and have a plethora of applications in social sciences.
no code implementations • NeurIPS 2017 • Weiyang Liu, Yan-Ming Zhang, Xingguo Li, Zhiding Yu, Bo Dai, Tuo Zhao, Le Song
In light of such challenges, we propose hyperspherical convolution (SphereConv), a novel learning framework that gives angular representations on hyperspheres.
2 code implementations • ICML 2018 • Jianfei Chen, Jun Zhu, Le Song
Previous attempts at reducing the receptive field size by subsampling neighbors do not have a convergence guarantee, and their receptive field size per node is still on the order of hundreds.
no code implementations • ICML 2018 • Weiyang Liu, Bo Dai, Xingguo Li, Zhen Liu, James M. Rehg, Le Song
We propose an active teacher model that can actively query the learner (i.e., make the learner take exams) for estimating the learner's status and provably guide the learner to achieve faster convergence.
1 code implementation • 12 Sep 2017 • Yuyu Zhang, Hanjun Dai, Zornitsa Kozareva, Alexander J. Smola, Le Song
Knowledge graph (KG) is known to be helpful for the task of question answering (QA), since it provides well-structured relational information between entities, and allows one to further infer indirect facts.
1 code implementation • 22 Aug 2017 • Xiaojun Xu, Chang Liu, Qian Feng, Heng Yin, Le Song, Dawn Song
The problem of cross-platform binary code similarity detection aims at detecting whether two binary functions coming from different platforms are similar or not.
no code implementations • NeurIPS 2017 • Le Song, Santosh Vempala, John Wilmes, Bo Xie
Moreover, this hard family of functions is realizable with a small (sublinear in dimension) number of activation units in the single hidden layer.
2 code implementations • ICML 2017 • Weiyang Liu, Bo Dai, Ahmad Humayun, Charlene Tay, Chen Yu, Linda B. Smith, James M. Rehg, Le Song
Different from traditional machine teaching which views the learners as batch algorithms, we study a new paradigm where the learner uses an iterative algorithm and a teacher can feed examples sequentially and intelligently based on the current performance of the learner.
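The sketch below illustrates the omniscient-teacher flavour of iterative machine teaching for a least-squares learner: at every round the teacher scans a candidate pool and feeds the example whose single SGD step brings the learner closest to the target weights. The pool construction, step size and noiseless labels are toy assumptions, not the paper's setup.

```python
import numpy as np

def teach(w_star, pool_X, pool_y, eta=0.1, n_rounds=200, seed=0):
    """Toy omniscient teacher for a least-squares learner: at each round, pick
    from the pool the example whose single SGD step moves the learner's
    weights closest to the target w_star, then feed it to the learner."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=w_star.shape)
    for _ in range(n_rounds):
        residuals = pool_X @ w - pool_y                  # per-example prediction errors
        grads = residuals[:, None] * pool_X              # gradient of 0.5*(x.w - y)^2 for each example
        candidates = w - eta * grads                     # learner state after each possible choice
        best = np.argmin(np.linalg.norm(candidates - w_star, axis=1))
        w = candidates[best]
    return w

rng = np.random.default_rng(1)
w_star = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(500, 3))
y = X @ w_star                                           # noiseless labels, a toy assumption
print(np.linalg.norm(teach(w_star, X, y) - w_star))      # should be close to zero
```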
1 code implementation • NeurIPS 2017 • Shuai Xiao, Mehrdad Farajtabar, Xiaojing Ye, Junchi Yan, Le Song, Hongyuan Zha
Point processes are becoming very popular in modeling asynchronous sequential data due to their sound mathematical foundation and strength in modeling a variety of real-world phenomena.
2 code implementations • ICML 2017 • Rakshit Trivedi, Hanjun Dai, Yichen Wang, Le Song
The occurrence of a fact (edge) is modeled as a multivariate point process whose intensity function is modulated by the score for that fact computed based on the learned entity embeddings.
20 code implementations • CVPR 2017 • Weiyang Liu, Yandong Wen, Zhiding Yu, Ming Li, Bhiksha Raj, Le Song
This paper addresses deep face recognition (FR) problem under open-set protocol, where ideal face features are expected to have smaller maximal intra-class distance than minimal inter-class distance under a suitably chosen metric space.
Ranked #1 on Face Verification on CK+
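The face-recognition entry above learns features under an angular margin. Below is a simplified, forward-only numpy sketch of angular-margin logits: class weights are normalized to the unit sphere and the target class uses cos(m*theta) instead of cos(theta). The real method uses a piecewise monotonic psi(theta) with annealing during training, so this is only illustrative.

```python
import numpy as np

def angular_margin_logits(features, W, target=None, margin=2):
    """Simplified angular-margin logits: class weights live on the unit sphere,
    logits are ||x|| * cos(theta) for non-target classes and ||x|| * cos(m * theta)
    for the target class. Forward pass only."""
    Wn = W / np.linalg.norm(W, axis=0, keepdims=True)          # (d, C) unit-norm class weights
    x_norm = np.linalg.norm(features, axis=1, keepdims=True)   # (N, 1)
    cos = (features @ Wn) / np.clip(x_norm, 1e-8, None)        # (N, C) cosine similarities
    logits = x_norm * cos
    if target is not None:
        idx = np.arange(len(features))
        theta = np.arccos(np.clip(cos[idx, target], -1.0, 1.0))
        logits[idx, target] = x_norm[:, 0] * np.cos(margin * theta)
    return logits

rng = np.random.default_rng(0)
feats, W, y = rng.normal(size=(4, 8)), rng.normal(size=(8, 10)), np.array([1, 3, 3, 7])
print(angular_margin_logits(feats, W, target=y).shape)         # (4, 10)
```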
8 code implementations • NeurIPS 2017 • Hanjun Dai, Elias B. Khalil, Yuyu Zhang, Bistra Dilkina, Le Song
The design of good heuristics or approximation algorithms for NP-hard combinatorial optimization problems often requires significant specialized knowledge and trial-and-error.
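Learned heuristics of the kind described above typically plug a graph-embedding-based scoring function into a greedy construction loop. The scaffold below shows that loop for a vertex-cover-style problem, with a hand-written degree score standing in for the learned scoring function; nothing here is the paper's code.

```python
def greedy_construct(n_nodes, edges, score_fn):
    """Generic greedy construction (vertex-cover flavour): repeatedly add the
    highest-scoring node until every edge is covered. In the learned setting,
    score_fn would be a graph-embedding Q-function; here it is hand-written."""
    uncovered = set(map(tuple, edges))
    solution = []
    while uncovered:
        candidates = [v for v in range(n_nodes) if v not in solution]
        best = max(candidates, key=lambda v: score_fn(v, uncovered))
        solution.append(best)
        uncovered = {e for e in uncovered if best not in e}
    return solution

# Hand-written stand-in for a learned scoring function: degree among uncovered edges.
coverage_score = lambda v, uncovered: sum(v in e for e in uncovered)

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
print(greedy_construct(5, edges, coverage_score))   # e.g. [2, 0, 3], a valid vertex cover
```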
no code implementations • 24 Mar 2017 • Shuai Xiao, Junchi Yan, Mehrdad Farajtabar, Le Song, Xiaokang Yang, Hongyuan Zha
A variety of real-world processes (over networks) produce sequences of data whose complex temporal dynamics need to be studied.
no code implementations • ICML 2017 • Mehrdad Farajtabar, Jiachen Yang, Xiaojing Ye, Huan Xu, Rakshit Trivedi, Elias Khalil, Shuang Li, Le Song, Hongyuan Zha
We propose the first multistage intervention framework that tackles fake news in social networks by combining reinforcement learning with a point process network activity model.
1 code implementation • 28 Feb 2017 • Kenji Kawaguchi, Bo Xie, Vikas Verma, Le Song
For deep models, with no unrealistic assumptions, we prove universal approximation ability, a lower bound on approximation error, a partial optimization guarantee, and a generalization bound.
no code implementations • ICML 2017 • Yichen Wang, Grady Williams, Evangelos Theodorou, Le Song
Temporal point processes have been widely applied to model event sequence data generated by online users.
2 code implementations • ICML 2017 • Bo Dai, Ruiqi Guo, Sanjiv Kumar, Niao He, Le Song
Learning-based binary hashing has become a powerful paradigm for fast search and retrieval in massive databases.
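A minimal sketch of the sign-of-projection encoding and Hamming-distance lookup that learning-based binary hashing builds on; the random projection here merely stands in for whatever encoder the hashing model actually learns.

```python
import numpy as np

def binary_hash(X, projection):
    """Sign-of-linear-projection hashing: map each row of X to a +/-1 code.
    The fixed random projection stands in for whatever encoder is learned."""
    return np.sign(X @ projection)

def hamming_search(query_code, db_codes, k=5):
    """Indices of the k database items closest to the query in Hamming distance."""
    dists = (db_codes != query_code).sum(axis=1)
    return np.argsort(dists)[:k]

rng = np.random.default_rng(0)
P = rng.normal(size=(128, 32))                 # 128-d features -> 32-bit codes
db = rng.normal(size=(10_000, 128))
codes = binary_hash(db, P)
query_code = binary_hash(db[42:43], P)[0]
print(hamming_search(query_code, codes))       # item 42 should rank first
```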
no code implementations • 8 Dec 2016 • Nan Du, Yingyu Liang, Maria-Florina Balcan, Manuel Gomez-Rodriguez, Hongyuan Zha, Le Song
A typical viral marketing model identifies influential users in a social network to maximize a single product adoption assuming unlimited user attention, campaign budgets, and time.
no code implementations • NeurIPS 2016 • Yichen Wang, Nan Du, Rakshit Trivedi, Le Song
Matching users to the right items at the right time is a fundamental task in recommendation systems.
1 code implementation • 21 Nov 2016 • Edward Choi, Mohammad Taha Bahadori, Le Song, Walter F. Stewart, Jimeng Sun
Interpretation: The representations learned by deep learning methods should align with medical knowledge.
no code implementations • 9 Nov 2016 • Bo Xie, Yingyu Liang, Le Song
In this paper, we answer these questions by analyzing one-hidden-layer neural networks with ReLU activation, and show that despite the non-convexity, neural networks with diverse units have no spurious local minima.
no code implementations • 24 Oct 2016 • Behzad Tabibian, Isabel Valera, Mehrdad Farajtabar, Le Song, Bernhard Schölkopf, Manuel Gomez-Rodriguez
Then, we propose a temporal point process modeling framework that links these temporal traces to robust, unbiased and interpretable notions of information reliability and source trustworthiness.
no code implementations • 14 Oct 2016 • Shuang Li, Yao Xie, Le Song
We present a novel distribution-free approach, the data-driven threshold machine (DTM), for a fundamental problem at the core of many learning tasks: choose a threshold for a given pre-specified level that bounds the tail probability of the maximum of a (possibly dependent but stationary) random sequence.
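For intuition about choosing a threshold that bounds the tail probability of a running maximum, the sketch below uses a crude empirical stand-in: take block maxima over the history and return their (1 - alpha) quantile. The paper's DTM constructs the threshold from extreme-value-type approximations instead, so this is not its algorithm.

```python
import numpy as np

def block_max_threshold(series, horizon, alpha=0.05):
    """Crude empirical stand-in: split the history into blocks of length `horizon`,
    take each block's maximum, and return the (1 - alpha) quantile of those maxima
    as a threshold b with P(max of the next `horizon` values > b) roughly alpha."""
    series = np.asarray(series)
    n_blocks = len(series) // horizon
    maxima = series[: n_blocks * horizon].reshape(n_blocks, horizon).max(axis=1)
    return np.quantile(maxima, 1 - alpha)

rng = np.random.default_rng(0)
history = rng.normal(size=100_000)                       # stationary toy sequence
print(block_max_threshold(history, horizon=100, alpha=0.05))
```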
no code implementations • 13 Sep 2016 • Hanjun Dai, Yichen Wang, Rakshit Trivedi, Le Song
DeepCoevolve uses recurrent neural networks (RNNs) over evolving networks to define the intensity function in point processes, which allows the model to capture complex mutual influence between users and items, and the feature evolution over time.
no code implementations • 3 Aug 2016 • Niao He, Zaid Harchaoui, Yichen Wang, Le Song
Since almost all gradient-based optimization algorithms rely on Lipschitz-continuity, optimizing Poisson likelihood models with a guarantee of convergence can be challenging, especially for large-scale problems.
no code implementations • 15 Jul 2016 • Bo Dai, Niao He, Yunpeng Pan, Byron Boots, Le Song
In such problems, each sample $x$ itself is associated with a conditional distribution $p(z|x)$ represented by samples $\{z_i\}_{i=1}^M$, and the goal is to learn a function $f$ that links these conditional distributions to target values $y$.
no code implementations • 22 May 2016 • Mohammad Reza Karimi, Erfan Tavakoli, Mehrdad Farajtabar, Le Song, Manuel Gomez-Rodriguez
Many users in online social networks are constantly trying to gain attention from their followers by broadcasting posts to them.
no code implementations • 29 Mar 2016 • Shuang Li, Yao Xie, Mehrdad Farajtabar, Apurv Verma, Le Song
Large volumes of networked streaming event data are becoming increasingly available in a wide variety of applications, such as social network analysis, Internet traffic monitoring and healthcare analytics.
1 code implementation • 17 Mar 2016 • Hanjun Dai, Bo Dai, Le Song
Kernel classifiers and regressors designed for structured data, such as sequences, trees and graphs, have significantly advanced a number of interdisciplinary areas such as computational biology and drug design.
no code implementations • NeurIPS 2015 • Nan Du, Yichen Wang, Niao He, Jimeng Sun, Le Song
By making personalized suggestions, a recommender system is playing a crucial role in improving the engagement of users in modern web-services.
no code implementations • NeurIPS 2015 • Yu-Ying Liu, Shuang Li, Fuxin Li, Le Song, James M. Rehg
The Continuous-Time Hidden Markov Model (CT-HMM) is an attractive approach to modeling disease progression due to its ability to describe noisy observations arriving irregularly in time.
no code implementations • NeurIPS 2015 • Shuang Li, Yao Xie, Hanjun Dai, Le Song
Detecting the emergence of an abrupt change-point is a classic problem in statistics and machine learning.
no code implementations • 13 Nov 2015 • Mehrdad Farajtabar, Safoora Yousefi, Long Q. Tran, Le Song, Hongyuan Zha
In our experiments, we demonstrate that our algorithm is able to achieve state-of-the-art performance in terms of analyzing, predicting, and prioritizing events.
no code implementations • 1 Sep 2015 • Yao Xie, Ruiyang Song, Hanjun Dai, Qingbin Li, Le Song
The optimization problem for OSDR is non-convex and hard to analyze in general; we provide convergence analysis of OSDR in a simple linear regression setting.
1 code implementation • NeurIPS 2015 • Mehrdad Farajtabar, Yichen Wang, Manuel Gomez Rodriguez, Shuang Li, Hongyuan Zha, Le Song
Information diffusion in online social networks is affected by the underlying network topology, but it also has the power to change it.
1 code implementation • 5 Jul 2015 • Shuang Li, Yao Xie, Hanjun Dai, Le Song
A novel theoretical result of the paper is the characterization of the tail probability of these statistics using the change-of-measure technique, which focuses on characterizing the tail of the detection statistics rather than obtaining its asymptotic distribution under the null distribution.
no code implementations • 9 Jun 2015 • Bo Dai, Niao He, Hanjun Dai, Le Song
Bayesian methods are appealing in their flexibility in modeling complex data and ability in capturing uncertainty in parameters.
no code implementations • NeurIPS 2015 • Bo Xie, Yingyu Liang, Le Song
We propose a simple, computationally efficient, and memory friendly algorithm based on the "doubly stochastic gradients" to scale up a range of kernel nonlinear component analysis, such as kernel PCA, CCA and SVD.
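A simplified stand-in for the idea of scaling up kernel component analysis with stochastic approximations: fixed random Fourier features combined with Oja's streaming update for the top principal direction (feature-space centering omitted). The actual doubly stochastic gradient method redraws random features at every iteration and works with functional gradients, so treat this as a sketch only.

```python
import numpy as np

def rff(X, omegas, biases):
    """Random Fourier features approximating an RBF kernel."""
    return np.sqrt(2.0 / omegas.shape[1]) * np.cos(X @ omegas + biases)

def streaming_kpca_top1(X, n_features=256, bandwidth=1.0, eta=0.05, seed=0):
    """Top kernel principal direction via fixed random features plus Oja's rule,
    one sample at a time (feature-space centering omitted for brevity)."""
    rng = np.random.default_rng(seed)
    omegas = rng.normal(scale=1.0 / bandwidth, size=(X.shape[1], n_features))
    biases = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    w = rng.normal(size=n_features)
    w /= np.linalg.norm(w)
    for x in X:
        phi = rff(x[None, :], omegas, biases)[0]
        w += eta * phi * (phi @ w)          # Oja update toward the top eigenvector
        w /= np.linalg.norm(w)
    return w, omegas, biases

rng = np.random.default_rng(1)
X = rng.normal(size=(5_000, 10))
w, omegas, biases = streaming_kpca_top1(X)
scores = rff(X, omegas, biases) @ w         # first kernel principal component scores
print(scores[:5])
```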
no code implementations • 23 Mar 2015 • Maria-Florina Balcan, Yingyu Liang, Le Song, David Woodruff, Bo Xie
Can we perform kernel PCA on the entire dataset in a distributed and communication efficient fashion while maintaining provable and strong guarantees in solution quality?
1 code implementation • ICCV 2015 • Zichao Yang, Marcin Moczulski, Misha Denil, Nando de Freitas, Alex Smola, Le Song, Ziyu Wang
The fully connected layers of a deep convolutional neural network typically contain over 90% of the network parameters, and consume the majority of the memory required to store the network parameters.
Ranked #43 on Image Classification on MNIST
no code implementations • 19 Dec 2014 • Zichao Yang, Alexander J. Smola, Le Song, Andrew Gordon Wilson
Kernel methods have great promise for learning rich statistical representations of large modern datasets.
no code implementations • NeurIPS 2014 • Nan Du, Yingyu Liang, Maria-Florina F. Balcan, Le Song
Coverage functions are an important class of discrete functions that capture laws of diminishing returns.
no code implementations • NeurIPS 2014 • Mehrdad Farajtabar, Nan Du, Manuel Gomez Rodriguez, Isabel Valera, Hongyuan Zha, Le Song
Events in an online social network can be categorized roughly into endogenous events, where users just respond to the actions of their neighbors within the network, or exogenous events, where users take actions due to drives external to the network.
1 code implementation • NeurIPS 2014 • Bo Dai, Bo Xie, Niao He, Yingyu Liang, Anant Raj, Maria-Florina Balcan, Le Song
The general perception is that kernel methods are not scalable, and neural nets are the methods of choice for nonlinear learning problems.
no code implementations • NeurIPS 2014 • Maria-Florina Balcan, Chris Berlind, Avrim Blum, Emma Cohen, Kaushik Patnaik, Le Song
We examine an important setting for engineered systems in which low-power distributed sensors are each making highly noisy measurements of some unknown target function.
no code implementations • 12 May 2014 • Hadi Daneshmand, Manuel Gomez-Rodriguez, Le Song, Bernhard Schölkopf
Can we recover the hidden network structures from these observed cascades?
no code implementations • 16 Jan 2014 • Le Song, Han Liu, Ankur Parikh, Eric Xing
Tree structured graphical models are powerful at expressing long range or hierarchical dependency among many variables, and have been widely applied in different areas of computer science and statistics.
no code implementations • 8 Dec 2013 • Nan Du, Yingyu Liang, Maria-Florina Balcan, Le Song
The typical algorithmic problem in viral marketing aims to identify a set of influential users in a social network, who, when convinced to adopt a product, shall influence other users in the network and trigger a large cascade of adoptions.
no code implementations • NeurIPS 2013 • Le Song, Bo Dai
Kernel embedding of distributions has led to many recent advances in machine learning.
no code implementations • NeurIPS 2013 • Nan Du, Le Song, Manuel Gomez Rodriguez, Hongyuan Zha
If a piece of information is released from a media site, can it spread, in 1 month, to a million web pages?
no code implementations • 13 Nov 2013 • Le Song, Animashree Anandkumar, Bo Dai, Bo Xie
We establish that the sample complexity for the proposed method is quadratic in the number of latent components and is a low order polynomial in the other relevant parameters.
no code implementations • 7 Oct 2013 • Alekh Agarwal, Sham M. Kakade, Nikos Karampatziakis, Le Song, Gregory Valiant
This work provides simple algorithms for multi-class (and multi-label) prediction in settings where both the number of examples n and the data dimension d are relatively large.
no code implementations • NeurIPS 2012 • Nan Du, Le Song, Ming Yuan, Alex J. Smola
However, the underlying transmission networks are often hidden and incomplete, and we observe only the time stamps when cascades of events happen.
no code implementations • NeurIPS 2011 • Animashree Anandkumar, Kamalika Chaudhuri, Daniel J. Hsu, Sham M. Kakade, Le Song, Tong Zhang
The setting is one where we only have samples from certain observed variables in the tree, and our goal is to estimate the tree structure (i.e., the graph of how the underlying hidden variables are connected to each other and to the observed variables).
no code implementations • NeurIPS 2011 • Kenji Fukumizu, Le Song, Arthur Gretton
A nonparametric kernel-based method for realizing Bayes' rule is proposed, based on kernel representations of probabilities in reproducing kernel Hilbert spaces.
no code implementations • NeurIPS 2011 • Le Song, Eric P. Xing, Ankur P. Parikh
Latent tree graphical models are natural tools for expressing long range and hierarchical dependencies among many variables which are common in computer vision, bioinformatics and natural language processing problems.
no code implementations • NeurIPS 2009 • Mladen Kolar, Le Song, Eric P. Xing
In this paper, we investigate sparsistent learning of a sub-family of this model: piecewise constant VCVS models.
no code implementations • NeurIPS 2009 • Le Song, Mladen Kolar, Eric P. Xing
In this paper, we propose a time-varying dynamic Bayesian network (TV-DBN) for modeling the structurally varying directed dependency structures underlying non-stationary biological/neural time series.
no code implementations • 31 Dec 2008 • Eric P. Xing, Wenjie Fu, Le Song
In a dynamic social or biological environment, the interactions between the actors can undergo large and systematic changes.
no code implementations • NeurIPS 2008 • Novi Quadrianto, Le Song, Alex J. Smola
Object matching is a fundamental operation in data analysis.
no code implementations • NeurIPS 2008 • Xinhua Zhang, Le Song, Arthur Gretton, Alex J. Smola
Many machine learning algorithms can be formulated in the framework of statistical independence such as the Hilbert Schmidt Independence Criterion.