Search Results for author: Jingling Li

Found 10 papers, 6 papers with code

CSRec: Rethinking Sequential Recommendation from A Causal Perspective

1 code implementation • 23 Aug 2024 • Xiaoyu Liu, Jiaxin Yuan, YuHang Zhou, Jingling Li, Furong Huang, Wei Ai

The essence of sequential recommender systems (RecSys) lies in understanding how users make decisions.

Sequential Recommendation

How to Solve Contextual Goal-Oriented Problems with Offline Datasets?

1 code implementation • 14 Aug 2024 • Ying Fan, Jingling Li, Adith Swaminathan, Aditya Modi, Ching-An Cheng

We present a novel method, Contextual goal-Oriented Data Augmentation (CODA), which uses commonly available unlabeled trajectories and context-goal pairs to solve Contextual Goal-Oriented (CGO) problems.

Data Augmentation

Hindsight Learning for MDPs with Exogenous Inputs

1 code implementation • 13 Jul 2022 • Sean R. Sinclair, Felipe Frujeri, Ching-An Cheng, Luke Marshall, Hugo Barbalho, Jingling Li, Jennifer Neville, Ishai Menache, Adith Swaminathan

Many resource management problems require sequential decision-making under uncertainty, where the only uncertainty affecting the decision outcomes comes from exogenous variables outside the control of the decision-maker.

counterfactual • Decision Making +4

How Does a Neural Network's Architecture Impact Its Robustness to Noisy Labels?

no code implementations • NeurIPS 2021 • Jingling Li, Mozhi Zhang, Keyulu Xu, John P. Dickerson, Jimmy Ba

Our framework measures a network's robustness via the predictive power in its representations -- the test performance of a linear model trained on the learned representations using a small set of clean labels.

Learning with noisy labels
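
A minimal sketch of the linear-probe measurement described in the abstract above, using synthetic placeholder data and scikit-learn; the representation arrays, label sets, and split sizes are illustrative assumptions, not the paper's actual networks or datasets.

```python
# Hypothetical sketch: score representation quality with a linear probe
# trained on a small clean-labeled subset (placeholder data, not the paper's setup).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-ins for representations extracted from a network trained on noisy labels.
reps_clean = rng.normal(size=(200, 64))        # small clean-labeled probe set
labels_clean = rng.integers(0, 10, size=200)   # its (clean) labels
reps_test = rng.normal(size=(1000, 64))        # held-out test representations
labels_test = rng.integers(0, 10, size=1000)

# Fit a linear model on the clean subset and score it on the test set;
# higher probe accuracy indicates more predictive (robust) representations.
probe = LogisticRegression(max_iter=1000).fit(reps_clean, labels_clean)
print("probe accuracy:", probe.score(reps_test, labels_test))
```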

How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks

3 code implementations • ICLR 2021 • Keyulu Xu, Mozhi Zhang, Jingling Li, Simon S. Du, Ken-ichi Kawarabayashi, Stefanie Jegelka

Second, in connection to analyzing the successes and limitations of GNNs, these results suggest a hypothesis for which we provide theoretical and empirical evidence: the success of GNNs in extrapolating algorithmic tasks to new data (e.g., larger graphs or edge weights) relies on encoding task-specific non-linearities in the architecture or features.
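
As a hedged illustration of the hypothesis above, the sketch below uses a hypothetical toy graph to show how a min-aggregation message-passing step mirrors the Bellman-Ford shortest-path update, so the task's non-linearity sits in the aggregation rather than being left for the network to learn; the graph, shapes, and update rule are illustrative assumptions, not code from the paper.

```python
# Illustrative only: a message-passing step with min-aggregation matches the
# Bellman-Ford update d[u] = min_v (d[v] + w(v, u)), which is the kind of
# task-specific non-linearity the hypothesis says should be encoded
# in the architecture or features for extrapolation to larger graphs.
import numpy as np

INF = np.inf
# Toy weighted graph as an adjacency matrix (INF = no edge), source node 0.
w = np.array([[0.0, 2.0, INF],
              [2.0, 0.0, 5.0],
              [INF, 5.0, 0.0]])
dist = np.array([0.0, INF, INF])

# Each round of min-aggregation "message passing" is one Bellman-Ford step.
for _ in range(len(dist) - 1):
    dist = np.min(dist[None, :] + w.T, axis=1)

print(dist)  # [0. 2. 7.]
```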

Understanding Generalization in Deep Learning via Tensor Methods

no code implementations • 14 Jan 2020 • Jingling Li, Yanchao Sun, Jiahao Su, Taiji Suzuki, Furong Huang

Recently proposed complexity measures have provided insights into the generalizability of neural networks from the perspectives of PAC-Bayes, robustness, overparametrization, compression, and so on.

Deep Learning

Tensorial Neural Networks: Generalization of Neural Networks and Application to Model Compression

no code implementations • 25 May 2018 • Jiahao Su, Jingling Li, Bobby Bhattacharjee, Furong Huang

We propose tensorial neural networks (TNNs), a generalization of existing neural networks by extending tensor operations on low-order operands to those on high-order ones.

Model Compression • Tensor Decomposition
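
The sketch below is an illustrative example (not the paper's TNN layer definitions) of the generalization described in the abstract above: a dense layer contracts an order-2 weight with an order-1 input, and a tensorial layer replaces this with a contraction over higher-order operands; all shapes and the einsum pattern are assumptions made for the example.

```python
# Illustrative sketch: a dense layer is a contraction between an order-2 weight
# and an order-1 input; a "tensorial" layer generalizes this to a contraction
# over higher-order operands (shapes chosen arbitrarily for the example).
import numpy as np

rng = np.random.default_rng(0)

# Standard dense layer: order-2 weight times order-1 input.
x_vec = rng.normal(size=16)
W_mat = rng.normal(size=(8, 16))
y_vec = W_mat @ x_vec                            # output shape (8,)

# Higher-order analogue: order-4 weight contracted with an order-2 input.
x_ten = rng.normal(size=(4, 4))                  # input viewed as a 4x4 tensor
W_ten = rng.normal(size=(2, 4, 4, 4))            # weight contracts both input modes
y_ten = np.einsum('abcd,cd->ab', W_ten, x_ten)   # output is a 2x4 tensor

print(y_vec.shape, y_ten.shape)                  # (8,) (2, 4)
```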
