Search Results for author: Kaustubh Sridhar

Found 8 papers, 5 papers with code

Memory-Consistent Neural Networks for Imitation Learning

no code implementations • 9 Oct 2023 • Kaustubh Sridhar, Souradeep Dutta, Dinesh Jayaraman, James Weimer, Insup Lee

Imitation learning considerably simplifies policy synthesis compared to alternative approaches by exploiting access to expert demonstrations.

Imitation Learning

Guaranteed Conformance of Neurosymbolic Models to Natural Constraints

1 code implementation • 2 Dec 2022 • Kaustubh Sridhar, Souradeep Dutta, James Weimer, Insup Lee

Next, using these memories, we partition the state space into disjoint subsets and compute bounds that should be respected by the neural network in each subset.
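The excerpt above describes the core mechanism: memory points induce a partition of the state space, and the network's output is forced to respect a bound computed for each partition cell. Below is a minimal sketch of that idea, not the authors' released code; `memories`, `bound_fn`, and `net` are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): partition the state space by
# nearest "memory" point and clamp a network's output to a per-partition
# bound, illustrating the idea quoted in the abstract.
import numpy as np

class ConstrainedPredictor:
    def __init__(self, memories, bound_fn, net):
        self.memories = np.asarray(memories)       # representative states (n, d)
        # Per-memory (lower, upper) output bounds derived from the constraint.
        self.bounds = [bound_fn(m) for m in self.memories]
        self.net = net                             # any callable: state -> output

    def __call__(self, x):
        x = np.asarray(x)
        # Disjoint subsets: each state belongs to its nearest memory (Voronoi cell).
        idx = np.argmin(np.linalg.norm(self.memories - x, axis=1))
        lo, hi = self.bounds[idx]
        # The prediction is clipped so the bound for this subset is respected.
        return np.clip(self.net(x), lo, hi)
```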

Predict-and-Critic: Accelerated End-to-End Predictive Control for Cloud Computing through Reinforcement Learning

no code implementations • 2 Dec 2022 • Kaustubh Sridhar, Vikramank Singh, Balakrishnan Narayanaswamy, Abishek Sankararaman

PnC jointly trains a prediction model and a terminal Q function that approximates cost-to-go over a long horizon, by back-propagating the cost of decisions through the optimization problem and from the future.

Cloud Computing, Model Predictive Control
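The abstract excerpt above sketches the training signal for Predict-and-Critic (PnC): a learned prediction model and a terminal Q function are updated jointly by back-propagating the planned trajectory's cost through the optimizer and through the terminal cost-to-go. The code below is only a hedged illustration of that structure, not the paper's implementation: the inner optimization is approximated by a short unrolled gradient descent over actions so everything stays differentiable, and `dynamics`, `terminal_q`, and `stage_cost` are hypothetical placeholders.

```python
# Hedged sketch of joint prediction-model + terminal-Q training (not PnC itself).
import torch
import torch.nn as nn

state_dim, action_dim, horizon = 4, 2, 5

dynamics = nn.Sequential(nn.Linear(state_dim + action_dim, 64), nn.ReLU(),
                         nn.Linear(64, state_dim))           # learned prediction model
terminal_q = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                           nn.Linear(64, 1))                  # terminal cost-to-go

def stage_cost(s, a):
    return (s ** 2).sum(-1) + 0.1 * (a ** 2).sum(-1)          # placeholder stage cost

def plan(s0, steps=10, lr=0.1):
    """Differentiable stand-in for the optimization problem: unrolled gradient descent on actions."""
    actions = torch.zeros(horizon, action_dim, requires_grad=True)
    for _ in range(steps):
        s, total = s0, 0.0
        for a in actions:
            total = total + stage_cost(s, a)
            s = s + dynamics(torch.cat([s, a]))
        total = total + terminal_q(s).squeeze()               # cost "from the future"
        (g,) = torch.autograd.grad(total, actions, create_graph=True)
        actions = actions - lr * g
    return actions

opt = torch.optim.Adam(list(dynamics.parameters()) + list(terminal_q.parameters()), lr=1e-3)
s0 = torch.randn(state_dim)
actions = plan(s0)

# Re-evaluate the planned trajectory and back-propagate its cost end to end,
# so gradients reach both networks through the unrolled planner and the terminal Q.
s, loss = s0, 0.0
for a in actions:
    loss = loss + stage_cost(s, a)
    s = s + dynamics(torch.cat([s, a]))
loss = loss + terminal_q(s).squeeze()
opt.zero_grad()
loss.backward()
opt.step()
```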

CODiT: Conformal Out-of-Distribution Detection in Time-Series Data

1 code implementation • 24 Jul 2022 • Ramneet Kaur, Kaustubh Sridhar, Sangdon Park, Susmit Jha, Anirban Roy, Oleg Sokolsky, Insup Lee

Machine learning models are prone to making incorrect predictions on inputs that are far from the training distribution.

Anomaly Detection, Autonomous Driving, +6

Towards Alternative Techniques for Improving Adversarial Robustness: Analysis of Adversarial Training at a Spectrum of Perturbations

1 code implementation • 13 Jun 2022 • Kaustubh Sridhar, Souradeep Dutta, Ramneet Kaur, James Weimer, Oleg Sokolsky, Insup Lee

Algorithm design of adversarial training (AT) and its variants is focused on training models at a specified perturbation strength $\epsilon$ and only using the feedback from the performance of that $\epsilon$-robust model to improve the algorithm.

Adversarial Robustness, Quantization
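The excerpt above contrasts standard adversarial training, which commits to a single perturbation strength $\epsilon$, with training informed by a spectrum of perturbations. The sketch below only illustrates that contrast and is not the paper's algorithm: it runs ordinary PGD adversarial training but samples $\epsilon$ from a range each batch; `model`, `loader`, and the range are hypothetical.

```python
# Hedged sketch: PGD adversarial training with eps sampled from a range
# each batch, illustrating "a spectrum of perturbations" vs. one fixed eps.
import random
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps, alpha, steps=7):
    """L_inf PGD: iterative signed-gradient steps kept inside the eps-ball around x."""
    delta = torch.zeros_like(x, requires_grad=True)
    for _ in range(steps):
        loss = F.cross_entropy(model(x + delta), y)
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta + alpha * grad.sign()).clamp(-eps, eps).detach().requires_grad_(True)
    return (x + delta).detach()

def train_epoch(model, loader, opt, eps_range=(2 / 255, 16 / 255)):
    for x, y in loader:
        eps = random.uniform(*eps_range)          # spectrum instead of one fixed eps
        x_adv = pgd_attack(model, x, y, eps, alpha=eps / 4)
        opt.zero_grad()
        F.cross_entropy(model(x_adv), y).backward()
        opt.step()
```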

Real-Time Detectors for Digital and Physical Adversarial Inputs to Perception Systems

no code implementations • 23 Feb 2020 • Yiannis Kantaros, Taylor Carpenter, Kaustubh Sridhar, Yahan Yang, Insup Lee, James Weimer

To highlight this, we demonstrate the efficiency of the proposed detector on ImageNet, a task that is computationally challenging for the majority of relevant defenses, and on physically attacked traffic signs that may be encountered in real-time autonomy applications.
