Search Results for author: Lior Horesh

Found 32 papers, 6 papers with code

AI Hilbert: A New Paradigm for Scientific Discovery by Unifying Data and Background Knowledge

no code implementations18 Aug 2023 Ryan Cory-Wright, Bachir El Khadir, Cristina Cornelio, Sanjeeb Dash, Lior Horesh

The discovery of scientific formulae that parsimoniously explain natural phenomena and align with existing background theory is a key goal in science.

Value-based Fast and Slow AI Nudging

no code implementations14 Jul 2023 Marianna B. Ganapini, Francesco Fabiano, Lior Horesh, Andrea Loreggia, Nicholas Mattei, Keerthiram Murugesan, Vishal Pallagani, Francesca Rossi, Biplav Srivastava, Brent Venable

Values that are relevant to a specific decision scenario are used to decide when and how to use each of these nudging modalities.

Plansformer: Generating Symbolic Plans using Transformers

no code implementations16 Dec 2022 Vishal Pallagani, Bharath Muppasani, Keerthiram Murugesan, Francesca Rossi, Lior Horesh, Biplav Srivastava, Francesco Fabiano, Andrea Loreggia

Large Language Models (LLMs) have been the subject of active research, significantly advancing the field of Natural Language Processing (NLP).

Question Answering · Text Generation +2

Bayesian Experimental Design for Symbolic Discovery

no code implementations29 Nov 2022 Kenneth L. Clarkson, Cristina Cornelio, Sanjeeb Dash, Joao Goncalves, Lior Horesh, Nimrod Megiddo

This study concerns the formulation and application of Bayesian optimal experimental design to symbolic discovery, which is the inference from observational data of predictive models taking general functional forms.

Experimental Design · Numerical Integration

Topological data analysis on noisy quantum computers

no code implementations19 Sep 2022 Ismail Yunus Akhalwaya, Shashanka Ubaru, Kenneth L. Clarkson, Mark S. Squillante, Vishnu Jejjala, Yang-Hui He, Kugendran Naidoo, Vasileios Kalantzis, Lior Horesh

In this study, we present NISQ-TDA, a fully implemented end-to-end quantum machine learning algorithm that needs only a short circuit depth, is applicable to high-dimensional classical data, and has a provable asymptotic speedup for certain classes of problems.

Quantum Machine Learning · Topological Data Analysis

Distributed Adversarial Training to Robustify Deep Neural Networks at Scale

2 code implementations13 Jun 2022 Gaoyuan Zhang, Songtao Lu, Yihua Zhang, Xiangyi Chen, Pin-Yu Chen, Quanfu Fan, Lee Martie, Lior Horesh, Mingyi Hong, Sijia Liu

Spurred by that, we propose distributed adversarial training (DAT), a large-batch adversarial training framework implemented over multiple machines.

Distributed Optimization
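The inner-max/outer-min structure of adversarial training can be sketched in a few lines of NumPy. This is a toy single-machine illustration of the general technique, not the DAT distributed framework; the logistic model, perturbation radius, and step sizes are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated Gaussian blobs with labels 0 and 1.
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b = np.zeros(2), 0.0
eps, alpha, pgd_steps, lr = 0.1, 0.05, 5, 0.5

for _ in range(300):
    # Inner maximization: PGD in the l_inf ball of radius eps around each input.
    X_adv = X.copy()
    for _ in range(pgd_steps):
        g_x = (sigmoid(X_adv @ w + b) - y)[:, None] * w  # d(loss)/d(input)
        X_adv = np.clip(X_adv + alpha * np.sign(g_x), X - eps, X + eps)
    # Outer minimization: gradient step on the adversarially perturbed batch.
    err = sigmoid(X_adv @ w + b) - y
    w -= lr * (X_adv.T @ err) / len(y)
    b -= lr * err.mean()

clean_acc = float(np.mean((sigmoid(X @ w + b) > 0.5) == (y == 1)))
print("clean accuracy after adversarial training:", clean_acc)
```

The large-batch, multi-machine aspects that DAT addresses (gradient aggregation, quantization) sit on top of exactly this min-max loop.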

PCENet: High Dimensional Surrogate Modeling for Learning Uncertainty

no code implementations10 Feb 2022 Paz Fink Shustin, Shashanka Ubaru, Vasileios Kalantzis, Lior Horesh, Haim Avron

In this paper, we present a novel surrogate model for representation learning and uncertainty quantification, which aims to deal with data of moderate to high dimensions.

Dimensionality Reduction · Representation Learning +2

AI Descartes: Combining Data and Theory for Derivable Scientific Discovery

1 code implementation3 Sep 2021 Cristina Cornelio, Sanjeeb Dash, Vernon Austel, Tyler Josephson, Joao Goncalves, Kenneth Clarkson, Nimrod Megiddo, Bachir El Khadir, Lior Horesh

We develop a method to enable principled derivations of models of natural phenomena from axiomatic knowledge and experimental data by combining logical reasoning with symbolic regression.

Automated Theorem Proving · BIG-bench Machine Learning +2

Quantum Topological Data Analysis with Linear Depth and Exponential Speedup

no code implementations5 Aug 2021 Shashanka Ubaru, Ismail Yunus Akhalwaya, Mark S. Squillante, Kenneth L. Clarkson, Lior Horesh

In this paper, we completely overhaul the QTDA algorithm to achieve an improved exponential speedup and depth complexity of $O(n\log(1/(\delta\epsilon)))$.

Quantum Machine Learning · Topological Data Analysis

E-PDDL: A Standardized Way of Defining Epistemic Planning Problems

no code implementations19 Jul 2021 Francesco Fabiano, Biplav Srivastava, Jonathan Lenchner, Lior Horesh, Francesca Rossi, Marianna Bergamaschi Ganapini

Epistemic Planning (EP) refers to an automated planning setting where the agent reasons in the space of knowledge states and tries to find a plan to reach a desirable state from the current state.

Denoising quantum states with Quantum Autoencoders -- Theory and Applications

1 code implementation29 Dec 2020 Tom Achache, Lior Horesh, John Smolin

We implement a Quantum Autoencoder (QAE) as a quantum circuit capable of correcting Greenberger-Horne-Zeilinger (GHZ) states subject to various noisy quantum channels: the bit-flip channel and the more general quantum depolarizing channel.

Denoising
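As a point of reference for the noise model, here is a minimal NumPy sketch of a 3-qubit GHZ state passing through independent bit-flip channels. This simulates only the noise the QAE is trained to correct, not the autoencoder circuit itself; the flip probability p = 0.1 is an illustrative choice:

```python
import numpy as np

# |GHZ> = (|000> + |111>)/sqrt(2) on 3 qubits (8-dimensional state vector).
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)
rho = np.outer(ghz, ghz)

X = np.array([[0.0, 1.0], [1.0, 0.0]])  # Pauli-X
I2 = np.eye(2)

def bitflip_channel(rho, qubit, p, n=3):
    """Bit-flip channel on one qubit: rho -> (1-p) rho + p X rho X."""
    ops = [I2] * n
    ops[qubit] = X
    Xq = ops[0]
    for op in ops[1:]:
        Xq = np.kron(Xq, op)
    return (1 - p) * rho + p * Xq @ rho @ Xq

p = 0.1
for q in range(3):
    rho = bitflip_channel(rho, q, p)

# Only "no qubit flipped" and "all three flipped" map GHZ back to itself,
# so the fidelity is (1-p)^3 + p^3.
fidelity = float(ghz @ rho @ ghz)
print("fidelity with the ideal GHZ state:", round(fidelity, 3))  # → 0.73
```

A denoiser such as the QAE aims to push this fidelity back toward 1.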

Overcoming Catastrophic Forgetting via Direction-Constrained Optimization

1 code implementation25 Nov 2020 Yunfei Teng, Anna Choromanska, Murray Campbell, Songtao Lu, Parikshit Ram, Lior Horesh

We study the principal directions of the trajectory of the optimizer after convergence and show that traveling along a few top principal directions can quickly bring the parameters outside the cone, but this is not the case for the remaining directions.

Continual Learning
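The basic object here, the principal directions of an optimizer's parameter trajectory, can be computed with a plain SVD. A deterministic toy sketch on a quadratic loss (not the paper's direction-constrained optimizer; the curvatures and step count are illustrative):

```python
import numpy as np

# Quadratic loss with very different curvatures per coordinate; gradient
# descent then moves quickly along some directions and slowly along others.
H = np.diag([10.0, 1.0, 0.1, 0.01])
theta = np.ones(4)

lr = 0.05
trajectory = []
for _ in range(200):
    theta = theta - lr * (H @ theta)      # gradient descent step
    trajectory.append(theta.copy())

T = np.array(trajectory)
T = T - T.mean(axis=0)                    # center the iterates
# Principal directions of the optimizer trajectory via SVD.
_, s, Vt = np.linalg.svd(T, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print("variance explained by the top direction:", round(float(explained[0]), 3))
```

A small number of top directions dominates the trajectory's variance, which is the structure the continual-learning method exploits.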

Quantum-Inspired Algorithms from Randomized Numerical Linear Algebra

no code implementations9 Nov 2020 Nadiia Chepurko, Kenneth L. Clarkson, Lior Horesh, Honghao Lin, David P. Woodruff

We create classical (non-quantum) dynamic data structures supporting queries for recommender systems and least-squares regression that are comparable to their quantum analogues.

Recommendation Systems
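A staple primitive behind such quantum-inspired classical algorithms is length-squared (row-norm) sampling. The sketch below applies it to least-squares regression; it illustrates the sampling idea only, not the paper's dynamic data structures, and the problem sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdetermined least-squares instance: A x ≈ b.
n, d = 5000, 5
A = rng.normal(size=(n, d))
x_true = np.arange(1.0, d + 1)
b = A @ x_true + 0.01 * rng.normal(size=n)

# Length-squared sampling: keep row i with probability proportional to
# ||a_i||^2, reweighted so the sampled problem is an unbiased sketch.
probs = np.sum(A ** 2, axis=1)
probs /= probs.sum()
m = 500
idx = rng.choice(n, size=m, p=probs)
scale = 1.0 / np.sqrt(m * probs[idx])
A_s, b_s = A[idx] * scale[:, None], b[idx] * scale

x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_samp, *_ = np.linalg.lstsq(A_s, b_s, rcond=None)
rel_err = np.linalg.norm(x_samp - x_full) / np.linalg.norm(x_full)
print("relative error of the sampled solution:", rel_err)
```

Solving the 500-row sampled problem closely approximates the 5000-row solution, which is the cost saving such data structures make queryable.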

Projection techniques to update the truncated SVD of evolving matrices

no code implementations13 Oct 2020 Vassilis Kalantzis, Georgios Kollias, Shashanka Ubaru, Athanasios N. Nikolakopoulos, Lior Horesh, Kenneth L. Clarkson

This paper considers the problem of updating the rank-k truncated Singular Value Decomposition (SVD) of matrices subject to the addition of new rows and/or columns over time.

Recommendation Systems
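A simple projection-style update of this kind can be sketched directly in NumPy. This is a generic illustration of updating a truncated SVD after row additions (assuming, for simplicity, that the new rows lie near the existing row space), not the specific projection techniques of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k, s = 200, 100, 10, 20

# Low-rank-plus-noise matrix and its rank-k truncated SVD.
G, W = rng.normal(size=(m, k)), rng.normal(size=(k, n))
A = G @ W + 0.01 * rng.normal(size=(m, n))
U, S, Vt = np.linalg.svd(A, full_matrices=False)
Sk, Vkt = S[:k], Vt[:k]

# Newly arriving rows, lying (approximately) in A's row space.
B = rng.normal(size=(s, k)) @ W

# Update: replace A by its rank-k factorization and take the SVD of the
# much smaller stacked matrix [diag(S_k) V_k^T ; B].
stacked = np.vstack([np.diag(Sk) @ Vkt, B])
S_upd = np.linalg.svd(stacked, compute_uv=False)

# Reference: exact singular values of the full updated matrix [A ; B].
S_exact = np.linalg.svd(np.vstack([A, B]), compute_uv=False)

rel_err = np.abs(S_upd[:k] - S_exact[:k]) / S_exact[:k]
print("max relative error in the top-k singular values:", rel_err.max())
```

The update operates on a (k+s) × n matrix rather than the full (m+s) × n matrix, which is where the savings come from as rows keep arriving.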

Dynamic graph and polynomial chaos based models for contact tracing data analysis and optimal testing prescription

no code implementations10 Sep 2020 Shashanka Ubaru, Lior Horesh, Guy Cohen

Thus, estimation of state uncertainty is paramount both for imminent risk assessment and for closing the tracing-testing loop by optimal testing prescription.

Uncertainty Quantification

Symbolic Regression using Mixed-Integer Nonlinear Optimization

no code implementations11 Jun 2020 Vernon Austel, Cristina Cornelio, Sanjeeb Dash, Joao Goncalves, Lior Horesh, Tyler Josephson, Nimrod Megiddo

The Symbolic Regression (SR) problem, where the goal is to find a regression function that does not have a pre-specified form but is any function that can be composed of a list of operators, is a hard problem in machine learning, both theoretically and computationally.

Regression · Symbolic Regression
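To make the search space concrete, the sketch below brute-forces a tiny expression grammar against synthetic data. This is a toy exhaustive search, not the paper's mixed-integer nonlinear programming formulation (whose point is to certify optimality without enumeration); the primitives and target formula are illustrative:

```python
import itertools
import numpy as np

x = np.linspace(-2, 2, 50)
y = x ** 2 + np.sin(x)                   # hidden ground-truth formula

# Tiny expression grammar: binary operators over a handful of primitives.
primitives = {"x": x, "x^2": x ** 2, "sin(x)": np.sin(x), "1": np.ones_like(x)}
operators = [("+", np.add), ("*", np.multiply)]

best = (np.inf, None)
for (na, a), (nb, b) in itertools.product(primitives.items(), repeat=2):
    for op_name, op in operators:
        expr, vals = f"{na} {op_name} {nb}", op(a, b)
        mse = float(np.mean((vals - y) ** 2))
        if mse < best[0]:
            best = (mse, expr)

print("best expression:", best[1], "| mse:", best[0])
```

Even this depth-one grammar has dozens of candidates; the combinatorial growth with depth is what makes SR hard and motivates optimization-based formulations.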

Dynamic Graph Convolutional Networks Using the Tensor M-Product

1 code implementation ICLR 2020 Osman Asif Malik, Shashanka Ubaru, Lior Horesh, Misha E. Kilmer, Haim Avron

In recent years, a variety of graph neural networks (GNNs) have been successfully applied for representation learning and prediction on such graphs.

Edge Classification · Link Prediction +1

Recurrent Neural Networks in the Eye of Differential Equations

no code implementations29 Apr 2019 Murphy Yuezhen Niu, Lior Horesh, Isaac Chuang

To understand the fundamental trade-offs between training stability, temporal dynamics and architectural complexity of recurrent neural networks~(RNNs), we directly analyze RNN architectures using numerical methods of ordinary differential equations~(ODEs).

Numerical Integration
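The core correspondence, that a vanilla RNN update is a forward-Euler discretization of an ODE, fits in a few lines. The weights, step sizes, and input signal below are illustrative, and this shows only the discretization view, not the paper's stability analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
W = rng.normal(scale=0.5, size=(d, d))   # recurrent weights (illustrative)
U = rng.normal(scale=0.5, size=(d, 1))   # input weights (illustrative)

def rnn_euler(h0, xs, dt):
    """Forward-Euler discretization of h'(t) = tanh(W h + U x), i.e. the
    vanilla-RNN-style update h_{t+1} = h_t + dt * tanh(W h_t + U x_t)."""
    h = h0
    for x in xs:
        h = h + dt * np.tanh(W @ h + U @ x)
    return h

xs = [np.array([np.sin(0.1 * t)]) for t in range(100)]
h_small = rnn_euler(np.zeros(d), xs, dt=0.01)  # fine step: near-continuous flow
h_large = rnn_euler(np.zeros(d), xs, dt=1.0)   # coarse step: standard RNN regime
print("||h|| with dt=0.01:", np.linalg.norm(h_small))
print("||h|| with dt=1.0 :", np.linalg.norm(h_large))
```

Viewing the step size dt as part of the architecture is what lets ODE stability theory speak to RNN training stability.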

Stable Tensor Neural Networks for Rapid Deep Learning

no code implementations15 Nov 2018 Elizabeth Newman, Lior Horesh, Haim Avron, Misha Kilmer

To exemplify the elegant, matrix-mimetic algebraic structure of our $t$-NNs, we expand on recent work (Haber and Ruthotto, 2017) which interprets deep neural networks as discretizations of non-linear differential equations and introduces stable neural networks which promote superior generalization.

Unification of Recurrent Neural Network Architectures and Quantum Inspired Stable Design

no code implementations27 Sep 2018 Murphy Yuezhen Niu, Lior Horesh, Michael O'Keeffe, Isaac Chuang

We show that most of the existing proposals of RNN architectures belong to different orders of $n$-$t$-ORNNs.

Should You Derive, Or Let the Data Drive? An Optimization Framework for Hybrid First-Principles Data-Driven Modeling

no code implementations12 Nov 2017 Remi R. Lam, Lior Horesh, Haim Avron, Karen E. Willcox

This work takes a different perspective and targets the construction of a correction model operator with implicit attributes.

Decision Making

Globally Optimal Symbolic Regression

no code implementations29 Oct 2017 Vernon Austel, Sanjeeb Dash, Oktay Gunluk, Lior Horesh, Leo Liberti, Giacomo Nannicini, Baruch Schieber

In this study we introduce a new technique for symbolic regression that guarantees global optimality.

Regression · Symbolic Regression

Breaking the 49-Qubit Barrier in the Simulation of Quantum Circuits

4 code implementations16 Oct 2017 Edwin Pednault, John A. Gunnels, Giacomo Nannicini, Lior Horesh, Thomas Magerlein, Edgar Solomonik, Robert Wisnieff

With the current rate of progress in quantum computing technologies, 50-qubit systems will soon become a reality.

Quantum Physics

Image classification using local tensor singular value decompositions

no code implementations29 Jun 2017 Elizabeth Newman, Misha Kilmer, Lior Horesh

From linear classifiers to neural networks, image classification has been a widely explored topic in mathematics, and many algorithms have proven to be effective classifiers.

Classification · General Classification +3

Experimental Design for Non-Parametric Correction of Misspecified Dynamical Models

no code implementations2 May 2017 Gal Shulkind, Lior Horesh, Haim Avron

We consider a class of misspecified dynamical models where the governing term is only approximately known.

Experimental Design

Accelerating Hessian-free optimization for deep neural networks by implicit preconditioning and sampling

no code implementations5 Sep 2013 Tara N. Sainath, Lior Horesh, Brian Kingsbury, Aleksandr Y. Aravkin, Bhuvana Ramabhadran

This study aims to speed up Hessian-free training, both by decreasing the amount of data used for training and by reducing the number of Krylov subspace solver iterations used for implicit estimation of the Hessian.
