Search Results for author: Marin Soljačić

Found 24 papers, 15 papers with code

TENG: Time-Evolving Natural Gradient for Solving PDEs with Deep Neural Net

no code implementations16 Apr 2024 Zhuo Chen, Jacob McCarran, Esteban Vizcaino, Marin Soljačić, Di Luo

Partial differential equations (PDEs) are instrumental for modeling dynamical systems in science and engineering.

Multimodal Learning for Materials

no code implementations30 Nov 2023 Viggo Moro, Charlotte Loh, Rumen Dangovski, Ali Ghorashi, Andrew Ma, Zhuo Chen, Samuel Kim, Peter Y. Lu, Thomas Christensen, Marin Soljačić

Artificial intelligence is transforming computational materials science, improving the prediction of material properties, and accelerating the discovery of novel materials.

Property Prediction

ANTN: Bridging Autoregressive Neural Networks and Tensor Networks for Quantum Many-Body Simulation

1 code implementation NeurIPS 2023 Zhuo Chen, Laker Newhouse, Eddie Chen, Di Luo, Marin Soljačić

Quantum many-body physics simulation has important impacts on understanding fundamental science and has applications to quantum materials design and quantum technology.

Inductive Bias · Tensor Networks

Q-Flow: Generative Modeling for Differential Equations of Open Quantum Dynamics with Normalizing Flows

no code implementations23 Feb 2023 Owen Dugan, Peter Y. Lu, Rumen Dangovski, Di Luo, Marin Soljačić

Studying the dynamics of open quantum systems can enable breakthroughs both in fundamental physics and applications to quantum engineering and quantum computation.

Learning to Optimize Quasi-Newton Methods

no code implementations11 Oct 2022 Isaac Liao, Rumen R. Dangovski, Jakob N. Foerster, Marin Soljačić

This paper introduces a novel machine learning optimizer called LODO, which meta-learns the best preconditioner online during optimization.
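
A much simpler relative of learning a preconditioner online is the classic delta-bar-delta / Rprop signal: grow a per-parameter step size while successive gradients agree in sign, shrink it when they flip. The sketch below is purely illustrative (a toy quadratic, hypothetical constants), not the LODO algorithm:

```python
import numpy as np

# Badly scaled quadratic: one fixed step size cannot suit all coordinates,
# but a learned diagonal preconditioner can adapt to each scale.
d = np.array([100.0, 1.0, 0.01])
f = lambda x: 0.5 * np.sum(d * x**2)
grad = lambda x: d * x

x = np.ones(3)
p = np.full(3, 1e-4)                 # diagonal preconditioner, learned online
g_prev = grad(x)
for _ in range(300):
    g = grad(x)
    # Meta-learning signal: grow p while successive gradients agree in sign,
    # shrink it when they flip (delta-bar-delta / Rprop-style rule).
    p *= np.where(g * g_prev > 0, 1.1, 0.9)
    x -= p * g
    g_prev = g
print(f(np.ones(3)), f(x))           # loss drops from 50.505 to ~0
```

Each coordinate's step size settles near the inverse of its curvature, which is exactly the job of a preconditioner.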

Discovering Conservation Laws using Optimal Transport and Manifold Learning

1 code implementation31 Aug 2022 Peter Y. Lu, Rumen Dangovski, Marin Soljačić

We test this new approach on a variety of physical systems and demonstrate that our method is able to both identify the number of conserved quantities and extract their values.
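
As a toy illustration of what a conserved quantity looks like numerically (unrelated to the paper's optimal-transport machinery), the energy of a harmonic oscillator stays constant along a trajectory, and a symplectic integrator preserves it to within the step size:

```python
# Harmonic oscillator: dq/dt = p, dp/dt = -q; energy E = (p**2 + q**2)/2 is conserved.
dt, steps = 1e-3, 10_000
q, p = 1.0, 0.0
E0 = 0.5 * (p**2 + q**2)
for _ in range(steps):
    p -= dt * q          # symplectic Euler: update momentum first,
    q += dt * p          # then position with the *new* momentum
E_end = 0.5 * (p**2 + q**2)
print(E0, E_end)         # energy drifts by less than ~dt over the whole run
```

A method that extracts conserved quantities from data is, in effect, discovering functions like `E` that behave this way along every trajectory.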

Deep Learning and Symbolic Regression for Discovering Parametric Equations

1 code implementation1 Jul 2022 Michael Zhang, Samuel Kim, Peter Y. Lu, Marin Soljačić

Symbolic regression is a machine learning technique that can learn the governing formulas of data and thus has the potential to transform scientific discovery.

BIG-bench Machine Learning · regression +1
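
One minimal flavour of "learning the governing formula of data" is a sparse linear fit over a hand-picked library of candidate terms (in the spirit of SINDy, not this paper's architecture); the library and target formula below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = 3.0 * np.sin(x) + 0.5 * x**2          # hidden governing formula

# Candidate terms a symbolic regressor could search over (hand-picked library)
library = {"1": np.ones_like(x), "x": x, "x^2": x**2,
           "sin(x)": np.sin(x), "exp(x)": np.exp(x)}
Phi = np.column_stack(list(library.values()))
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Keep only terms with non-negligible weight: the recovered expression
terms = {name: round(float(c), 3) for name, c in zip(library, coef) if abs(c) > 1e-6}
print(terms)                              # {'x^2': 0.5, 'sin(x)': 3.0}
```

Because the result is an explicit formula rather than a black-box fit, it can be inspected and extrapolated, which is the appeal for scientific discovery.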

Topogivity: A Machine-Learned Chemical Rule for Discovering Topological Materials

1 code implementation10 Feb 2022 Andrew Ma, Yang Zhang, Thomas Christensen, Hoi Chun Po, Li Jing, Liang Fu, Marin Soljačić

Topological materials present unconventional electronic properties that make them attractive for both basic science and next-generation technological applications.

End-to-End Optimization of Metasurfaces for Imaging with Compressed Sensing

no code implementations28 Jan 2022 Gaurav Arya, William F. Li, Charles Roques-Carmes, Marin Soljačić, Steven G. Johnson, Zin Lin

We present a framework for the end-to-end optimization of metasurface imaging systems that reconstruct targets using compressed sensing, a technique for solving underdetermined imaging problems when the target object exhibits sparsity (i.e., the object can be described by a small number of non-zero values, but the positions of these values are unknown).

Object Super-Resolution
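
The sparsity prior can be exploited with an L1-regularized solver; below is a generic sketch (plain ISTA on a hypothetical random Gaussian measurement matrix, not the paper's metasurface pipeline):

```python
import numpy as np

def ista(A, y, lam=0.01, n_iter=1000):
    """Iterative shrinkage-thresholding: min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L           # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
n, m = 100, 40                                  # 40 measurements of a length-100 signal
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 30, 77]] = [1.0, -2.0, 1.5]          # 3 unknown non-zero positions
y = A @ x_true
x_hat = ista(A, y)
print(np.flatnonzero(np.abs(x_hat) > 0.1))      # indices of the recovered non-zeros
```

Even though the system is underdetermined (40 equations, 100 unknowns), the L1 penalty recovers the correct support, which is the mechanism the imaging framework builds on.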

Meta-Learning and Self-Supervised Pretraining for Real World Image Translation

no code implementations22 Dec 2021 Ileana Rugina, Rumen Dangovski, Mark Veillette, Pooya Khorrami, Brian Cheung, Olga Simek, Marin Soljačić

In recent years, emerging fields such as meta-learning and self-supervised learning have been closing the gap between proof-of-concept results and real-life applications of machine learning by extending deep learning to the semi-supervised and few-shot domains.

Image-to-Image Translation · Meta-Learning +2

Equivariant Contrastive Learning

2 code implementations28 Oct 2021 Rumen Dangovski, Li Jing, Charlotte Loh, Seungwook Han, Akash Srivastava, Brian Cheung, Pulkit Agrawal, Marin Soljačić

In state-of-the-art self-supervised learning (SSL), pre-training produces semantically good representations by encouraging them to be invariant under meaningful transformations prescribed by human knowledge.

Contrastive Learning · Self-Supervised Learning

Discovering Sparse Interpretable Dynamics from Partial Observations

1 code implementation22 Jul 2021 Peter Y. Lu, Joan Ariño, Marin Soljačić

Identifying the governing equations of a nonlinear dynamical system is key to both understanding the physical features of the system and constructing an accurate model of the dynamics that generalizes well beyond the available data.

Deep Learning for Bayesian Optimization of Scientific Problems with High-Dimensional Structure

2 code implementations23 Apr 2021 Samuel Kim, Peter Y. Lu, Charlotte Loh, Jamie Smith, Jasper Snoek, Marin Soljačić

Bayesian optimization (BO) is a popular paradigm for global optimization of expensive black-box functions, but there are many domains where the function is not completely a black-box.

Bayesian Optimization · Gaussian Processes
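
A minimal from-scratch version of the vanilla BO loop this paper builds beyond (an RBF-kernel GP surrogate plus expected improvement, on a hypothetical 1-D toy objective) looks like this:

```python
import numpy as np
from scipy.stats import norm

def rbf(X1, X2, ls=0.3):
    return np.exp(-0.5 * (X1[:, None] - X2[None, :]) ** 2 / ls**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks, L = rbf(X, Xs), np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v**2, axis=0)          # rbf(x, x) = 1 on the diagonal
    return Ks.T @ alpha, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):    # minimisation convention
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

f = lambda x: np.sin(3 * x) + (x - 1) ** 2    # toy black-box objective on [0, 2]
Xs = np.linspace(0, 2, 200)
X = np.array([0.1, 1.0, 1.9]); y = f(X)       # three initial evaluations
for _ in range(10):                           # ten BO iterations
    mu, sigma = gp_posterior(X, y, Xs)
    x_next = Xs[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print(f"best x = {X[np.argmin(y)]:.2f}, f = {y.min():.3f}")
```

On this toy function the loop reliably concentrates its evaluations around the global minimum near x ≈ 1.47 within a handful of queries, which is the sample efficiency that makes BO attractive for expensive experiments.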

Casimir light in dispersive nanophotonics

no code implementations4 Jan 2021 Jamison Sloan, Nicholas Rivera, John D. Joannopoulos, Marin Soljačić

Despite interest in their potential applications as sources of quantum light, DVEs are generally very weak, providing many opportunities for enhancement through modern techniques in nanophotonics, such as using media which support excitations such as plasmon and phonon polaritons.

Optics

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

1 code implementation18 Jul 2020 Guillem Ramírez, Rumen Dangovski, Preslav Nakov, Marin Soljačić

We believe that our rethinking of the Wasserstein-Procrustes problem could enable further research, thus helping to develop better algorithms for aligning word embeddings across languages.

Word Embeddings
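
The classical (non-Wasserstein) building block here is orthogonal Procrustes: given two embedding matrices already in correspondence, the rotation minimising the Frobenius misalignment has a closed form via the SVD. A self-contained sketch on synthetic "embeddings":

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 50))                 # source-language embeddings
Q_true, _ = np.linalg.qr(rng.standard_normal((50, 50)))
Y = X @ Q_true                                     # target embeddings = rotated source

# Orthogonal Procrustes: the Q minimising ||XQ - Y||_F is U @ Vt,
# where U, Vt come from the SVD of X^T Y (Schönemann's solution).
U, _, Vt = np.linalg.svd(X.T @ Y)
Q = U @ Vt
print(np.allclose(X @ Q, Y))                       # True: the rotation is recovered
```

The hard part tackled by Wasserstein-Procrustes is that, across languages, the row correspondence between `X` and `Y` is itself unknown and must be optimised jointly with `Q`.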

Contextualizing Enhances Gradient Based Meta Learning

no code implementations17 Jul 2020 Evan Vogelbaum, Rumen Dangovski, Li Jing, Marin Soljačić

We propose the implementation of contextualizers, which are generalizable prototypes that adapt to given examples and play a larger role in classification for gradient-based models.

Few-Shot Learning
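
For context, the plain prototypes that contextualizers generalize (as in prototypical networks) classify a query by its nearest class mean over a small support set; a sketch with hypothetical 2-D features:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two classes of 2-D features; a prototype is the mean of each class's support set
support = {0: rng.normal([0, 0], 0.3, (5, 2)),
           1: rng.normal([3, 3], 0.3, (5, 2))}
protos = {c: s.mean(axis=0) for c, s in support.items()}

def classify(x):
    """Assign the query to the class of the nearest prototype."""
    return min(protos, key=lambda c: np.linalg.norm(x - protos[c]))

print(classify(np.array([0.2, -0.1])), classify(np.array([2.8, 3.1])))  # 0 1
```

A static class mean plays no role beyond distance computation; the paper's contextualizers instead adapt to the given examples, giving the prototype a larger role in classification.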

OccamNet: A Fast Neural Model for Symbolic Regression at Scale

4 code implementations16 Jul 2020 Owen Dugan, Rumen Dangovski, Allan Costa, Samuel Kim, Pawan Goyal, Joseph Jacobson, Marin Soljačić

Neural networks' expressiveness comes at the cost of complex, black-box models that often extrapolate poorly beyond the domain of the training dataset, conflicting with the goal of finding compact analytic expressions to describe scientific data.

Image Classification · regression +1

Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery

1 code implementation10 Dec 2019 Samuel Kim, Peter Y. Lu, Srijon Mukherjee, Michael Gilbert, Li Jing, Vladimir Čeperić, Marin Soljačić

We find that the EQL-based architecture can extrapolate quite well outside of the training data set compared to a standard neural network-based architecture, paving the way for deep learning to be applied in scientific exploration and discovery.

Explainable Models · regression +1

Extracting Interpretable Physical Parameters from Spatiotemporal Systems using Unsupervised Learning

1 code implementation13 Jul 2019 Peter Y. Lu, Samuel Kim, Marin Soljačić

Our method for discovering interpretable latent parameters in spatiotemporal systems will allow us to better analyze and understand real-world phenomena and datasets, which often have unknown and uncontrolled variables that alter the system dynamics and cause varying behaviors that are difficult to disentangle.

Gated Orthogonal Recurrent Units: On Learning to Forget

1 code implementation8 Jun 2017 Li Jing, Caglar Gulcehre, John Peurifoy, Yichen Shen, Max Tegmark, Marin Soljačić, Yoshua Bengio

We present a novel recurrent neural network (RNN) based model that combines the remembering ability of unitary RNNs with the ability of gated RNNs to effectively forget redundant/irrelevant information in its memory.

Ranked #7 on Question Answering on bAbI (Accuracy (trained on 1k) metric)

Denoising · Question Answering

Tunable Efficient Unitary Neural Networks (EUNN) and their application to RNNs

4 code implementations ICML 2017 Li Jing, Yichen Shen, Tena Dubček, John Peurifoy, Scott Skirlo, Yann Lecun, Max Tegmark, Marin Soljačić

Using unitary (instead of general) matrices in artificial neural networks (ANNs) is a promising way to solve the gradient explosion/vanishing problem, as well as to enable ANNs to learn long-term correlations in the data.

Permuted-MNIST
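
The gradient-stability argument can be seen in a few lines: repeated application of an orthogonal matrix (the real analogue of a unitary one) preserves the hidden-state norm exactly, while a generic matrix blows it up or collapses it. A sketch with hypothetical dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps = 64, 100
A = rng.standard_normal((n, n))
Q, _ = np.linalg.qr(A)                 # orthogonal: Q.T @ Q = I

h_q = h_g = np.ones(n) / np.sqrt(n)    # unit-norm hidden state
for _ in range(steps):
    h_q, h_g = Q @ h_q, A @ h_g        # 100 recurrent steps with each matrix

print(round(np.linalg.norm(h_q), 6))   # 1.0 — orthogonal updates preserve norm
print(np.linalg.norm(h_g) > 1e10)      # True — a generic matrix explodes it
```

Since backpropagated gradients are multiplied by the recurrent matrix at every step, the same norm preservation is what prevents gradient explosion/vanishing over long sequences.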

Binary matrices of optimal autocorrelations as alignment marks

no code implementations29 Aug 2014 Scott A. Skirlo, Ling Lu, Marin Soljačić

We define a new class of binary matrices by maximizing the peak-sidelobe distances in the aperiodic autocorrelations.

Position
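
The objective can be stated concretely: the aperiodic 2-D autocorrelation of a ±1 matrix peaks at zero shift with value equal to the number of entries, and a good alignment mark pushes every sidelobe far below that peak. A sketch on a small hypothetical matrix (not one of the paper's optimal designs):

```python
import numpy as np
from scipy.signal import correlate2d

M = np.array([[ 1,  1,  1, -1],
              [ 1, -1,  1,  1],
              [ 1,  1, -1,  1],
              [-1,  1,  1,  1]])          # a small ±1 matrix

ac = correlate2d(M, M, mode="full")       # aperiodic 2-D autocorrelation, 7x7
peak = ac.max()                           # zero-shift value = number of entries = 16
sidelobe = np.sort(ac.ravel())[-2]        # largest off-peak value
print(peak, peak - sidelobe)              # the peak-sidelobe distance being maximised
```

A large peak-sidelobe distance means the mark's position can be located unambiguously by correlation, which is what makes such matrices useful as alignment marks.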
