no code implementations • 20 Mar 2023 • Adriano Hernandez, Rumen Dangovski, Peter Y. Lu, Marin Soljacic
Model stitching (Lenc & Vedaldi, 2015) is a compelling methodology for comparing different neural network representations, because it allows us to measure the degree to which they may be interchanged.
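A minimal sketch of the model-stitching idea referenced above (an assumed setup, not this paper's exact protocol): two frozen pretrained networks are joined by a trainable 1x1-convolution stitching layer, and only that layer is optimized; the downstream accuracy of the stitched model indicates how interchangeable the two representations are. The names front_a, back_b, c_a, and c_b are hypothetical.

```python
# Minimal model-stitching sketch (assumed setup): freeze both networks and
# train only a 1x1-conv stitching layer mapping A's features into B's input.
import torch
import torch.nn as nn

class StitchedModel(nn.Module):
    def __init__(self, front_a: nn.Module, back_b: nn.Module, c_a: int, c_b: int):
        super().__init__()
        self.front_a = front_a                      # layers of network A up to the cut
        self.back_b = back_b                        # layers of network B after the cut
        self.stitch = nn.Conv2d(c_a, c_b, kernel_size=1)  # the only trainable part
        for p in self.front_a.parameters():
            p.requires_grad = False
        for p in self.back_b.parameters():
            p.requires_grad = False

    def forward(self, x):
        with torch.no_grad():
            h = self.front_a(x)                     # frozen representation from A
        return self.back_b(self.stitch(h))          # B finishes the computation

# Usage sketch: optimize only the stitching layer on the downstream task.
# model = StitchedModel(front_a, back_b, c_a=256, c_b=512)
# opt = torch.optim.Adam(model.stitch.parameters(), lr=1e-3)
```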
no code implementations • 4 Mar 2023 • Charlotte Loh, Seungwook Han, Shivchander Sudalairaj, Rumen Dangovski, Kai Xu, Florian Wenzel, Marin Soljacic, Akash Srivastava
In this work, we present Multi-Symmetry Ensembles (MSE), a framework for constructing diverse ensembles that capture the multiplicity of hypotheses along symmetry axes, exploring the hypothesis space beyond stochastic perturbations of model weights and hyperparameters.
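To illustrate the ensembling step only (not the authors' exact MSE procedure), a sketch of combining member models that were trained under different symmetry assumptions, e.g., an invariant and an equivariant encoder, by averaging their predicted class probabilities:

```python
# Minimal sketch (not the exact MSE method): average the softmax outputs of
# hypothetical, already-trained members that encode different symmetry priors.
import torch

def ensemble_predict(members, x):
    """Average predicted class probabilities over ensemble members."""
    probs = [torch.softmax(m(x), dim=-1) for m in members]
    return torch.stack(probs).mean(dim=0)
```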
no code implementations • 10 Oct 2022 • Charlotte Loh, Rumen Dangovski, Shivchander Sudalairaj, Seungwook Han, Ligong Han, Leonid Karlinsky, Marin Soljacic, Akash Srivastava
State-of-the-art (SOTA) semi-supervised learning (SSL) methods have been highly successful in leveraging a mix of labeled and unlabeled data by combining techniques of consistency regularization and pseudo-labeling.
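For readers unfamiliar with the two ingredients named above, here is a minimal FixMatch-style sketch of pseudo-labeling combined with consistency regularization (a generic illustration, not this paper's method); weak_aug and strong_aug are hypothetical augmentation callables:

```python
# Generic pseudo-labeling + consistency sketch: weakly augmented views yield
# pseudo-labels that supervise strongly augmented views when confidence is high.
import torch
import torch.nn.functional as F

def unlabeled_loss(model, x_unlabeled, weak_aug, strong_aug, threshold=0.95):
    with torch.no_grad():
        probs = torch.softmax(model(weak_aug(x_unlabeled)), dim=-1)
        conf, pseudo = probs.max(dim=-1)            # confidence and hard pseudo-label
        mask = (conf >= threshold).float()          # keep only confident samples
    logits_strong = model(strong_aug(x_unlabeled))
    per_sample = F.cross_entropy(logits_strong, pseudo, reduction="none")
    return (mask * per_sample).mean()               # consistency on confident samples
```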
1 code implementation • 2 Oct 2022 • Julia Balla, Sihao Huang, Owen Dugan, Rumen Dangovski, Marin Soljacic
In social science, formal and quantitative models, such as ones describing economic growth and collective action, are used to formulate mechanistic explanations, provide predictions, and uncover questions about observed phenomena.
1 code implementation • 15 Oct 2021 • Charlotte Loh, Thomas Christensen, Rumen Dangovski, Samuel Kim, Marin Soljacic
Deep learning techniques have been increasingly applied to the natural sciences, e.g., for property prediction and optimization or material discovery.
no code implementations • ICLR 2022 • Rumen Dangovski, Li Jing, Charlotte Loh, Seungwook Han, Akash Srivastava, Brian Cheung, Pulkit Agrawal, Marin Soljacic
In state-of-the-art self-supervised learning (SSL), pre-training produces semantically good representations by encouraging them to be invariant under meaningful transformations prescribed by human knowledge.
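The invariance objective mentioned above can be illustrated with a SimCLR-style NT-Xent loss (a standard contrastive loss, not this paper's full equivariant SSL method): embeddings of two augmented views of the same image are pulled together while all other pairs are pushed apart.

```python
# Minimal NT-Xent sketch: z1 and z2 are (N, d) embeddings of two augmented
# views of the same N images; each row's positive is its counterpart view.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.1):
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=-1)       # (2N, d)
    sim = z @ z.t() / temperature                             # pairwise similarities
    n = z1.shape[0]
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))                # drop self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)                      # positives as class labels
```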
no code implementations • NeurIPS Workshop AI4Scien 2021 • Oreoluwa Alao, Peter Y Lu, Marin Soljacic
Reservoir computing architectures known as echo state networks (ESNs) have been shown to have exceptional predictive capabilities when trained on chaotic systems.
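A minimal, generic echo state network sketch (assumed textbook construction, not the specific architectures studied in the paper): a fixed random reservoir is driven by the input series, and only a linear readout is fit, here by ridge regression, to predict the next step of the chaotic signal.

```python
# Generic ESN sketch: random fixed reservoir, ridge-regression readout.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 300
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius below 1

def run_reservoir(u):                               # u: (T, n_in) input series
    states = np.zeros((len(u), n_res))
    x = np.zeros(n_res)
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in @ u_t)             # reservoir update (no leak term)
        states[t] = x
    return states

def fit_readout(states, targets, ridge=1e-6):       # linear readout via ridge regression
    A = states.T @ states + ridge * np.eye(n_res)
    return np.linalg.solve(A, states.T @ targets)   # W_out: (n_res, n_out)
```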
no code implementations • 28 Nov 2018 • Li Jing, Rumen Dangovski, Marin Soljacic
We present a logarithmic-scale efficient convolutional neural network architecture for edge devices, named WaveletNet.
no code implementations • 27 Aug 2018 • Yurui Qu, Li Jing, Yichen Shen, Min Qiu, Marin Soljacic
First, we demonstrate that in predicting the transmission from multilayer photonic films, the relative error rate is reduced by 46.8% (26.5%) when the source data come from 10-layer (8-layer) films and the target data come from 8-layer (10-layer) films.
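A minimal transfer-learning sketch for the setting above (an assumed workflow, not the paper's exact recipe): pretrain a spectrum regressor on the source films, then fine-tune the same weights on the smaller target dataset. The regressor architecture and the loader names are hypothetical.

```python
# Assumed transfer-learning workflow: pretrain on source films, fine-tune on target.
import torch
import torch.nn as nn

def make_regressor(n_thicknesses_in, n_wavelengths_out):
    return nn.Sequential(
        nn.Linear(n_thicknesses_in, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, n_wavelengths_out),          # predicted transmission spectrum
    )

def train(model, loader, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for thicknesses, spectrum in loader:        # hypothetical (x, y) batches
            opt.zero_grad()
            loss_fn(model(thicknesses), spectrum).backward()
            opt.step()

# model = make_regressor(n_thicknesses_in=10, n_wavelengths_out=200)  # pad shorter stacks
# train(model, source_loader, epochs=50, lr=1e-3)   # pretrain on source films
# train(model, target_loader, epochs=10, lr=1e-4)   # fine-tune on target films
```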
2 code implementations • ICLR 2018 • Rumen Dangovski, Li Jing, Marin Soljacic
We evaluate our model on synthetic memorization, question answering and language modeling tasks.
Ranked #5 on Question Answering on bAbI (Accuracy (trained on 1k) metric)
1 code implementation • 18 Oct 2017 • John Peurifoy, Yichen Shen, Li Jing, Yi Yang, Fidel Cano-Renteria, Brendan Delacy, Max Tegmark, John D. Joannopoulos, Marin Soljacic
We propose a method to use artificial neural networks to approximate light scattering by multilayer nanoparticles.
Computational Physics · Applied Physics · Optics
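For the nanoparticle scattering paper above, a minimal sketch of the surrogate-model idea (assumed layer sizes, not the authors' exact network): a fully connected network maps the shell thicknesses of a multilayer nanoparticle to its scattering spectrum sampled at discrete wavelengths, and is trained with mean-squared error against simulated spectra.

```python
# Assumed surrogate architecture: shell thicknesses in, scattering spectrum out.
import torch
import torch.nn as nn

scatter_net = nn.Sequential(
    nn.Linear(8, 250), nn.ReLU(),      # 8 shell thicknesses as input (assumed)
    nn.Linear(250, 250), nn.ReLU(),
    nn.Linear(250, 250), nn.ReLU(),
    nn.Linear(250, 200),               # spectrum at 200 wavelengths (assumed)
)

# Once trained against simulated spectra, the network serves as a fast surrogate
# for the electromagnetic solver:
# spectrum = scatter_net(torch.rand(1, 8))
```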