no code implementations • 6 Jan 2025 • Andrew Ma, Owen Dugan, Marin Soljačić
In solid-state materials science, substantial efforts have been devoted to the calculation and modeling of the electronic band gap.
no code implementations • 4 Jun 2024 • Owen Dugan, Donato Manuel Jimenez Beneto, Charlotte Loh, Zhuo Chen, Rumen Dangovski, Marin Soljačić
Despite significant advancements in text generation and reasoning, Large Language Models (LLMs) still face challenges in accurately performing complex arithmetic operations.
1 code implementation • 31 May 2024 • Zhuo Chen, Rumen Dangovski, Charlotte Loh, Owen Dugan, Di Luo, Marin Soljačić
We propose Quantum-informed Tensor Adaptation (QuanTA), a novel, easy-to-implement fine-tuning method with no inference overhead for large-scale pre-trained language models.
24 code implementations • 30 Apr 2024 • Ziming Liu, YiXuan Wang, Sachin Vaidya, Fabian Ruehle, James Halverson, Marin Soljačić, Thomas Y. Hou, Max Tegmark
Inspired by the Kolmogorov-Arnold representation theorem, we propose Kolmogorov-Arnold Networks (KANs) as promising alternatives to Multi-Layer Perceptrons (MLPs).
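The core idea can be sketched in a few lines: instead of fixed activations on nodes with learned linear weights, a KAN layer places a learnable one-dimensional function on every edge and simply sums the edge outputs. Below is a minimal forward pass using a Gaussian radial-basis expansion for the edge functions (the paper uses B-splines; the RBF basis, and names like `kan_layer`, are illustrative assumptions, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_basis(x, centers, width=0.5):
    """Evaluate Gaussian bumps at x; returns shape (..., n_basis)."""
    return np.exp(-((x[..., None] - centers) ** 2) / (2 * width ** 2))

def kan_layer(x, coef, centers):
    """x: (batch, d_in); coef: (d_in, d_out, n_basis).
    Each edge (i, j) applies its own learned 1-D function to x_i;
    node j just sums its incoming edges -- no separate weight matrix."""
    phi = rbf_basis(x, centers)             # (batch, d_in, n_basis)
    return np.einsum('bik,ijk->bj', phi, coef)

d_in, d_out, n_basis, batch = 3, 2, 8, 5
centers = np.linspace(-2, 2, n_basis)
coef = rng.normal(size=(d_in, d_out, n_basis))  # learnable in practice
x = rng.normal(size=(batch, d_in))
y = kan_layer(x, coef, centers)
print(y.shape)   # (5, 2)
```

In a trained KAN the coefficients `coef` are optimized by gradient descent, and stacking such layers gives the full network.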
1 code implementation • 16 Apr 2024 • Zhuo Chen, Jacob McCarran, Esteban Vizcaino, Marin Soljačić, Di Luo
Partial differential equations (PDEs) are instrumental for modeling dynamical systems in science and engineering.
no code implementations • 30 Nov 2023 • Viggo Moro, Charlotte Loh, Rumen Dangovski, Ali Ghorashi, Andrew Ma, Zhuo Chen, Samuel Kim, Peter Y. Lu, Thomas Christensen, Marin Soljačić
Artificial intelligence is transforming computational materials science, improving the prediction of material properties, and accelerating the discovery of novel materials.
1 code implementation • NeurIPS 2023 • Zhuo Chen, Laker Newhouse, Eddie Chen, Di Luo, Marin Soljačić
Quantum many-body physics simulation has important impacts on understanding fundamental science and has applications to quantum materials design and quantum technology.
no code implementations • 23 Feb 2023 • Owen Dugan, Peter Y. Lu, Rumen Dangovski, Di Luo, Marin Soljačić
Studying the dynamics of open quantum systems can enable breakthroughs both in fundamental physics and applications to quantum engineering and quantum computation.
1 code implementation • NeurIPS 2023 • Di Luo, Jiayu Shen, Rumen Dangovski, Marin Soljačić
Quantum optimization, a key application of quantum computing, has traditionally been stymied by the complexity of gradient calculations, which grows linearly with the number of parameters.
no code implementations • 11 Oct 2022 • Isaac Liao, Rumen R. Dangovski, Jakob N. Foerster, Marin Soljačić
This paper introduces a novel machine learning optimizer called LODO, which meta-learns the best preconditioner online during optimization.
1 code implementation • 31 Aug 2022 • Peter Y. Lu, Rumen Dangovski, Marin Soljačić
We test this new approach on a variety of physical systems and demonstrate that our method is able to both identify the number of conserved quantities and extract their values.
1 code implementation • 1 Jul 2022 • Michael Zhang, Samuel Kim, Peter Y. Lu, Marin Soljačić
Symbolic regression is a machine learning technique that can learn the governing formulas of data and thus has the potential to transform scientific discovery.
1 code implementation • NAACL 2022 • Yung-Sung Chuang, Rumen Dangovski, Hongyin Luo, Yang Zhang, Shiyu Chang, Marin Soljačić, Shang-Wen Li, Wen-tau Yih, Yoon Kim, James Glass
We propose DiffCSE, an unsupervised contrastive learning framework for learning sentence embeddings.
Ranked #13 on Semantic Textual Similarity on STS16
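DiffCSE builds on the standard contrastive objective for sentence embeddings, where two encoded views of the same sentence are pulled together and other sentences in the batch are pushed apart. A minimal sketch of that generic InfoNCE loss is below, with the encoder mocked by random vectors; DiffCSE's additional difference-prediction objective is omitted, and the temperature value is an assumption:

```python
import numpy as np

def info_nce(z1, z2, temp=0.05):
    """z1, z2: (n, d) embeddings of two views of the same n sentences."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temp                  # (n, n) cosine similarities
    # cross-entropy with the matching pair (the diagonal) as the positive
    log_z = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(log_z - np.diag(sim)))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
loss_aligned = info_nce(z, z + 0.01 * rng.normal(size=(8, 16)))
loss_random = info_nce(z, rng.normal(size=(8, 16)))
print(loss_aligned < loss_random)   # True: matched views give a lower loss
```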
1 code implementation • 10 Feb 2022 • Andrew Ma, Yang Zhang, Thomas Christensen, Hoi Chun Po, Li Jing, Liang Fu, Marin Soljačić
Topological materials present unconventional electronic properties that make them attractive for both basic science and next-generation technological applications.
1 code implementation • 28 Jan 2022 • Gaurav Arya, William F. Li, Charles Roques-Carmes, Marin Soljačić, Steven G. Johnson, Zin Lin
We present a framework for the end-to-end optimization of metasurface imaging systems that reconstruct targets using compressed sensing, a technique for solving underdetermined imaging problems when the target object exhibits sparsity (i.e., the object can be described by a small number of non-zero values, but the positions of these values are unknown).
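The compressed-sensing reconstruction step can be sketched with the generic ISTA (iterative soft-thresholding) solver for the underdetermined sparse-recovery problem; the end-to-end metasurface optimization in the paper is omitted, and the problem sizes and regularization strength are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def ista(A, y, lam=0.05, steps=500):
    """Solve min_x 0.5*||Ax - y||^2 + lam*||x||_1 by proximal gradient."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = x - A.T @ (A @ x - y) / L       # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return x

n, m, k = 100, 40, 3                        # underdetermined: m < n measurements
A = rng.normal(size=(m, n)) / np.sqrt(m)    # random sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = 1.0  # k-sparse target
x_hat = ista(A, A @ x_true)                 # recover from y = A @ x_true
err = np.linalg.norm(x_hat - x_true)
```

Despite having far fewer measurements than unknowns, the sparsity prior lets ISTA recover the target up to the small bias introduced by the L1 penalty.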
no code implementations • 22 Dec 2021 • Ileana Rugina, Rumen Dangovski, Mark Veillette, Pooya Khorrami, Brian Cheung, Olga Simek, Marin Soljačić
In recent years, emerging fields such as meta-learning or self-supervised learning have been closing the gap between proof-of-concept results and real-life applications of machine learning by extending deep learning to the semi-supervised and few-shot domains.
2 code implementations • 28 Oct 2021 • Rumen Dangovski, Li Jing, Charlotte Loh, Seungwook Han, Akash Srivastava, Brian Cheung, Pulkit Agrawal, Marin Soljačić
State-of-the-art self-supervised learning (SSL) pre-training produces semantically good representations by encouraging them to be invariant under meaningful transformations prescribed by human knowledge.
1 code implementation • 22 Jul 2021 • Peter Y. Lu, Joan Ariño, Marin Soljačić
Identifying the governing equations of a nonlinear dynamical system is key to both understanding the physical features of the system and constructing an accurate model of the dynamics that generalizes well beyond the available data.
2 code implementations • 23 Apr 2021 • Samuel Kim, Peter Y. Lu, Charlotte Loh, Jamie Smith, Jasper Snoek, Marin Soljačić
Bayesian optimization (BO) is a popular paradigm for global optimization of expensive black-box functions, but there are many domains where the function is not entirely a black box.
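For reference, a single step of standard BO looks like the sketch below: fit a Gaussian-process posterior to a few evaluations of a 1-D function, then choose the next query point by maximizing expected improvement. The RBF kernel, length scale, and toy objective are assumptions, not the paper's setup (which concerns injecting structural knowledge into this loop):

```python
import math
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = 1.0 - np.einsum('ij,ji->i', Ks.T @ Kinv, Ks)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    """EI for minimization: E[max(best - f, 0)] under the GP posterior."""
    s = np.sqrt(var)
    z = (best - mu) / s
    Phi = 0.5 * (1 + np.array([math.erf(v / math.sqrt(2)) for v in z]))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return s * (z * Phi + phi)

f = lambda x: np.sin(3 * x) + x ** 2        # stand-in "expensive" objective
X = np.array([-1.0, 0.0, 1.0]); y = f(X)    # evaluations so far
Xs = np.linspace(-2, 2, 200)                # candidate grid
mu, var = gp_posterior(X, y, Xs)
ei = expected_improvement(mu, var, y.min())
x_next = Xs[np.argmax(ei)]                  # next point to evaluate
```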
no code implementations • 4 Jan 2021 • Jamison Sloan, Nicholas Rivera, John D. Joannopoulos, Marin Soljačić
Despite interest in their potential applications as sources of quantum light, DVEs are generally very weak, providing many opportunities for enhancement through modern techniques in nanophotonics, such as the use of media that support excitations like plasmon and phonon polaritons.
2 code implementations • 20 Nov 2020 • Ileana Rugina, Rumen Dangovski, Li Jing, Preslav Nakov, Marin Soljačić
Attention mechanisms play a crucial role in the neural revolution of Natural Language Processing (NLP).
1 code implementation • 18 Jul 2020 • Guillem Ramírez, Rumen Dangovski, Preslav Nakov, Marin Soljačić
We believe that our rethinking of the Wasserstein-Procrustes problem could enable further research, thus helping to develop better algorithms for aligning word embeddings across languages.
no code implementations • 17 Jul 2020 • Evan Vogelbaum, Rumen Dangovski, Li Jing, Marin Soljačić
We propose the implementation of contextualizers, which are generalizable prototypes that adapt to given examples and play a larger role in classification for gradient-based models.
4 code implementations • 16 Jul 2020 • Owen Dugan, Rumen Dangovski, Allan Costa, Samuel Kim, Pawan Goyal, Joseph Jacobson, Marin Soljačić
Neural networks' expressiveness comes at the cost of complex, black-box models that often extrapolate poorly beyond the domain of the training dataset, conflicting with the goal of finding compact analytic expressions to describe scientific data.
1 code implementation • 10 Dec 2019 • Samuel Kim, Peter Y. Lu, Srijon Mukherjee, Michael Gilbert, Li Jing, Vladimir Čeperić, Marin Soljačić
We find that the EQL-based architecture can extrapolate quite well outside of the training data set compared to a standard neural network-based architecture, paving the way for deep learning to be applied in scientific exploration and discovery.
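The key structural move in EQL-style architectures can be sketched as follows: a linear map feeds a bank of symbolic primitives (identity, sine, cosine, and a pairwise product), so that a trained, sparsified network can be read off as a closed-form expression. Weights here are random for illustration; the particular primitive bank is an assumption:

```python
import numpy as np

rng = np.random.default_rng(2)

def eql_layer(x, W, b):
    """One EQL-style layer: linear pre-activations routed through
    symbolic primitives instead of a single generic nonlinearity."""
    h = x @ W + b                           # (batch, 6) pre-activations
    u1, u2, u3, u4, u5, u6 = h.T
    # primitive bank: identity, sin, cos, identity, and one product unit
    return np.stack([u1, np.sin(u2), np.cos(u3), u4, u5 * u6], axis=1)

x = rng.normal(size=(4, 2))
W = rng.normal(size=(2, 6))
b = rng.normal(size=6)
out = eql_layer(x, W, b)
print(out.shape)   # (4, 5)
```

Because every unit computes an elementary function, sparsity regularization during training prunes the network down to a compact analytic formula, which is what enables the extrapolation behavior described above.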
1 code implementation • 13 Jul 2019 • Peter Y. Lu, Samuel Kim, Marin Soljačić
Our method for discovering interpretable latent parameters in spatiotemporal systems will allow us to better analyze and understand real-world phenomena and datasets, which often have unknown and uncontrolled variables that alter the system dynamics and cause varying behaviors that are difficult to disentangle.
1 code implementation • 8 Jun 2017 • Li Jing, Caglar Gulcehre, John Peurifoy, Yichen Shen, Max Tegmark, Marin Soljačić, Yoshua Bengio
We present a novel recurrent neural network (RNN) based model that combines the remembering ability of unitary RNNs with the ability of gated RNNs to effectively forget redundant/irrelevant information in its memory.
Ranked #7 on Question Answering on bAbi (Accuracy (trained on 1k) metric)
4 code implementations • ICML 2017 • Li Jing, Yichen Shen, Tena Dubček, John Peurifoy, Scott Skirlo, Yann LeCun, Max Tegmark, Marin Soljačić
Using unitary (instead of general) matrices in artificial neural networks (ANNs) is a promising way to solve the gradient explosion/vanishing problem, as well as to enable ANNs to learn long-term correlations in the data.
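The reason unitarity addresses gradient explosion/vanishing is easy to demonstrate numerically: iterating a unitary (here, real orthogonal) matrix preserves the hidden-state norm exactly, while even a slightly expansive general matrix blows it up. The sketch below omits the nonlinearity and the paper's efficient unitary parametrization:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 32

Q, _ = np.linalg.qr(rng.normal(size=(n, n)))  # random orthogonal matrix
G = 1.05 * Q                                  # slightly expansive general map

h0 = rng.normal(size=n)
h_u, h_g = h0.copy(), h0.copy()
for _ in range(100):                          # 100 recurrent steps
    h_u = Q @ h_u                             # unitary recurrence
    h_g = G @ h_g                             # generic recurrence

ratio_u = np.linalg.norm(h_u) / np.linalg.norm(h0)  # stays at 1.0
ratio_g = np.linalg.norm(h_g) / np.linalg.norm(h0)  # ~1.05**100, about 131
```

The same norm preservation applies to backpropagated gradients, which is why unitary RNNs can learn long-term correlations.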
no code implementations • 29 Aug 2014 • Scott A. Skirlo, Ling Lu, Marin Soljačić
We define a new class of binary matrices by maximizing the peak-sidelobe distances in the aperiodic autocorrelations.
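The figure of merit here, in its classical 1-D form, is the aperiodic autocorrelation of a ±1 sequence: one wants the off-center values ("sidelobes") small relative to the central peak. The length-13 Barker code below is the textbook example, with peak-to-sidelobe ratio 13:1; the paper generalizes this kind of objective to binary matrices:

```python
import numpy as np

barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

# full-mode correlation of the sequence with itself = aperiodic autocorrelation
acf = np.correlate(barker13, barker13, mode='full')
center = len(barker13) - 1                  # index of the zero-lag peak
peak = acf[center]
sidelobe = np.max(np.abs(np.delete(acf, center)))
print(peak, sidelobe)   # 13 1
```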