Search Results for author: Rumen Dangovski

Found 21 papers, 10 papers with code

Model Stitching: Looking For Functional Similarity Between Representations

no code implementations • 20 Mar 2023 Adriano Hernandez, Rumen Dangovski, Peter Y. Lu, Marin Soljacic

Model stitching (Lenc & Vedaldi 2015) is a compelling methodology for comparing different neural network representations, because it allows us to measure the degree to which they may be interchanged.
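The idea can be illustrated with a minimal numpy sketch (not the paper's setup, which stitches layers of trained deep networks): two toy "encoders" whose representations differ by an unknown linear map are compared by fitting a linear stitching layer; a low stitching error suggests the representations are functionally interchangeable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and two "networks" whose hidden representations differ by an
# unknown invertible linear map (so they are functionally similar).
X = rng.normal(size=(200, 8))
W_a = rng.normal(size=(8, 6))          # encoder of network A
M = rng.normal(size=(6, 6))            # hidden change of basis
H_a = X @ W_a                          # representations from A
H_b = H_a @ M                          # representations from B

# Stitching layer: a linear map fitted to translate A's representation
# into B's. Low stitching error = high functional similarity.
S, *_ = np.linalg.lstsq(H_a, H_b, rcond=None)
stitch_err = np.mean((H_a @ S - H_b) ** 2)

# A mismatched control: stitching A onto unrelated random features.
H_rand = rng.normal(size=H_b.shape)
S2, *_ = np.linalg.lstsq(H_a, H_rand, rcond=None)
control_err = np.mean((H_a @ S2 - H_rand) ** 2)

print(stitch_err < 1e-8, stitch_err < control_err)   # -> True True
```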

Multi-Symmetry Ensembles: Improving Diversity and Generalization via Opposing Symmetries

1 code implementation • 4 Mar 2023 Charlotte Loh, Seungwook Han, Shivchander Sudalairaj, Rumen Dangovski, Kai Xu, Florian Wenzel, Marin Soljacic, Akash Srivastava

In this work, we present Multi-Symmetry Ensembles (MSE), a framework for constructing diverse ensembles by capturing the multiplicity of hypotheses along symmetry axes, which explore the hypothesis space beyond stochastic perturbations of model weights and hyperparameters.

Representation Learning
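A one-dimensional stand-in for the idea (purely illustrative; the paper builds deep ensembles of invariant and equivariant models): fit one hypothesis constrained to be symmetric under x → -x and one unconstrained hypothesis, then ensemble them. The two members disagree by construction, and the ensemble improves on the constrained member.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=100)
y = x**2 + 0.3 * x                       # target with even and odd parts

# Member 1: hypothesis constrained to be invariant under x -> -x
# (even basis only), i.e. one symmetry class.
A_even = np.stack([np.ones_like(x), x**2], axis=1)
c_even, *_ = np.linalg.lstsq(A_even, y, rcond=None)
p_even = A_even @ c_even

# Member 2: unconstrained hypothesis (full basis), the opposing class.
A_full = np.stack([np.ones_like(x), x, x**2], axis=1)
c_full, *_ = np.linalg.lstsq(A_full, y, rcond=None)
p_full = A_full @ c_full

ensemble = 0.5 * (p_even + p_full)
diversity = np.mean((p_even - p_full) ** 2)   # members disagree
err_even = np.mean((p_even - y) ** 2)
err_ens = np.mean((ensemble - y) ** 2)
print(diversity > 0, err_ens < err_even)   # -> True True
```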

Q-Flow: Generative Modeling for Differential Equations of Open Quantum Dynamics with Normalizing Flows

no code implementations • 23 Feb 2023 Owen Dugan, Peter Y. Lu, Rumen Dangovski, Di Luo, Marin Soljačić

Studying the dynamics of open quantum systems can enable breakthroughs both in fundamental physics and applications to quantum engineering and quantum computation.

QuACK: Accelerating Gradient-Based Quantum Optimization with Koopman Operator Learning

no code implementations • 2 Nov 2022 Di Luo, Jiayu Shen, Rumen Dangovski, Marin Soljačić

Quantum optimization, a key application of quantum computing, has traditionally been stymied by the cost of gradient calculations, which grows linearly with the number of parameters.

Operator learning · Quantum Machine Learning
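A toy classical analogue of the Koopman idea (not the paper's quantum setup): gradient descent on a quadratic loss is a linear dynamical system, so a least-squares operator fitted to a short parameter trajectory can extrapolate later iterates without any further gradient evaluations.

```python
import numpy as np

A = np.diag([1.0, 3.0])                 # quadratic loss landscape f = 0.5 θᵀAθ
eta = 0.1
theta = np.array([1.0, -1.0])

# Collect a short gradient-descent trajectory (the expensive part).
traj = [theta.copy()]
for _ in range(10):
    theta = theta - eta * (A @ theta)   # gradient step
    traj.append(theta.copy())
traj = np.array(traj)

# Fit a linear Koopman-style operator K with θ_{t+1} ≈ θ_t K.
X0, X1 = traj[:-1], traj[1:]
K, *_ = np.linalg.lstsq(X0, X1, rcond=None)

# Extrapolate 20 more steps with K alone -- no gradient evaluations.
pred = traj[-1].copy()
for _ in range(20):
    pred = pred @ K

# Ground truth: keep running actual gradient descent.
true = traj[-1].copy()
for _ in range(20):
    true = true - eta * (A @ true)

print(np.allclose(pred, true, atol=1e-6))   # -> True
```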

On the Importance of Calibration in Semi-supervised Learning

no code implementations • 10 Oct 2022 Charlotte Loh, Rumen Dangovski, Shivchander Sudalairaj, Seungwook Han, Ligong Han, Leonid Karlinsky, Marin Soljacic, Akash Srivastava

State-of-the-art (SOTA) semi-supervised learning (SSL) methods have been highly successful in leveraging a mix of labeled and unlabeled data by combining techniques of consistency regularization and pseudo-labeling.
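Why calibration matters for pseudo-labeling can be seen in a hand-built toy (all numbers below are contrived for illustration, not from the paper): an overconfident classifier pushes a wrong prediction past the confidence threshold, while softening the logits with a temperature T > 1 filters it out.

```python
import numpy as np

def confidences(logits, T=1.0):
    """Max softmax probability and predicted class per sample at temperature T."""
    z = logits / T
    p = np.exp(z - z.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    return p.max(axis=1), p.argmax(axis=1)

# Toy logits from an overconfident binary classifier, with true labels.
logits = np.array([[4.0, 0.0],    # correct and confident
                   [3.0, 0.0],    # wrong but overconfident
                   [6.0, 0.0],    # correct and confident
                   [0.0, 2.5]])   # correct, moderate confidence
labels = np.array([0, 1, 0, 1])
tau = 0.95                         # pseudo-labeling confidence threshold

for T in (1.0, 2.0):
    conf, pred = confidences(logits, T)
    accepted = conf >= tau
    wrong = int(np.sum(accepted & (pred != labels)))
    print(f"T={T}: accepted={int(accepted.sum())}, wrong pseudo-labels={wrong}")
```

At T=1 the overconfident wrong sample is accepted as a pseudo-label; at T=2 no wrong pseudo-labels pass the threshold.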

AI-Assisted Discovery of Quantitative and Formal Models in Social Science

1 code implementation • 2 Oct 2022 Julia Balla, Sihao Huang, Owen Dugan, Rumen Dangovski, Marin Soljacic

In social science, formal and quantitative models, such as ones describing economic growth and collective action, are used to formulate mechanistic explanations, provide predictions, and uncover questions about observed phenomena.

Sociology · Symbolic Regression
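The paper's pipeline is far richer, but the core of symbolic regression can be sketched as a brute-force search over a tiny expression library, picking the expression that best explains the observed data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0.5, 2.0, size=50)

# Hidden "law" to rediscover (e.g., a simple quadratic growth relation).
y = x ** 2

# Brute-force symbolic regression over a tiny expression library.
library = {
    "x":       lambda x: x,
    "x^2":     lambda x: x ** 2,
    "exp(x)":  lambda x: np.exp(x),
    "log(x)":  lambda x: np.log(x),
    "sqrt(x)": lambda x: np.sqrt(x),
}
errors = {name: np.mean((f(x) - y) ** 2) for name, f in library.items()}
best = min(errors, key=errors.get)
print(best)   # -> x^2
```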

Discovering Conservation Laws using Optimal Transport and Manifold Learning

1 code implementation • 31 Aug 2022 Peter Y. Lu, Rumen Dangovski, Marin Soljačić

We test this new approach on a variety of physical systems and demonstrate that our method is able to both identify the number of conserved quantities and extract their values.
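The paper's method uses optimal transport and manifold learning over ensembles of trajectories; the sketch below only illustrates the underlying criterion, that a conserved quantity is constant, i.e. has (near-)zero variance, along a trajectory, using exact harmonic-oscillator solutions:

```python
import numpy as np

# Exact trajectories of a unit harmonic oscillator: a rotation in phase space.
t = np.linspace(0, 10, 200)
q0, p0 = 1.0, 0.5
q = q0 * np.cos(t) + p0 * np.sin(t)
p = -q0 * np.sin(t) + p0 * np.cos(t)

# Candidate scalar functions of the state; a conserved quantity has
# (near-)zero variance along the trajectory.
candidates = {
    "q":      q,
    "p":      p,
    "energy": 0.5 * (q ** 2 + p ** 2),
}
variances = {k: np.var(v) for k, v in candidates.items()}
conserved = [k for k, v in variances.items() if v < 1e-20]
print(conserved)   # -> ['energy']
```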

Meta-Learning and Self-Supervised Pretraining for Real World Image Translation

no code implementations • 22 Dec 2021 Ileana Rugina, Rumen Dangovski, Mark Veillette, Pooya Khorrami, Brian Cheung, Olga Simek, Marin Soljačić

In recent years, emerging fields such as meta-learning and self-supervised learning have been closing the gap between proof-of-concept results and real-life applications of machine learning by extending deep learning to the semi-supervised and few-shot domains.

Image-to-Image Translation · Meta-Learning +2

Equivariant Contrastive Learning

2 code implementations28 Oct 2021 Rumen Dangovski, Li Jing, Charlotte Loh, Seungwook Han, Akash Srivastava, Brian Cheung, Pulkit Agrawal, Marin Soljačić

In state-of-the-art self-supervised learning (SSL), pre-training produces semantically good representations by encouraging them to be invariant under meaningful transformations prescribed by human knowledge.

Contrastive Learning · Self-Supervised Learning
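E-SSL instead encourages equivariance to certain transformations, e.g. four-fold rotations. The numpy toy below (illustrative, not the paper's training procedure) contrasts an invariant feature (mean intensity), which cannot identify the applied rotation, with an equivariant one (raw pixels matched against rotated templates), which can:

```python
import numpy as np

rng = np.random.default_rng(4)
img = rng.normal(size=(8, 8))
rotations = [np.rot90(img, k) for k in range(4)]   # four-fold rotations

# An invariant feature (mean intensity) cannot tell rotations apart...
inv_feats = [r.mean() for r in rotations]

# ...while an equivariant feature (raw pixels here) can: recover k by
# matching against rotated templates, as in four-fold rotation prediction.
def predict_rotation(x, template):
    scores = [np.sum(x * np.rot90(template, k)) for k in range(4)]
    return int(np.argmax(scores))

preds = [predict_rotation(r, img) for r in rotations]
print(np.allclose(inv_feats, inv_feats[0]), preds)   # invariant feature collapses; preds recover k
```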

Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science

1 code implementation • 15 Oct 2021 Charlotte Loh, Thomas Christensen, Rumen Dangovski, Samuel Kim, Marin Soljacic

Deep learning techniques have been increasingly applied to the natural sciences, e.g., for property prediction and optimization or material discovery.

Contrastive Learning · Property Prediction

Equivariant Self-Supervised Learning: Encouraging Equivariance in Representations

no code implementations ICLR 2022 Rumen Dangovski, Li Jing, Charlotte Loh, Seungwook Han, Akash Srivastava, Brian Cheung, Pulkit Agrawal, Marin Soljacic

In state-of-the-art self-supervised learning (SSL), pre-training produces semantically good representations by encouraging them to be invariant under meaningful transformations prescribed by human knowledge.

Self-Supervised Learning

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

1 code implementation • 18 Jul 2020 Guillem Ramírez, Rumen Dangovski, Preslav Nakov, Marin Soljačić

We believe that our rethinking of the Wasserstein-Procrustes problem could enable further research, thus helping to develop better algorithms for aligning word embeddings across languages.

Word Embeddings
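In the full Wasserstein-Procrustes problem the word correspondence between languages is itself unknown and solved by optimal transport; assuming the correspondence is known, the Procrustes step has a closed form via SVD, sketched here on synthetic embeddings:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 4))            # "source language" embeddings

# Ground-truth orthogonal map between the two embedding spaces.
Q_true, _ = np.linalg.qr(rng.normal(size=(4, 4)))
Y = X @ Q_true                           # "target language" embeddings

# Procrustes step: with the word correspondence fixed, the best
# orthogonal map minimizing ||XQ - Y|| comes from the SVD of XᵀY.
U, _, Vt = np.linalg.svd(X.T @ Y)
Q_hat = U @ Vt

print(np.allclose(Q_hat, Q_true, atol=1e-6))   # -> True
```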

Contextualizing Enhances Gradient Based Meta Learning

no code implementations • 17 Jul 2020 Evan Vogelbaum, Rumen Dangovski, Li Jing, Marin Soljačić

We propose the implementation of contextualizers, which are generalizable prototypes that adapt to given examples and play a larger role in classification for gradient-based models.

Few-Shot Learning
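The baseline that contextualizers build on is prototype-based few-shot classification: a class prototype is the mean support embedding, and queries go to the nearest prototype. A minimal sketch with toy 2-D embeddings standing in for a learned encoder's outputs (the paper's contextualizers would further adapt the prototypes to the given examples):

```python
import numpy as np

rng = np.random.default_rng(6)

# Few-shot episode: 3 classes, 5 support examples each, in a 2-D
# embedding space (stand-in for a learned encoder's output).
centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
support = np.concatenate(
    [c + 0.1 * rng.normal(size=(5, 2)) for c in centers])
support_y = np.repeat(np.arange(3), 5)

# Prototype per class: the mean support embedding.
prototypes = np.stack([support[support_y == k].mean(axis=0) for k in range(3)])

# Classify queries by nearest prototype.
queries = centers + 0.1 * rng.normal(size=(3, 2))
dists = np.linalg.norm(queries[:, None, :] - prototypes[None, :, :], axis=-1)
preds = dists.argmin(axis=1)
print(preds)   # -> [0 1 2]
```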

Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications

no code implementations TACL 2019 Rumen Dangovski, Li Jing, Preslav Nakov, Mićo Tatalović, Marin Soljačić

Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks ranging from language modeling to text summarization.

Language Modelling · Text Summarization
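Stacking means feeding one recurrent layer's hidden-state sequence to the next layer as its input sequence. A minimal numpy sketch with a vanilla tanh cell standing in for LSTM/GRU cells:

```python
import numpy as np

rng = np.random.default_rng(7)

def rnn_layer(xs, W_x, W_h, b):
    """One recurrent layer: tanh cell, returns the hidden-state sequence."""
    h = np.zeros(W_h.shape[0])
    out = []
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h + b)
        out.append(h)
    return np.array(out)

# Stacking: layer i's hidden-state sequence is layer i+1's input sequence.
d_in, d_h, T, depth = 3, 4, 6, 2
xs = rng.normal(size=(T, d_in))
seq = xs
for i in range(depth):
    in_dim = d_in if i == 0 else d_h
    W_x = 0.5 * rng.normal(size=(d_h, in_dim))
    W_h = 0.5 * rng.normal(size=(d_h, d_h))
    b = np.zeros(d_h)
    seq = rnn_layer(seq, W_x, W_h, b)

print(seq.shape)   # -> (6, 4)
```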

WaveletNet: Logarithmic Scale Efficient Convolutional Neural Networks for Edge Devices

no code implementations • 28 Nov 2018 Li Jing, Rumen Dangovski, Marin Soljacic

We present a logarithmic-scale efficient convolutional neural network architecture for edge devices, named WaveletNet.

General Classification

Rotational Unit of Memory

2 code implementations ICLR 2018 Rumen Dangovski, Li Jing, Marin Soljacic

We evaluate our model on synthetic memorization, question answering and language modeling tasks.

Ranked #5 on Question Answering on bAbI (Accuracy (trained on 1k) metric)

Language Modelling · Machine Translation +4
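At the heart of RUM is a norm-preserving rotation operation on the hidden state. The sketch below builds, in the general spirit of the unit (details differ from the paper), the orthogonal operator that rotates one vector's direction onto another's while acting only in the plane they span:

```python
import numpy as np

def rotation_operator(a, b):
    """Orthogonal matrix rotating the direction of a onto that of b,
    acting only in the plane they span (identity elsewhere)."""
    u1 = a / np.linalg.norm(a)
    b_hat = b / np.linalg.norm(b)
    v = b_hat - (u1 @ b_hat) * u1        # Gram-Schmidt: in-plane normal
    u2 = v / np.linalg.norm(v)
    cos_t = u1 @ b_hat
    sin_t = u2 @ b_hat
    n = len(a)
    return (np.eye(n)
            + sin_t * (np.outer(u2, u1) - np.outer(u1, u2))
            + (cos_t - 1.0) * (np.outer(u1, u1) + np.outer(u2, u2)))

rng = np.random.default_rng(8)
a, b = rng.normal(size=4), rng.normal(size=4)
R = rotation_operator(a, b)
h = rng.normal(size=4)                   # a "hidden state"

# Orthogonal (so the hidden state's norm is preserved -- good for memory),
# and it maps the direction of a onto the direction of b.
print(np.allclose(R @ R.T, np.eye(4)),
      np.isclose(np.linalg.norm(R @ h), np.linalg.norm(h)))   # -> True True
```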
