Search Results for author: Micha Livne

Found 8 papers, 3 papers with code

nach0: Multimodal Natural and Chemical Languages Foundation Model

1 code implementation • 21 Nov 2023 • Micha Livne, Zulfat Miftahutdinov, Elena Tutubalina, Maksim Kuznetsov, Daniil Polykovskiy, Annika Brundyn, Aastha Jhunjhunwala, Anthony Costa, Alex Aliper, Alán Aspuru-Guzik, Alex Zhavoronkov

Large Language Models (LLMs) have substantially driven scientific progress in various domains, and many papers have demonstrated their ability to tackle complex problems with creative solutions.

Tasks: Decoder, named-entity-recognition, +2

Improving Small Molecule Generation using Mutual Information Machine

no code implementations • 18 Aug 2022 • Danny Reidenbach, Micha Livne, Rajesh K. Ilango, Michelle Gill, Johnny Israeli

We achieve state-of-the-art results in several constrained single-property optimization tasks as well as in the challenging task of multi-objective optimization, improving on the previous state-of-the-art success rate by more than 5%.

Tasks: Attribute, Decoder, +1

SentenceMIM: A Latent Variable Language Model

1 code implementation • 18 Feb 2020 • Micha Livne, Kevin Swersky, David J. Fleet

MIM learning encourages high mutual information between observations and latent variables, and is robust against posterior collapse.

 Ranked #1 on Question Answering on YahooCQA (using extra training data)

Tasks: Language Modelling, Question Answering, +1

MIM: Mutual Information Machine

1 code implementation • 8 Oct 2019 • Micha Livne, Kevin Swersky, David J. Fleet

Experiments show that MIM learns representations with high mutual information, consistent encoding and decoding distributions, effective latent clustering, and data log likelihood comparable to VAE, while avoiding posterior collapse.

Tasks: Clustering, Decoder

High Mutual Information in Representation Learning with Symmetric Variational Inference

no code implementations • 4 Oct 2019 • Micha Livne, Kevin Swersky, David J. Fleet

We introduce the Mutual Information Machine (MIM), a novel formulation of representation learning, using a joint distribution over the observations and latent state in an encoder/decoder framework.

Tasks: Decoder, Representation Learning, +2
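The two MIM entries above describe the same formulation; as a rough sketch of the objective, with notation assumed from the MIM papers rather than from this listing: MIM draws samples from a mixture of the encoding and decoding directions and fits a symmetric joint model to them,

\[ \mathcal{M}_S(x,z) = \tfrac{1}{2}\big(\mathcal{P}(x)\,q_\theta(z \mid x) + \mathcal{P}(z)\,p_\theta(x \mid z)\big), \]
\[ \mathcal{L}_{\mathrm{MIM}}(\theta) = \tfrac{1}{2}\big(\mathrm{CE}(\mathcal{M}_S,\, q_\theta(x,z)) + \mathrm{CE}(\mathcal{M}_S,\, p_\theta(x,z))\big) \;\ge\; H_{\mathcal{M}_S}(x,z). \]

Since \( H(x,z) = H(x) + H(z) - I(x;z) \), minimizing this upper bound on the joint entropy pushes the mutual information \( I(x;z) \) up, which is consistent with the high-mutual-information and no-posterior-collapse claims quoted in the abstracts above.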

TzK: Flow-Based Conditional Generative Model

no code implementations • 5 Feb 2019 • Micha Livne, David Fleet

We formulate a new class of conditional generative models based on probability flows.

Tasks: Attribute
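As context for the flow-based formulation (this is the generic conditional change-of-variables identity, not the specific TzK factorization, which appears only in the paper): an invertible map \( f_\theta \) with base density \( p_Z \) gives an exact conditional likelihood,

\[ \log p_\theta(x \mid c) = \log p_Z\big(f_\theta(x) \mid c\big) + \log\left|\det \frac{\partial f_\theta(x)}{\partial x}\right|, \]

where \( c \) is a conditioning variable (e.g., an attribute; the symbol is ours, not the paper's). Exact likelihoods of this form are what let flows avoid an information bottleneck, as noted in the TzK Flow entry below.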

Walking on Thin Air: Environment-Free Physics-based Markerless Motion Capture

no code implementations • 4 Dec 2018 • Micha Livne, Leonid Sigal, Marcus A. Brubaker, David J. Fleet

To our knowledge, this is the first approach to take physics into account without explicit a priori knowledge of the environment or body dimensions.

Tasks: Markerless Motion Capture

TzK Flow - Conditional Generative Model

no code implementations • 5 Nov 2018 • Micha Livne, David J. Fleet

Unlike autoencoders, the bottleneck does not limit model expressiveness, similar to flow-based maximum likelihood (ML) models.
