Search Results for author: Daniel Berenberg

Found 5 papers, 2 papers with code

OpenProteinSet: Training data for structural biology at scale

1 code implementation NeurIPS 2023 Gustaf Ahdritz, Nazim Bouatta, Sachin Kadyan, Lukas Jarosch, Daniel Berenberg, Ian Fisk, Andrew M. Watkins, Stephen Ra, Richard Bonneau, Mohammed AlQuraishi

Multiple sequence alignments (MSAs) of proteins encode rich biological information and have been workhorses in bioinformatic methods for tasks like protein design and protein structure prediction for decades.

Protein Design, Protein Structure Prediction

Protein Discovery with Discrete Walk-Jump Sampling

1 code implementation 8 Jun 2023 Nathan C. Frey, Daniel Berenberg, Karina Zadorozhny, Joseph Kleinhenz, Julien Lafrance-Vanasse, Isidro Hotzel, Yan Wu, Stephen Ra, Richard Bonneau, Kyunghyun Cho, Andreas Loukas, Vladimir Gligorijevic, Saeed Saremi

We resolve difficulties in training and sampling from a discrete generative model by learning a smoothed energy function, sampling from the smoothed data manifold with Langevin Markov chain Monte Carlo (MCMC), and projecting back to the true data manifold with one-step denoising.

Denoising
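
The sampling procedure summarized in this entry lends itself to a short sketch. The following is a minimal, hypothetical illustration of the walk-jump idea (Langevin MCMC in the smoothed, noisy space, then a single denoising step back to the data manifold); the `denoiser`, `sigma`, and `step_size` names are assumptions for illustration, not the paper's released implementation.

```python
import torch

def walk_jump_sample(denoiser, y_init, sigma, n_steps=200, step_size=1e-3):
    """Sketch of walk-jump style sampling.

    'Walk': Langevin MCMC in the smoothed space, using a score estimated
    from a trained denoiser via Tweedie's formula:
        grad log p_sigma(y) ~= (denoiser(y) - y) / sigma**2
    'Jump': one-step denoising to project back to the data manifold.
    """
    y = y_init.clone()
    for _ in range(n_steps):
        with torch.no_grad():
            score = (denoiser(y) - y) / sigma ** 2          # estimated smoothed score
        y = y + step_size * score \
              + (2.0 * step_size) ** 0.5 * torch.randn_like(y)  # Langevin "walk" step
    with torch.no_grad():
        return denoiser(y)                                   # one-step denoising "jump"
```

Given any callable `denoiser` trained to map noisy samples back to clean ones at noise level `sigma`, the loop above performs the walk in the smoothed space and the final call performs the jump.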

Neural language representations predict outcomes of scientific research

no code implementations 17 May 2018 James P. Bagrow, Daniel Berenberg, Joshua Bongard

Many research fields codify their findings in standard formats, often by reporting correlations between quantities of interest.

Inferring the size of the causal universe: features and fusion of causal attribution networks

no code implementations 14 Dec 2018 Daniel Berenberg, James P. Bagrow

Further, the total size of the collective causal attribution network held by humans is currently unknown, making it challenging to assess the progress of these surveys.

Multi-segment preserving sampling for deep manifold sampler

no code implementations 9 May 2022 Daniel Berenberg, Jae Hyeon Lee, Simon Kelow, Ji Won Park, Andrew Watkins, Vladimir Gligorijević, Richard Bonneau, Stephen Ra, Kyunghyun Cho

We introduce an alternative approach to this guided sampling procedure, multi-segment preserving sampling, that enables the direct inclusion of domain-specific knowledge by designating preserved and non-preserved segments along the input sequence, thereby restricting variation to only select regions.

Language Modelling
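
To make the preserved / non-preserved split described in this entry concrete, here is a minimal hypothetical sketch of segment-constrained resampling. The model interface (a callable returning per-position token logits) and the round-based loop are assumptions for illustration only and do not reflect the deep manifold sampler's actual API.

```python
import torch

def multi_segment_preserving_sample(model, tokens, preserved_mask, n_rounds=1):
    """Sketch of segment-constrained sampling: positions flagged True in
    preserved_mask are never altered; the model only proposes replacement
    tokens for the remaining (non-preserved) segments of the sequence.
    """
    seq = tokens.clone()
    for _ in range(n_rounds):
        with torch.no_grad():
            logits = model(seq)                           # assumed shape: (length, vocab)
        proposal = torch.distributions.Categorical(logits=logits).sample()
        seq = torch.where(preserved_mask, seq, proposal)  # keep preserved segments fixed
    return seq
```

The boolean mask is what restricts variation to selected regions: preserved residues are copied through unchanged, while non-preserved positions take the model's sampled proposals.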
