Search Results for author: Joshua Meier

Found 7 papers, 4 papers with code

Language models enable zero-shot prediction of the effects of mutations on protein function

1 code implementation · NeurIPS 2021 · Joshua Meier, Roshan Rao, Robert Verkuil, Jason Liu, Tom Sercu, Alex Rives

Modeling the effect of sequence variation on function is a fundamental problem for understanding and designing proteins.
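The paper's title describes zero-shot prediction of mutation effects with a protein language model. A minimal sketch of the general log-odds idea, with a hypothetical toy distribution standing in for a real masked language model (the toy function and its numbers are illustrative assumptions, not the paper's model):

```python
import math

def toy_masked_probs(sequence, pos):
    """Toy stand-in for a masked-LM forward pass: returns a distribution
    over the 20 amino acids at `pos`, slightly favoring the wild type."""
    alphabet = "ACDEFGHIKLMNPQRSTVWY"
    probs = {aa: 1.0 for aa in alphabet}
    probs[sequence[pos]] += 1.0          # bias toward the observed residue
    total = sum(probs.values())
    return {aa: p / total for aa, p in probs.items()}

def mutation_score(sequence, pos, mut_aa):
    """Log-odds of the mutant vs. wild-type residue at a masked position:
    log p(mut) - log p(wt). Negative scores suggest a disfavored mutation."""
    probs = toy_masked_probs(sequence, pos)
    return math.log(probs[mut_aa]) - math.log(probs[sequence[pos]])

score = mutation_score("MKTAYIAK", 2, "A")  # score the T2A substitution
```

With a real protein language model in place of `toy_masked_probs`, the same log-odds scoring requires no supervision from experimental mutation data, which is what makes the prediction zero-shot.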

MSA Transformer

1 code implementation · 13 Feb 2021 · Roshan Rao, Jason Liu, Robert Verkuil, Joshua Meier, John F. Canny, Pieter Abbeel, Tom Sercu, Alexander Rives

Unsupervised protein language models trained across millions of diverse sequences learn structure and function of proteins.

Language Modeling · Masked Language Modeling · +2

Neural Potts Model

no code implementations · 1 Jan 2021 · Tom Sercu, Robert Verkuil, Joshua Meier, Brandon Amos, Zeming Lin, Caroline Chen, Jason Liu, Yann LeCun, Alexander Rives

We propose the Neural Potts Model objective as an amortized optimization problem.
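The amortized objective targets the standard Potts model energy over protein sequences: per-position fields plus pairwise couplings. A sketch of that energy function (the sizes and random parameters here are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
L, Q = 5, 20                           # sequence length, amino-acid alphabet size
h = rng.normal(size=(L, Q))            # per-position fields h_i
J = rng.normal(size=(L, L, Q, Q))      # pairwise couplings J_ij
J = (J + J.transpose(1, 0, 3, 2)) / 2  # enforce J_ij(a, b) = J_ji(b, a)

def potts_energy(seq):
    """E(x) = -sum_i h_i(x_i) - sum_{i<j} J_ij(x_i, x_j);
    lower energy means a more favorable sequence under the model."""
    e = -sum(h[i, a] for i, a in enumerate(seq))
    for i in range(L):
        for j in range(i + 1, L):
            e -= J[i, j, seq[i], seq[j]]
    return e

energy = potts_energy([0, 3, 7, 12, 19])  # sequence encoded as alphabet indices
```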


Transformer protein language models are unsupervised structure learners

no code implementations · ICLR 2021 · Roshan Rao, Joshua Meier, Tom Sercu, Sergey Ovchinnikov, Alexander Rives

Unsupervised contact prediction is central to uncovering physical, structural, and functional constraints for protein structure determination and design.
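A common recipe for turning a model's symmetric coupling or attention map into contact predictions applies the average product correction (APC) and ranks non-local residue pairs. A sketch with a random matrix standing in for a real model's map (the matrix and sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
L = 6
A = rng.random((L, L))
A = (A + A.T) / 2                      # symmetrize the raw map

row = A.sum(axis=1, keepdims=True)     # row sums F(i, .)
col = A.sum(axis=0, keepdims=True)     # column sums F(., j)
apc = A - row @ col / A.sum()          # APC: subtract the background product term

# Rank residue pairs with sequence separation |i - j| >= 2 by the
# corrected score; the top-ranked pairs are the predicted contacts.
pairs = [(i, j) for i in range(L) for j in range(i + 2, L)]
pairs.sort(key=lambda p: apc[p], reverse=True)
```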

Language Modeling · Language Modelling

Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences

1 code implementation · Proceedings of the National Academy of Sciences 2020 · Alexander Rives, Joshua Meier, Tom Sercu, Siddharth Goyal, Zeming Lin, Demi Guo, Myle Ott, C. Lawrence Zitnick, Jerry Ma, Rob Fergus

In the field of artificial intelligence, a combination of scale in data and model capacity enabled by unsupervised learning has led to major advances in representation learning and statistical generation.

Diversity · Language Modeling · +2
