Modeling the effect of sequence variation on function is a fundamental problem for understanding and designing proteins.
Unsupervised protein language models trained across millions of diverse sequences learn the structure and function of proteins.
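Such models are typically trained with a masked-language-modeling objective: residues are randomly hidden and the model must recover them from sequence context. A minimal sketch of that corruption step is below; the mask rate, mask token, and function name are illustrative assumptions, not the method of any specific paper.

```python
import random

def mask_sequence(seq, mask_rate=0.15, mask_token="<mask>", seed=0):
    """Randomly mask positions in a protein sequence.

    Returns the corrupted token list and the (position, original residue)
    targets the model is trained to predict -- the masked-language-modeling
    objective in its simplest form.
    """
    rng = random.Random(seed)
    tokens, targets = [], []
    for i, aa in enumerate(seq):
        if rng.random() < mask_rate:
            tokens.append(mask_token)   # hide this residue from the model
            targets.append((i, aa))     # remember what must be recovered
        else:
            tokens.append(aa)
    return tokens, targets

tokens, targets = mask_sequence("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
```

In practice the corrupted sequence is fed to a large transformer and the loss is taken only over the masked positions.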
Unsupervised contact prediction is central to uncovering physical, structural, and functional constraints for protein structure determination and design.
In the field of artificial intelligence, a combination of scale in data and model capacity enabled by unsupervised learning has led to major advances in representation learning and statistical generation.
We propose an energy-based model (EBM) of protein conformations that operates at atomic scale.
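The core idea of an energy-based model is to assign each conformation a scalar energy, with lower-energy states receiving higher (unnormalized) probability under a Boltzmann distribution. The sketch below uses a hand-written toy energy over atomic coordinates purely to illustrate that scoring scheme; a learned atomic-scale EBM would replace `pairwise_energy` with a neural network, and all names and constants here are illustrative assumptions.

```python
import math

def pairwise_energy(coords, d0=3.8, k=1.0):
    """Toy energy over 3D atomic coordinates: a harmonic penalty on
    consecutive-atom distances deviating from an ideal spacing d0
    (roughly the C-alpha to C-alpha distance in angstroms)."""
    e = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(coords, coords[1:]):
        d = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2)
        e += k * (d - d0) ** 2
    return e

def boltzmann_weight(energy, temperature=1.0):
    """Unnormalized Boltzmann weight exp(-E/T): lower-energy
    conformations get exponentially more probability mass."""
    return math.exp(-energy / temperature)

ideal = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (7.6, 0.0, 0.0)]
stretched = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
```

Here the ideally spaced chain has zero energy and so a higher Boltzmann weight than the stretched one; training an EBM amounts to shaping the energy surface so that native-like conformations sit in such low-energy regions.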
31 May 2018 • Omer Gottesman, Fredrik Johansson, Joshua Meier, Jack Dent, Dong-hun Lee, Srivatsan Srinivasan, Linying Zhang, Yi Ding, David Wihl, Xuefeng Peng, Jiayu Yao, Isaac Lage, Christopher Mosch, Li-wei H. Lehman, Matthieu Komorowski, Aldo Faisal, Leo Anthony Celi, David Sontag, Finale Doshi-Velez
Much recent attention has been devoted to developing machine learning algorithms aimed at improving treatment policies in healthcare.