Search Results for author: Ali Madani

Found 11 papers, 5 papers with code

Towards Joint Sequence-Structure Generation of Nucleic Acid and Protein Complexes with SE(3)-Discrete Diffusion

1 code implementation • 21 Dec 2023 • Alex Morehead, Jeffrey Ruffolo, Aadyot Bhatnagar, Ali Madani

In this work, we introduce MMDiff, a generative model that jointly designs sequences and structures of nucleic acid and protein complexes, independently or in complex, using joint SE(3)-discrete diffusion noise.

ProGen2: Exploring the Boundaries of Protein Language Models

4 code implementations • 27 Jun 2022 • Erik Nijkamp, Jeffrey Ruffolo, Eli N. Weinstein, Nikhil Naik, Ali Madani

Attention-based models trained on protein sequences have demonstrated incredible success at classification and generation tasks relevant for artificial intelligence-driven protein design.

Protein Design

Don’t throw away that linear head: Few-shot protein fitness prediction with generative models

no code implementations • 29 Sep 2021 • Ben Krause, Nikhil Naik, Wenhao Liu, Ali Madani

Predicting the fitness, i.e. functional value, of a protein sequence is an important and challenging task in biology, particularly due to the scarcity of assay-labeled data.

Transfer Learning

Deep Extrapolation for Attribute-Enhanced Generation

1 code implementation • NeurIPS 2021 • Alvin Chan, Ali Madani, Ben Krause, Nikhil Naik

Attribute extrapolation in sample generation is challenging for deep neural networks operating beyond the training distribution.

Attribute

Profile Prediction: An Alignment-Based Pre-Training Task for Protein Sequence Models

no code implementations • 1 Dec 2020 • Pascal Sturmfels, Jesse Vig, Ali Madani, Nazneen Fatema Rajani

Recent deep-learning approaches to protein prediction have shown that pre-training on unlabeled data can yield useful representations for downstream tasks.

Language Modelling • Masked Language Modeling • +1

BERTology Meets Biology: Interpreting Attention in Protein Language Models

2 code implementations • ICLR 2021 • Jesse Vig, Ali Madani, Lav R. Varshney, Caiming Xiong, Richard Socher, Nazneen Fatema Rajani

Transformer architectures have proven to learn useful representations for protein classification and generation tasks.

ProGen: Language Modeling for Protein Generation

2 code implementations • 8 Mar 2020 • Ali Madani, Bryan McCann, Nikhil Naik, Nitish Shirish Keskar, Namrata Anand, Raphael R. Eguchi, Po-Ssu Huang, Richard Socher

Generative modeling for protein engineering is key to solving fundamental problems in synthetic biology, medicine, and material science.

Language Modelling

ProDyn0: Inferring calponin homology domain stretching behavior using graph neural networks

no code implementations • 22 Oct 2019 • Ali Madani, Cyna Shirazinejad, Jia Rui Ong, Hengameh Shams, Mohammad Mofrad

Graph neural networks are a quickly emerging field for non-Euclidean data that leverage the inherent graphical structure to predict node, edge, and global-level properties of a system.

Bimodal network architectures for automatic generation of image annotation from text

no code implementations • 5 Sep 2018 • Mehdi Moradi, Ali Madani, Yaniv Gur, Yufan Guo, Tanveer Syeda-Mahmood

In medical imaging, the source of big data is typically large image collections together with the clinical reports recorded for those images.

Fast and accurate classification of echocardiograms using deep learning

no code implementations • 27 Jun 2017 • Ali Madani, Ramy Arnaout, Mohammad Mofrad, Rima Arnaout

The essential first step toward comprehensive computer assisted echocardiographic interpretation is determining whether computers can learn to recognize standard views.

Classification • General Classification • +1
