Search Results for author: Jannis Born

Found 13 papers, 7 papers with code

Unifying Molecular and Textual Representations via Multi-task Language Modelling

no code implementations · 29 Jan 2023 · Dimitrios Christofidellis, Giorgio Giannone, Jannis Born, Ole Winther, Teodoro Laino, Matteo Manica

Here, we propose a multi-domain, multi-task language model to solve a wide range of tasks in both the chemical and natural language domains.

Tasks: Language Modelling, Multi-Task Learning

Domain-agnostic and Multi-level Evaluation of Generative Models

no code implementations · 20 Jan 2023 · Girmaw Abebe Tadesse, Jannis Born, Celia Cintas, William Ogallo, Dmitry Zubarev, Matteo Manica, Komminist Weldemariam

To this end, we propose a framework for Multi-level Performance Evaluation of Generative mOdels (MPEGO), which could be employed across different domains.

Regression Transformer: Concurrent sequence regression and generation for molecular language modeling

1 code implementation · 1 Feb 2022 · Jannis Born, Matteo Manica

To that end, we propose the Regression Transformer (RT), a novel method that abstracts regression as a conditional sequence modeling problem.

Tasks: Conditional Text Generation, Inductive Bias, +2
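The abstract's framing of regression as a conditional sequence modeling problem can be illustrated with a toy digit tokenizer: a numerical property is split into digit-and-position tokens so a language model can predict (or condition on) it like ordinary text. This is a sketch of the general idea only; the token format and helper below are illustrative, not the RT's actual vocabulary or code.

```python
# Toy sketch: turn a (property, value) pair into a token sequence so that
# regression becomes next-token prediction. Works for non-negative values.

def tokenize_property(name: str, value: float, decimals: int = 2) -> list[str]:
    """E.g. ("qed", 0.85) -> ["<qed>", "_0_0", "_8_-1", "_5_-2"]."""
    digits = f"{value:.{decimals}f}".replace(".", "")
    int_len = len(str(int(value)))          # digits before the decimal point
    tokens = [f"<{name}>"]
    for i, d in enumerate(digits):
        position = int_len - 1 - i          # 0 = units, -1 = tenths, ...
        tokens.append(f"_{d}_{position}")
    return tokens

print(tokenize_property("qed", 0.85))
```

Encoding the decimal position into each token keeps the sequence length fixed for a given precision, which is one way to make numbers digestible for a standard language-model vocabulary.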

TITAN: T Cell Receptor Specificity Prediction with Bimodal Attention Networks

no code implementations · 21 Apr 2021 · Anna Weber, Jannis Born, María Rodríguez Martínez

Scarcity of data and a large sequence space make this task challenging, and to date only models limited to a small set of epitopes have achieved good performance.

Tasks: Data Augmentation, Specificity, +1

On the Importance of Looking at the Manifold

no code implementations · 1 Jan 2021 · Nil Adell Mill, Jannis Born, Nathaniel Park, James Hedrick, María Rodríguez Martínez, Matteo Manica

We explore a spectrum of models: from those that learn representations solely from isolated node features (focusing on Variational Autoencoders), to those that learn solely from topology (using node2vec), through hybrid models that integrate both node features and topological information.

Tasks: Representation Learning

PaccMann$^{RL}$ on SARS-CoV-2: Designing antiviral candidates with conditional generative models

1 code implementation · 27 May 2020 · Jannis Born, Matteo Manica, Joris Cadow, Greta Markert, Nil Adell Mill, Modestas Filipavicius, María Rodríguez Martínez

With the fast development of COVID-19 into a global pandemic, scientists around the globe are desperately searching for effective antiviral therapeutic agents.

Tasks: Drug Discovery

POCOVID-Net: Automatic Detection of COVID-19 From a New Lung Ultrasound Imaging Dataset (POCUS)

5 code implementations · 25 Apr 2020 · Jannis Born, Gabriel Brändle, Manuel Cossio, Marion Disdier, Julie Goulet, Jérémie Roulin, Nina Wiedemann

For detecting COVID-19 in particular, the model achieves a sensitivity of 0.96, a specificity of 0.79, and an F1-score of 0.92 in 5-fold cross-validation.
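The reported metrics follow the standard binary-classification definitions and can be computed from confusion-matrix counts. The counts below are illustrative, not the paper's, so the resulting F1 differs from the 0.92 reported above.

```python
# Standard binary-classification metrics from confusion-matrix counts.
# tp/fp/tn/fn values here are made up for illustration.

def binary_metrics(tp: int, fp: int, tn: int, fn: int) -> dict[str, float]:
    sensitivity = tp / (tp + fn)            # recall on the positive class
    specificity = tn / (tn + fp)            # recall on the negative class
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"sensitivity": round(sensitivity, 2),
            "specificity": round(specificity, 2),
            "f1": round(f1, 2)}

print(binary_metrics(tp=96, fp=21, tn=79, fn=4))
```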


CogMol: Target-Specific and Selective Drug Design for COVID-19 Using Deep Generative Models

no code implementations · NeurIPS 2020 · Vijil Chenthamarakshan, Payel Das, Samuel C. Hoffman, Hendrik Strobelt, Inkit Padhi, Kar Wai Lim, Benjamin Hoover, Matteo Manica, Jannis Born, Teodoro Laino, Aleksandra Mojsilovic

CogMol also includes in silico screening for assessing the toxicity of parent molecules and their metabolites with a multi-task toxicity classifier, synthetic feasibility with a chemical retrosynthesis predictor, and target structure binding with docking simulations.


PaccMann$^{RL}$: Designing anticancer drugs from transcriptomic data via reinforcement learning

no code implementations · 29 Aug 2019 · Jannis Born, Matteo Manica, Ali Oskooei, Joris Cadow, Karsten Borgwardt, María Rodríguez Martínez

The generative process is optimized through PaccMann, a previously developed drug sensitivity prediction model, to obtain effective anticancer compounds for the given context (i.e., transcriptomic profile).

Tasks: Reinforcement Learning (RL)
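The loop described above (a generator proposes candidate molecules, and a frozen sensitivity predictor scores them to steer generation) can be caricatured as follows. Both stand-in functions are hypothetical toys, and the actual policy-gradient update of the paper is simplified here to plain hill climbing on a scalar.

```python
import random

# Toy caricature of reward-guided generation (not the authors' code):
# the "generator" emits a scalar stand-in for a molecule, and the
# "predictor" rewards values near a target. Accepted improvements
# shift the generator's bias, mimicking reward-driven fine-tuning.
random.seed(0)

def generator(bias: float) -> float:
    """Stand-in generator: samples a 'molecule feature' near its bias."""
    return bias + random.gauss(0, 0.5)

def sensitivity_reward(feature: float) -> float:
    """Stand-in predictor: higher reward near the target value 2.0."""
    return -abs(feature - 2.0)

bias = 0.0
for _ in range(200):                      # hill climbing in place of RL
    candidate = generator(bias)
    if sensitivity_reward(candidate) > sensitivity_reward(bias):
        bias = candidate                  # keep strict improvements

print(round(bias, 2))
```

The key structural point this preserves is that the predictor is never updated: it acts purely as a reward signal conditioning the generative model.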

Towards Explainable Anticancer Compound Sensitivity Prediction via Multimodal Attention-based Convolutional Encoders

1 code implementation · 25 Apr 2019 · Matteo Manica, Ali Oskooei, Jannis Born, Vigneshwari Subramanian, Julio Sáez-Rodríguez, María Rodríguez Martínez

In line with recent advances in neural drug design and sensitivity prediction, we propose a novel architecture for interpretable prediction of anticancer compound sensitivity using a multimodal attention-based convolutional encoder.

PaccMann: Prediction of anticancer compound sensitivity with multi-modal attention-based neural networks

1 code implementation · 16 Nov 2018 · Ali Oskooei, Jannis Born, Matteo Manica, Vigneshwari Subramanian, Julio Sáez-Rodríguez, María Rodríguez Martínez

Our models ingest a drug-cell pair, consisting of the SMILES encoding of a compound and the gene expression profile of a cancer cell, and predict an IC50 sensitivity value.
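A minimal sketch of such a multimodal input pipeline, assuming a toy character vocabulary, random weights in place of trained networks, and average pooling in place of attention. This is not the PaccMann architecture, only the shape of its input: a SMILES string and an expression vector fused into one prediction.

```python
import numpy as np

# Toy multimodal regressor (not PaccMann): embed SMILES characters,
# average-pool them, concatenate with a gene-expression vector, and
# apply a random linear head in place of the trained network.
rng = np.random.default_rng(0)

VOCAB = {ch: i for i, ch in enumerate("CNO()=c1#[]+-@Hlrs2345")}
EMBED = rng.normal(size=(len(VOCAB), 8))    # toy token embeddings
W = rng.normal(size=8 + 4)                  # toy linear head (4 genes)

def predict_ic50(smiles: str, expression: np.ndarray) -> float:
    tokens = [VOCAB[ch] for ch in smiles if ch in VOCAB]
    drug_vec = EMBED[tokens].mean(axis=0)   # average-pooled SMILES
    pair = np.concatenate([drug_vec, expression])
    return float(pair @ W)

ic50 = predict_ic50("CCO", np.array([0.1, -0.3, 1.2, 0.0]))
print(round(ic50, 3))
```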
