no code implementations • 29 Jan 2023 • Dimitrios Christofidellis, Giorgio Giannone, Jannis Born, Ole Winther, Teodoro Laino, Matteo Manica
Here, we propose a multi-domain, multi-task language model to solve a wide range of tasks in both the chemical and natural language domains.
no code implementations • 20 Jan 2023 • Girmaw Abebe Tadesse, Jannis Born, Celia Cintas, William Ogallo, Dmitry Zubarev, Matteo Manica, Komminist Weldemariam
To this end, we propose a framework for Multi-level Performance Evaluation of Generative mOdels (MPEGO), which could be employed across different domains.
1 code implementation • 8 Jul 2022 • Matteo Manica, Jannis Born, Joris Cadow, Dimitrios Christofidellis, Ashish Dave, Dean Clarke, Yves Gaetan Nana Teukam, Giorgio Giannone, Samuel C. Hoffman, Matthew Buchan, Vijil Chenthamarakshan, Timothy Donovan, Hsiang Han Hsu, Federico Zipoli, Oliver Schilter, Akihiro Kishimoto, Lisa Hamada, Inkit Padhi, Karl Wehden, Lauren McHugh, Alexy Khrabrov, Payel Das, Seiji Takeda, John R. Smith
With the growing availability of data within various scientific domains, generative models hold enormous potential to accelerate scientific discovery.
1 code implementation • 1 Feb 2022 • Jannis Born, Matteo Manica
To that end, we propose the Regression Transformer (RT), a novel method that abstracts regression as a conditional sequence modeling problem.
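The core idea of casting regression as sequence modeling can be illustrated with a small sketch. The tokenizer below is hypothetical (not the paper's actual implementation): it splits a non-negative float into per-digit tokens tagged with their decimal place, so that numerical regression targets become ordinary vocabulary items a language model can condition on and generate.

```python
def tokenize_number(value: float, precision: int = 3) -> list[str]:
    """Hypothetical sketch: turn a non-negative float into per-digit
    tokens of the form _<digit>_<decimal place>, so a sequence model
    can treat regression targets as ordinary tokens."""
    text = f"{value:.{precision}f}"
    int_part, frac_part = text.split(".")
    tokens = []
    # integer digits: leftmost digit carries the highest decimal place
    for i, d in enumerate(int_part):
        tokens.append(f"_{d}_{len(int_part) - 1 - i}")
    # fractional digits: decimal places -1, -2, ...
    for i, d in enumerate(frac_part):
        tokens.append(f"_{d}_{-(i + 1)}")
    return tokens

print(tokenize_number(3.14, precision=2))  # ['_3_0', '_1_-1', '_4_-2']
```

Detokenization is the inverse: sum each digit times ten to its decimal place, which makes the numeric value recoverable from generated tokens.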
no code implementations • 21 Apr 2021 • Anna Weber, Jannis Born, María Rodríguez Martínez
Scarcity of data and a large sequence space make this task challenging, and to date only models limited to a small set of epitopes have achieved good performance.
no code implementations • 1 Jan 2021 • Nil Adell Mill, Jannis Born, Nathaniel Park, James Hedrick, María Rodríguez Martínez, Matteo Manica
We explore a spectrum of models: from representations learned solely from isolated node features (focusing on Variational Autoencoders), to representations learned solely from topology (using node2vec), through hybrid models that integrate both node features and topological information.
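The hybrid end of that spectrum can be sketched in a few lines. This is an illustrative simplification, not the paper's architecture: a per-node feature embedding (e.g. from a VAE encoder) is fused with a topology embedding (e.g. from node2vec) by simple concatenation.

```python
import numpy as np

def hybrid_embedding(feature_emb: np.ndarray, topo_emb: np.ndarray) -> np.ndarray:
    """Illustrative fusion of two per-node embeddings: one derived
    from node features, one from graph topology. Concatenation is
    the simplest hybrid; learned fusion layers are also possible."""
    return np.concatenate([feature_emb, topo_emb], axis=-1)

features = np.ones((5, 16))   # 5 nodes, 16-dim feature embeddings
topology = np.zeros((5, 8))   # 5 nodes, 8-dim topology embeddings
print(hybrid_embedding(features, topology).shape)  # (5, 24)
```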
2 code implementations • 13 Sep 2020 • Jannis Born, Nina Wiedemann, Gabriel Brändle, Charlotte Buhre, Bastian Rieck, Karsten Borgwardt
Controlling the COVID-19 pandemic largely hinges upon the existence of fast, safe, and highly available diagnostic tools.
1 code implementation • 27 May 2020 • Jannis Born, Matteo Manica, Joris Cadow, Greta Markert, Nil Adell Mill, Modestas Filipavicius, María Rodríguez Martínez
With the rapid development of COVID-19 into a global pandemic, scientists around the globe are desperately searching for effective antiviral therapeutic agents.
5 code implementations • 25 Apr 2020 • Jannis Born, Gabriel Brändle, Manuel Cossio, Marion Disdier, Julie Goulet, Jérémie Roulin, Nina Wiedemann
For detecting COVID-19 in particular, the model performs with a sensitivity of 0.96, a specificity of 0.79, and an F1-score of 0.92 in 5-fold cross-validation.
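The three metrics reported above follow the standard definitions from a binary confusion matrix; the counts in the usage line below are illustrative, not the paper's.

```python
def binary_metrics(tp: int, fp: int, tn: int, fn: int):
    """Standard binary classification metrics from confusion-matrix
    counts: true/false positives (tp, fp), true/false negatives (tn, fn)."""
    sensitivity = tp / (tp + fn)   # recall on the positive class
    specificity = tn / (tn + fp)   # recall on the negative class
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return sensitivity, specificity, f1

# Illustrative counts only
sens, spec, f1 = binary_metrics(tp=96, fp=21, tn=79, fn=4)
print(round(sens, 2), round(spec, 2), round(f1, 2))
```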
no code implementations • NeurIPS 2020 • Vijil Chenthamarakshan, Payel Das, Samuel C. Hoffman, Hendrik Strobelt, Inkit Padhi, Kar Wai Lim, Benjamin Hoover, Matteo Manica, Jannis Born, Teodoro Laino, Aleksandra Mojsilovic
CogMol also includes in silico screening for assessing toxicity of parent molecules and their metabolites with a multi-task toxicity classifier, synthetic feasibility with a chemical retrosynthesis predictor, and target structure binding with docking simulations.
no code implementations • 29 Aug 2019 • Jannis Born, Matteo Manica, Ali Oskooei, Joris Cadow, Karsten Borgwardt, María Rodríguez Martínez
The generative process is optimized through PaccMann, a previously developed drug sensitivity prediction model, to obtain effective anticancer compounds for the given context (i.e., transcriptomic profile).
1 code implementation • 25 Apr 2019 • Matteo Manica, Ali Oskooei, Jannis Born, Vigneshwari Subramanian, Julio Sáez-Rodríguez, María Rodríguez Martínez
In line with recent advances in neural drug design and sensitivity prediction, we propose a novel architecture for interpretable prediction of anticancer compound sensitivity using a multimodal attention-based convolutional encoder.
1 code implementation • 16 Nov 2018 • Ali Oskooei, Jannis Born, Matteo Manica, Vigneshwari Subramanian, Julio Sáez-Rodríguez, María Rodríguez Martínez
Our models ingest a drug-cell pair, consisting of the SMILES encoding of a compound and the gene expression profile of a cancer cell, and predict an IC50 sensitivity value.
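The two-modality setup can be sketched minimally. This is a hypothetical stand-in, not the paper's model: each input modality (a fingerprint derived from the compound's SMILES string and a gene expression vector) gets its own linear projection, and the fused hidden state maps to a single scalar IC50 prediction. The weights here are random, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two input modalities
smiles_fingerprint = rng.integers(0, 2, size=512).astype(float)  # compound
gene_expression = rng.normal(size=128)                           # cell line

def predict_ic50(fp, expr, w_fp, w_expr, w_out):
    """Minimal two-branch regressor sketch: project each modality
    separately, sum the nonlinear hidden states, and map the fused
    representation to one scalar sensitivity value."""
    hidden = np.tanh(fp @ w_fp) + np.tanh(expr @ w_expr)
    return float(hidden @ w_out)

w_fp = rng.normal(scale=0.01, size=(512, 32))
w_expr = rng.normal(scale=0.01, size=(128, 32))
w_out = rng.normal(scale=0.01, size=32)
print(predict_ic50(smiles_fingerprint, gene_expression, w_fp, w_expr, w_out))
```

In the actual work, convolutional encoders with attention replace these linear projections, which is what makes the predictions interpretable at the level of individual genes and SMILES tokens.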