1 code implementation • 20 Apr 2022 • Milena Pavlović, Ghadi S. Al Hajj, Johan Pensar, Mollie Wood, Ludvig M. Sollid, Victor Greiff, Geir Kjetil Sandve
Machine learning is increasingly used to discover diagnostic and prognostic biomarkers from high-dimensional molecular data.
no code implementations • 29 Jan 2022 • Asif Khan, Alexander I. Cowen-Rivers, Derrick-Goh-Xin Deik, Antoine Grosnit, Kamil Dreczkowski, Philippe A. Robert, Victor Greiff, Rasul Tutunov, Dany Bou-Ammar, Jun Wang, Haitham Bou-Ammar
Designing optimal antigen-specific CDRH3 regions is therefore a priority for developing therapeutic antibodies against harmful pathogens.
no code implementations • 20 Oct 2020 • Kerui Peng, Yana Safonova, Mikhail Shugay, Alice Popejoy, Oscar Rodriguez, Felix Breden, Petter Brodin, Amanda M. Burkhardt, Carlos Bustamante, Van-Mai Cao-Lormeau, Martin M. Corcoran, Darragh Duffy, Macarena Fuentes Guajardo, Ricardo Fujita, Victor Greiff, Vanessa D. Jonsson, Xiao Liu, Lluis Quintana-Murci, Maura Rossetti, Jianming Xie, Gur Yaari, Wei Zhang, Malak S. Abedalthagafi, Khalid O. Adekoya, Rahaman A. Ahmed, Wei-Chiao Chang, Clive Gray, Yusuke Nakamura, William D. Lees, Purvesh Khatri, Houda Alachkar, Cathrine Scheepers, Corey T. Watson, Gunilla B. Karlsson Hedestam, Serghei Mangul
With the advent of high-throughput sequencing technologies, the fields of immunogenomics and adaptive immune receptor repertoire research are facing both opportunities and challenges.
2 code implementations • ICLR 2021 • Hubert Ramsauer, Bernhard Schäfl, Johannes Lehner, Philipp Seidl, Michael Widrich, Thomas Adler, Lukas Gruber, Markus Holzleitner, Milena Pavlović, Geir Kjetil Sandve, Victor Greiff, David Kreil, Michael Kopp, Günter Klambauer, Johannes Brandstetter, Sepp Hochreiter
The new update rule is equivalent to the attention mechanism used in transformers.
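The stated equivalence can be made concrete: the modern Hopfield update, written with stored patterns as rows of X, takes the form softmax(β X ξ) applied back to X, which matches single-query transformer attention softmax(QKᵀ/√d)V when K = V = X and β = 1/√d. The NumPy sketch below is purely illustrative (random patterns, arbitrary shapes), not the authors' code:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

d, n = 4, 6                      # pattern dimension, number of stored patterns (illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(n, d))      # stored patterns as rows, playing the role of keys/values
xi = rng.normal(size=(d,))       # current state pattern, playing the role of a query
beta = 1.0 / np.sqrt(d)

# One modern-Hopfield update step: retrieve a pattern from the stored set.
xi_new = softmax(beta * (X @ xi)) @ X

# The same computation written as single-query attention with K = V = X.
attn = softmax((xi @ X.T) / np.sqrt(d)) @ X
assert np.allclose(xi_new, attn)
```

With a sufficiently large β and well-separated patterns, a single such update already moves the state to the stored pattern most similar to the query, which is what lets the layer act as an associative memory inside a deep network.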
1 code implementation • NeurIPS 2020 • Michael Widrich, Bernhard Schäfl, Hubert Ramsauer, Milena Pavlović, Lukas Gruber, Markus Holzleitner, Johannes Brandstetter, Geir Kjetil Sandve, Victor Greiff, Sepp Hochreiter, Günter Klambauer
We show that the attention mechanism of transformer architectures is actually the update rule of modern Hopfield networks that can store exponentially many patterns.
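Read as a pooling operation, this connection indicates how a large, variable-sized bag of instances (for example, the receptor sequences of one immune repertoire) can be compressed into a fixed-size vector: a query vector attends over all instance embeddings in one Hopfield-style retrieval step. The sketch below only illustrates that idea under assumed names and shapes (hopfield_pool, random embeddings); it is not the published implementation accompanying the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def hopfield_pool(instances, query, beta):
    """Pool a bag of instance embeddings into one fixed-size vector
    via a single Hopfield/attention retrieval step (illustrative sketch)."""
    scores = beta * (instances @ query)   # similarity of each instance to the query
    weights = softmax(scores)             # attention weights over the bag
    return weights @ instances            # weighted sum: one vector per bag

d = 8
rng = np.random.default_rng(1)
repertoire = rng.normal(size=(1000, d))   # embeddings of 1000 receptor sequences (hypothetical)
query = rng.normal(size=(d,))             # a learned query / state vector (hypothetical)

pooled = hopfield_pool(repertoire, query, beta=1.0 / np.sqrt(d))
print(pooled.shape)                       # (8,) -- could be fed to a classifier head
```

Because the softmax can concentrate on a handful of instances out of thousands, such a pooling step can pick out the few immune receptors relevant to a repertoire-level label, which is the multiple-instance setting the paper targets.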