no code implementations • 10 Jun 2024 • David Berghaus, Kostadin Cvejoski, Patrick Seifner, Cesar Ojeda, Ramses J. Sanchez
Markov jump processes are continuous-time stochastic processes which describe dynamical systems evolving in discrete state spaces.
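For intuition, here is a minimal Gillespie-style simulation of such a process. This is a generic sketch with an arbitrary toy rate matrix, not the inference method proposed in the paper.

```python
import numpy as np

def simulate_mjp(rate_matrix, x0, t_max, rng=None):
    """Simulate one path of a Markov jump process with generator `rate_matrix`.

    rate_matrix[i, j] is the jump rate from state i to state j (i != j);
    the holding time in state i is exponential with rate -rate_matrix[i, i].
    """
    rng = np.random.default_rng() if rng is None else rng
    t, x = 0.0, x0
    times, states = [t], [x]
    while True:
        exit_rate = -rate_matrix[x, x]
        if exit_rate <= 0:                       # absorbing state
            break
        t += rng.exponential(1.0 / exit_rate)    # waiting time until next jump
        if t >= t_max:
            break
        probs = rate_matrix[x].copy()
        probs[x] = 0.0
        x = rng.choice(len(probs), p=probs / probs.sum())   # pick next state
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

# Toy 3-state generator: rows sum to zero, off-diagonals are jump rates.
Q = np.array([[-1.0, 0.7, 0.3],
              [ 0.5, -1.2, 0.7],
              [ 0.2, 0.8, -1.0]])
times, states = simulate_mjp(Q, x0=0, t_max=10.0)
print(list(zip(times.round(2), states)))
```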
no code implementations • 12 Feb 2024 • Patrick Seifner, Kostadin Cvejoski, Ramses J. Sanchez
The resulting models, which we call foundational inference models (FIM), can be (i) copied and matched along the time dimension to increase their resolution; and (ii) copied and composed to build inference models of any dimensionality, without the need for any fine-tuning.
1 code implementation • 26 Jan 2023 • Kostadin Cvejoski, Ramsés J. Sánchez, César Ojeda
Topic models and all their variants analyse text by learning meaningful representations through word co-occurrences.
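As a minimal illustration of learning topics from word co-occurrence counts, the sketch below fits a standard LDA topic model with scikit-learn. It shows the general technique only, not the variant studied in the paper, and the tiny corpus is an arbitrary example.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Tiny illustrative corpus; real topic models are fit on far larger collections.
docs = [
    "stock market prices fall as investors sell shares",
    "team wins championship after dramatic overtime goal",
    "central bank raises interest rates to curb inflation",
    "star striker scores twice in the league final",
]

counts = CountVectorizer(stop_words="english").fit(docs)
X = counts.transform(docs)                                   # document-term counts
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Inspect the highest-weight words per topic.
terms = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-5:][::-1]
    print(f"topic {k}:", [terms[i] for i in top])
```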
no code implementations • 1 Nov 2022 • Kostadin Cvejoski, Ramsés J. Sánchez, César Ojeda
Our models display performance drops of only about 40% in the worst cases (and just 2% in the best) when predicting the popularity of future posts, while using only about 7% of the total number of parameters of LPLM and providing interpretable representations that offer insight into real-world events, such as the GameStop short squeeze of 2021.
no code implementations • 8 Jul 2022 • Ramsés J. Sánchez, Lukas Conrads, Pascal Welke, Kostadin Cvejoski, César Ojeda
Large, pretrained language models infer powerful representations that encode rich semantic and syntactic content, albeit implicitly.
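To make "implicit representations" concrete, here is a minimal sketch that extracts contextual sentence embeddings from a pretrained model with the Hugging Face transformers library. The checkpoint name and the mean-pooling step are arbitrary illustrative choices, not details taken from the paper.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Any encoder checkpoint works here; "bert-base-uncased" is just an example choice.
name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)
model.eval()

sentences = ["The service was excellent.", "The food arrived cold."]
batch = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state        # (batch, tokens, dim)

# Mean-pool over non-padding tokens to get one vector per sentence.
mask = batch["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(1) / mask.sum(1)
print(embeddings.shape)                              # e.g. torch.Size([2, 768])
```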
no code implementations • 23 May 2022 • Laura von Rueden, Sebastian Houben, Kostadin Cvejoski, Christian Bauckhage, Nico Piatkowski
In this paper, we propose a novel informed machine learning approach and suggest pre-training on prior knowledge.
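One hedged way to picture pre-training on prior knowledge, purely illustrative and not the paper's setup: synthesise labels from a known rule, pre-train a network on them, then continue training on a small amount of real data. The rule, dataset sizes, and model are all assumptions made for this sketch.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Prior knowledge expressed as a simple rule: points above the diagonal are class 1.
X_prior = rng.uniform(-1, 1, size=(2000, 2))
y_prior = (X_prior[:, 1] > X_prior[:, 0]).astype(int)

# Pre-train on labels generated from the prior rule ...
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300,
                    warm_start=True, random_state=0)
clf.fit(X_prior, y_prior)

# ... then continue training on a small set of noisier real observations;
# warm_start=True reuses the pre-trained weights as initialisation.
X_real = rng.uniform(-1, 1, size=(100, 2))
y_real = (X_real[:, 1] > X_real[:, 0] + 0.1 * rng.standard_normal(100)).astype(int)
clf.fit(X_real, y_real)
print(clf.score(X_real, y_real))
```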
no code implementations • 10 May 2022 • Julian Wörmann, Daniel Bogdoll, Christian Brunner, Etienne Bührle, Han Chen, Evaristus Fuh Chuo, Kostadin Cvejoski, Ludger van Elst, Philip Gottschall, Stefan Griesche, Christian Hellert, Christian Hesels, Sebastian Houben, Tim Joseph, Niklas Keil, Johann Kelsch, Mert Keser, Hendrik Königshof, Erwin Kraft, Leonie Kreuser, Kevin Krone, Tobias Latka, Denny Mattern, Stefan Matthes, Franz Motzkus, Mohsin Munir, Moritz Nekolla, Adrian Paschke, Stefan Pilar von Pilchau, Maximilian Alexander Pintz, Tianming Qiu, Faraz Qureishi, Syed Tahseen Raza Rizvi, Jörg Reichardt, Laura von Rueden, Alexander Sagel, Diogo Sasdelli, Tobias Scholl, Gerhard Schunk, Gesina Schwalbe, Hao Shen, Youssef Shoeb, Hendrik Stapelbroek, Vera Stehr, Gurucharan Srinivas, Anh Tuan Tran, Abhishek Vivekanandan, Ya Wang, Florian Wasserrab, Tino Werner, Christian Wirth, Stefan Zwicklbauer
The availability of representative datasets is an essential prerequisite for many successful artificial intelligence and machine learning models.
no code implementations • 27 Oct 2021 • Kostadin Cvejoski, Ramses J. Sanchez, Christian Bauckhage, Cesar Ojeda
In the present work we leverage the known power of reviews to enhance rating predictions in a way that (i) respects the causality of review generation and (ii) includes, in a bidirectional fashion, both the ability of ratings to inform language review models and, vice versa, the ability of language representations to help predict ratings end-to-end.
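A toy sketch of such a bidirectional coupling is given below: review text feeds a rating head, and a rating embedding feeds back into the language model over review tokens. Module names, sizes, and the hard-argmax coupling are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class JointReviewRatingModel(nn.Module):
    """Toy sketch: text informs the rating, the rating informs the language model."""

    def __init__(self, vocab_size, dim=32, n_ratings=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.rating_head = nn.Linear(dim, n_ratings)       # text -> rating
        self.rating_embed = nn.Embedding(n_ratings, dim)   # rating -> text
        self.decoder = nn.GRU(2 * dim, dim, batch_first=True)
        self.lm_head = nn.Linear(dim, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)
        _, h = self.encoder(x)                              # (1, batch, dim)
        rating_logits = self.rating_head(h[-1])             # (batch, n_ratings)
        rating = rating_logits.argmax(-1)                   # hard choice, illustration only
        ctx = self.rating_embed(rating).unsqueeze(1).expand(-1, tokens.size(1), -1)
        dec, _ = self.decoder(torch.cat([x, ctx], dim=-1))
        return rating_logits, self.lm_head(dec)             # rating + next-token logits

model = JointReviewRatingModel(vocab_size=200)
rating_logits, token_logits = model(torch.randint(0, 200, (4, 12)))
print(rating_logits.shape, token_logits.shape)              # (4, 5) (4, 12, 200)
```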
no code implementations • 26 Oct 2021 • Kostadin Cvejoski, Jannis Schuecker, Anne-Katrin Mahlein, Bogdan Georgiev
In this work we combine the representation learning capabilities of neural networks with agricultural knowledge from experts to model environmental heat and drought stresses.
no code implementations • 10 Dec 2020 • David Biesner, Kostadin Cvejoski, Bogdan Georgiev, Rafet Sifa, Erik Krupicka
Password guessing approaches via deep learning have recently been investigated with significant breakthroughs in their ability to generate novel, realistic password candidates.
1 code implementation • 10 Dec 2020 • Kostadin Cvejoski, Ramses J. Sanchez, Bogdan Georgiev, Christian Bauckhage, Cesar Ojeda
Specifically, we use the dynamic representations of recurrent point process models, which encode the history of how business or service reviews are received in time, to generate instantaneous language models with improved prediction capabilities.
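The sketch below gives a toy version of the idea: an RNN consumes the inter-arrival times of reviews, and its hidden state conditions a small word-level language model. All module names, sizes, and the concatenation scheme are assumptions for illustration, not the paper's model.

```python
import torch
import torch.nn as nn

class TimeConditionedLM(nn.Module):
    """Toy sketch: a recurrent state over review arrival times conditions a
    word-level language model, making the text model change over time."""

    def __init__(self, vocab_size, time_dim=16, text_dim=32):
        super().__init__()
        self.time_rnn = nn.GRU(1, time_dim, batch_first=True)        # event history
        self.embed = nn.Embedding(vocab_size, text_dim)
        self.text_rnn = nn.GRU(text_dim + time_dim, text_dim, batch_first=True)
        self.out = nn.Linear(text_dim, vocab_size)

    def forward(self, gaps, tokens):
        # gaps: (batch, n_events, 1) inter-arrival times; tokens: (batch, seq)
        _, h_time = self.time_rnn(gaps)                    # (1, batch, time_dim)
        ctx = h_time[-1].unsqueeze(1).expand(-1, tokens.size(1), -1)
        x = torch.cat([self.embed(tokens), ctx], dim=-1)
        h, _ = self.text_rnn(x)
        return self.out(h)                                 # next-word logits

model = TimeConditionedLM(vocab_size=100)
logits = model(torch.rand(2, 5, 1), torch.randint(0, 100, (2, 7)))
print(logits.shape)                                        # torch.Size([2, 7, 100])
```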
no code implementations • 9 Dec 2019 • Kostadin Cvejoski, Ramses J. Sanchez, Bogdan Georgiev, Jannis Schuecker, Christian Bauckhage, Cesar Ojeda
Recent progress in recommender system research has shown the importance of including temporal representations to improve interpretability and performance.
no code implementations • 17 Jun 2017 • Christian Bauckhage, Eduardo Brito, Kostadin Cvejoski, Cesar Ojeda, Rafet Sifa, Stefan Wrobel
Quantum computing for machine learning is attracting increasing attention, and recent technological developments suggest that adiabatic quantum computing, in particular, may soon be of practical interest.
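Adiabatic quantum annealers natively minimise quadratic unconstrained binary optimisation (QUBO) objectives. The brute-force check below, classical and with an arbitrary matrix not taken from the paper, shows the form of problem such hardware targets.

```python
import itertools
import numpy as np

def solve_qubo_brute_force(Q):
    """Exhaustively minimise x^T Q x over binary vectors x (tiny problems only).
    Adiabatic quantum annealers target exactly this objective at scale."""
    n = Q.shape[0]
    best_x, best_val = None, np.inf
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        val = x @ Q @ x
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Small illustrative QUBO (not taken from the paper).
Q = np.array([[-1.0, 2.0, 0.0],
              [ 0.0, -1.0, 2.0],
              [ 0.0, 0.0, -1.0]])
print(solve_qubo_brute_force(Q))   # (array([1, 0, 1]), -2.0)
```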