Search Results for author: Kostadin Cvejoski

Found 12 papers, 2 papers with code

Foundational Inference Models for Dynamical Systems

no code implementations • 12 Feb 2024 • Patrick Seifner, Kostadin Cvejoski, Ramses J. Sanchez

The resulting models, which we call foundational inference models (FIM), can be (i) copied and matched along the time dimension to increase their resolution; and (ii) copied and composed to build inference models of any dimensionality, without the need for any fine-tuning.
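The copy-and-compose idea can be illustrated with a short pure-Python sketch. Here `infer_1d` is a hypothetical stand-in for a pretrained one-dimensional inference model (the real FIM is a neural network, not this finite-difference estimate); the point is only that copies of it applied per coordinate yield an inference model of any dimensionality without fine-tuning:

```python
# Hypothetical sketch: compose a 1-D inference model into a d-D one.
# "infer_1d" stands in for a pretrained foundational inference model;
# the actual FIM is a neural network, not this finite-difference estimate.

def infer_1d(series):
    """Toy 1-D inference: estimate a constant drift from observations."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    return sum(diffs) / len(diffs)

def compose(infer, trajectory):
    """Apply copies of a 1-D model to each coordinate of a d-D
    trajectory, giving a d-D inference model with no extra training."""
    dims = zip(*trajectory)  # split [(x1, y1), ...] into coordinate series
    return [infer(coord) for coord in dims]

# 2-D trajectory with per-step drifts (1.0, -0.5)
traj = [(0.0, 0.0), (1.0, -0.5), (2.0, -1.0), (3.0, -1.5)]
print(compose(infer_1d, traj))  # → [1.0, -0.5]
```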

Neural Dynamic Focused Topic Model

1 code implementation • 26 Jan 2023 • Kostadin Cvejoski, Ramsés J. Sánchez, César Ojeda

Topic models and all their variants analyse text by learning meaningful representations through word co-occurrences.

Topic Models · Variational Inference
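The word co-occurrence signal that topic models exploit can be made concrete with a minimal sketch (illustrative only; real topic models fit latent variables, e.g. via variational inference, rather than counting pairs directly, and the toy corpus here is invented):

```python
# Minimal sketch of the co-occurrence signal behind topic models.
# Words that repeatedly appear in the same documents hint at a shared
# latent topic; actual topic models infer this structure probabilistically.
from collections import Counter
from itertools import combinations

docs = [
    "neural network training data",
    "network training loss data",
    "market stock price trade",
    "stock price market index",
]

cooc = Counter()
for doc in docs:
    words = sorted(set(doc.split()))
    for pair in combinations(words, 2):
        cooc[pair] += 1

print(cooc[("market", "stock")])   # co-occur in both finance docs → 2
print(cooc[("data", "training")])  # co-occur in both ML docs → 2
print(cooc[("data", "stock")])     # never together → 0
```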

The future is different: Large pre-trained language models fail in prediction tasks

no code implementations • 1 Nov 2022 • Kostadin Cvejoski, Ramsés J. Sánchez, César Ojeda

Our models display performance drops of only about 40% in the worst cases (2% in the best ones) when predicting the popularity of future posts, while using only about 7% of the total number of parameters of LPLM and providing interpretable representations that offer insight into real-world events, like the GameStop short squeeze of 2021.

Language Modelling · Topic Models

Hidden Schema Networks

no code implementations • 8 Jul 2022 • Ramsés J. Sánchez, Lukas Conrads, Pascal Welke, Kostadin Cvejoski, César Ojeda

Large, pretrained language models infer powerful representations that encode rich semantic and syntactic content, albeit implicitly.

Language Modelling

Informed Pre-Training on Prior Knowledge

no code implementations • 23 May 2022 • Laura von Rueden, Sebastian Houben, Kostadin Cvejoski, Christian Bauckhage, Nico Piatkowski

In this paper, we propose a novel informed machine learning approach and suggest to pre-train on prior knowledge.

Dynamic Review-based Recommenders

no code implementations • 27 Oct 2021 • Kostadin Cvejoski, Ramses J. Sanchez, Christian Bauckhage, Cesar Ojeda

In the present work we leverage the known power of reviews to enhance rating predictions in a way that (i) respects the causality of review generation and (ii) includes, in a bidirectional fashion, both the ability of ratings to inform language review models and, vice versa, of language representations to help predict ratings end-to-end.

Recommendation Systems · Review Generation

Combining expert knowledge and neural networks to model environmental stresses in agriculture

no code implementations • 26 Oct 2021 • Kostadin Cvejoski, Jannis Schuecker, Anne-Katrin Mahlein, Bogdan Georgiev

In this work we combine the representation learning capabilities of neural networks with agricultural knowledge from experts to model environmental heat and drought stresses.

Clustering · Representation Learning

Generative Deep Learning Techniques for Password Generation

no code implementations • 10 Dec 2020 • David Biesner, Kostadin Cvejoski, Bogdan Georgiev, Rafet Sifa, Erik Krupicka

Password guessing approaches via deep learning have recently been investigated with significant breakthroughs in their ability to generate novel, realistic password candidates.

Recurrent Point Review Models

1 code implementation • 10 Dec 2020 • Kostadin Cvejoski, Ramses J. Sanchez, Bogdan Georgiev, Christian Bauckhage, Cesar Ojeda

Specifically, we use the dynamic representations of recurrent point process models, which encode the history of how business or service reviews are received in time, to generate instantaneous language models with improved prediction capabilities.

Recommendation Systems
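The idea of conditioning an instantaneous language model on a point-process history can be sketched in a few lines of pure Python. Everything here is hypothetical illustration, not the paper's model: the decay-and-bump state update stands in for a recurrent point process encoder, and the two-word vocabulary is invented.

```python
import math

# Hypothetical sketch: couple a point-process state to a language model.
# The state decays between review arrivals and is bumped at each event,
# so the "instantaneous" unigram model depends on the timing history.
# Update rule and vocabulary are illustrative, not the paper's.

def update_state(state, dt, decay=0.5):
    """Decay the history state over the inter-arrival gap dt, then
    register one new review event."""
    return state * math.exp(-decay * dt) + 1.0

def unigram_probs(state, base, bursty):
    """Interpolate between a baseline unigram model and a 'bursty' one
    as the event-intensity state grows."""
    w = state / (1.0 + state)  # in (0, 1), increasing in state
    return {t: (1 - w) * base[t] + w * bursty[t] for t in base}

base = {"good": 0.5, "crowded": 0.5}
bursty = {"good": 0.2, "crowded": 0.8}

state = 0.0
for dt in [0.1, 0.1, 0.1]:  # a burst of closely spaced reviews
    state = update_state(state, dt)
probs = unigram_probs(state, base, bursty)

# After a burst, probability mass shifts toward the 'bursty' vocabulary:
print(probs["crowded"] > probs["good"])  # → True
```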

Recurrent Point Processes for Dynamic Review Models

no code implementations • 9 Dec 2019 • Kostadin Cvejoski, Ramses J. Sanchez, Bogdan Georgiev, Jannis Schuecker, Christian Bauckhage, Cesar Ojeda

Recent progress in recommender system research has shown the importance of including temporal representations to improve interpretability and performance.

Point Processes · Recommendation Systems

Adiabatic Quantum Computing for Binary Clustering

no code implementations • 17 Jun 2017 • Christian Bauckhage, Eduardo Brito, Kostadin Cvejoski, Cesar Ojeda, Rafet Sifa, Stefan Wrobel

Quantum computing for machine learning attracts increasing attention and recent technological developments suggest that especially adiabatic quantum computing may soon be of practical interest.

BIG-bench Machine Learning · Clustering
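How binary clustering maps onto an adiabatic quantum computer can be sketched as an Ising ground-state search. The exact energy function in the paper may differ; a common formulation assigns each point a spin s_i in {-1, +1} (its cluster label) and minimizes E(s) = Σ_{i<j} d_ij · s_i · s_j, which penalizes placing distant points in the same cluster. The brute-force enumeration below stands in for the adiabatic evolution that quantum hardware would use to find the ground state:

```python
from itertools import product

# Hedged sketch: binary clustering as an Ising ground-state problem.
# Each point gets a spin in {-1, +1}; minimizing sum_{i<j} d_ij*s_i*s_j
# keeps nearby points in the same cluster. We brute-force the ground
# state; adiabatic hardware would reach it by slowly evolving toward
# this problem Hamiltonian.

def distances(points):
    """Squared Euclidean distance for every pair i < j."""
    n = len(points)
    return {(i, j): sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
            for i in range(n) for j in range(i + 1, n)}

def ising_clustering(points):
    """Return the spin configuration of minimal Ising energy."""
    d = distances(points)
    return min(product([-1, 1], repeat=len(points)),
               key=lambda s: sum(d[i, j] * s[i] * s[j] for (i, j) in d))

pts = [(0, 0), (0, 1), (5, 5), (5, 6)]
spins = ising_clustering(pts)

# The two left points share one spin, the two right points the other:
print(spins[0] == spins[1] and spins[2] == spins[3]
      and spins[0] != spins[2])  # → True
```

Brute force is exponential in the number of points, which is exactly why casting the problem as a ground-state search for adiabatic hardware is attractive.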
