no code implementations • 27 Oct 2021 • Kostadin Cvejoski, Ramses J. Sanchez, Christian Bauckhage, Cesar Ojeda
In the present work we leverage the known power of reviews to enhance rating predictions in a way that (i) respects the causality of review generation and (ii) works bidirectionally: ratings inform the review language models and, vice versa, the learned language representations help predict ratings end-to-end.
no code implementations • 20 May 2021 • Noa Malem-Shinitski, Cesar Ojeda, Manfred Opper
We continue the line of work on Bayesian inference for Hawkes processes; our approach dispenses with the need to estimate a branching structure for the posterior, since we perform inference directly on an aggregated sum of Gaussian processes.
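For context, the Hawkes process referred to above is commonly defined through its conditional intensity, in which past events excite future ones. A standard (linear) form, not specific to this paper's model, is:

```latex
\lambda(t) = \mu + \sum_{t_i < t} \phi(t - t_i)
```

where $\mu > 0$ is a background rate, the sum runs over past event times $t_i$, and $\phi \ge 0$ is a triggering kernel. Classical Bayesian treatments augment this sum with a latent branching structure (which past event "caused" each new one); the work above avoids that augmentation.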
1 code implementation • 10 Dec 2020 • Kostadin Cvejoski, Ramses J. Sanchez, Bogdan Georgiev, Christian Bauckhage, Cesar Ojeda
Specifically, we use the dynamic representations of recurrent point process models, which encode the history of how business or service reviews are received in time, to generate instantaneous language models with improved prediction capabilities.
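As a rough illustration of the idea described above, the sketch below encodes the timing history of past reviews with a simple recurrent cell and uses the resulting hidden state to condition a next-token distribution. All dimensions, weights, and function names are hypothetical placeholders, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only, not from the paper).
hidden_dim, vocab_size = 8, 5

# Parameters of a minimal RNN cell over inter-arrival times of reviews.
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
w_t = rng.normal(scale=0.1, size=hidden_dim)
b_h = np.zeros(hidden_dim)

def encode_history(inter_arrival_times):
    """Encode how reviews were received in time into a dynamic hidden state."""
    h = np.zeros(hidden_dim)
    for dt in inter_arrival_times:
        h = np.tanh(W_h @ h + w_t * dt + b_h)
    return h

# The dynamic state then parameterises an "instantaneous" language model.
W_out = rng.normal(scale=0.1, size=(vocab_size, hidden_dim))

def token_distribution(h):
    """Softmax over the vocabulary, conditioned on the temporal state."""
    logits = W_out @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()

h = encode_history([0.5, 2.0, 0.1])   # three past reviews, varying gaps
p = token_distribution(h)             # time-conditioned next-token probabilities
```

The point of the sketch is only the coupling: the same hidden state that summarizes event timing also drives the text model, so prediction quality can improve as the temporal context changes.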
no code implementations • 9 Dec 2019 • Kostadin Cvejoski, Ramses J. Sanchez, Bogdan Georgiev, Jannis Schuecker, Christian Bauckhage, Cesar Ojeda
Recent progress in recommender system research has shown the importance of including temporal representations to improve interpretability and performance.
no code implementations • 17 Jun 2017 • Christian Bauckhage, Eduardo Brito, Kostadin Cvejoski, Cesar Ojeda, Rafet Sifa, Stefan Wrobel
Quantum computing for machine learning attracts increasing attention, and recent technological developments suggest that adiabatic quantum computing in particular may soon be of practical interest.
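Adiabatic quantum computers (e.g. quantum annealers) are typically programmed by casting a problem as quadratic unconstrained binary optimisation (QUBO), i.e. minimising $x^\top Q x$ over binary vectors $x$. The toy sketch below brute-forces such a problem classically to show the objective an annealer would minimise; the matrix `Q` is an arbitrary illustrative example, not taken from the paper.

```python
import itertools
import numpy as np

def solve_qubo(Q):
    """Brute-force minimiser of x^T Q x over binary vectors x.

    Stands in for the annealer here; only feasible for small n."""
    n = Q.shape[0]
    best_x, best_e = None, np.inf
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        e = x @ Q @ x
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Toy QUBO: diagonal entries reward selecting a variable, the off-diagonal
# entry Q[0, 1] penalises selecting x0 and x1 together.
Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -0.5,  0.0],
              [ 0.0,  0.0, -1.0]])
x, e = solve_qubo(Q)   # minimiser is x = (1, 0, 1) with energy -2.0
```

On real hardware, `solve_qubo` would be replaced by submitting `Q` to the annealer; the exponential cost of the classical loop is exactly why the quantum route is interesting.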