1 code implementation • ACL 2022 • Andrea Papaluca, Daniel Krefl, Hanna Suominen, Artem Lenskiy
In this work, we propose combining pretrained knowledge-base graph embeddings with transformer-based language models to improve performance on the sentential Relation Extraction task in natural language processing.
no code implementations • 4 Dec 2023 • Andrea Papaluca, Daniel Krefl, Sergio Mendez Rodriguez, Artem Lensky, Hanna Suominen
In this work, we tested the Triplet Extraction (TE) capabilities of a variety of Large Language Models (LLMs) of different sizes in the Zero- and Few-Shot settings.
no code implementations • 10 Mar 2023 • Andrea Pasquale, Daniel Krefl, Stefano Carrazza, Frank Nielsen
The estimation of probability density functions is a non-trivial task that in recent years has been tackled with machine learning techniques.
1 code implementation • 27 May 2019 • Stefano Carrazza, Daniel Krefl, Andrea Papaluca
The probability density function for the visible sector of a Riemann-Theta Boltzmann machine can be taken conditional on a subset of the visible units.
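Conditioning the RTBM's visible density itself involves Riemann-Theta functions; as a simpler illustrative analogue (a plain multivariate Gaussian, not the RTBM), conditioning a joint density on a subset of its components can be sketched as follows. All parameter values below are hypothetical examples, not taken from the paper.

```python
import numpy as np

def condition_gaussian(mu, Sigma, idx_b, x_b):
    """Condition a multivariate Gaussian N(mu, Sigma) on components idx_b = x_b.

    Returns the mean and covariance of the remaining components,
    using the standard Gaussian conditioning formulas.
    """
    idx_b = np.asarray(idx_b)
    idx_a = np.setdiff1d(np.arange(len(mu)), idx_b)
    # Partition the covariance into blocks over the two index sets
    S_aa = Sigma[np.ix_(idx_a, idx_a)]
    S_ab = Sigma[np.ix_(idx_a, idx_b)]
    S_bb = Sigma[np.ix_(idx_b, idx_b)]
    S_bb_inv = np.linalg.inv(S_bb)
    # mu_{a|b} = mu_a + S_ab S_bb^{-1} (x_b - mu_b)
    mu_cond = mu[idx_a] + S_ab @ S_bb_inv @ (x_b - mu[idx_b])
    # Sigma_{a|b} = S_aa - S_ab S_bb^{-1} S_ba
    Sigma_cond = S_aa - S_ab @ S_bb_inv @ S_ab.T
    return mu_cond, Sigma_cond

# Illustrative 2D example: condition the first component on the second
mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
mu_c, Sigma_c = condition_gaussian(mu, Sigma, idx_b=[1], x_b=np.array([3.0]))
```

For a diagonal covariance (independent components), conditioning leaves the remaining mean and variance unchanged, which is a quick sanity check on the block structure.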
no code implementations • 20 Apr 2018 • Stefano Carrazza, Daniel Krefl
We show that the visible sector probability density function of the Riemann-Theta Boltzmann machine corresponds to a Gaussian mixture model consisting of an infinite number of component multivariate Gaussians.
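The RTBM's mixture has infinitely many components with weights fixed by the Riemann-Theta expansion; as an illustrative sketch only, a finite truncation of such a mixture density, with hypothetical weights and parameters not drawn from the paper, can be evaluated like this:

```python
import numpy as np

def gaussian_pdf(x, mean, cov):
    """Multivariate Gaussian density N(x; mean, cov)."""
    d = len(mean)
    diff = x - mean
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff) / norm

def mixture_pdf(x, weights, means, covs):
    """Finite Gaussian mixture density: sum_k w_k N(x; mu_k, Sigma_k)."""
    return sum(w * gaussian_pdf(x, m, c)
               for w, m, c in zip(weights, means, covs))

# Hypothetical two-component 2D mixture (illustrative parameters only)
weights = [0.3, 0.7]                               # must sum to 1
means = [np.zeros(2), np.array([2.0, 2.0])]
covs = [np.eye(2), 0.5 * np.eye(2)]

p = mixture_pdf(np.array([1.0, 1.0]), weights, means, covs)
```

In the RTBM case, the component means and the mixture weights are not free parameters as here, but are determined by the machine's connection matrices and the terms of the Riemann-Theta series.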
1 code implementation • 20 Dec 2017 • Daniel Krefl, Stefano Carrazza, Babak Haghighat, Jens Kahlen
Both the Boltzmann machine and the derived feedforward neural network can be successfully trained via standard gradient- and non-gradient-based optimization techniques.