Search Results for author: Daniel Krefl

Found 6 papers, 3 papers with code

Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction

1 code implementation ACL 2022 Andrea Papaluca, Daniel Krefl, Hanna Suominen, Artem Lenskiy

In this work we propose combining pretrained knowledge base graph embeddings with transformer-based language models to improve performance on the sentential Relation Extraction task in natural language processing.

Relation Extraction
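A minimal, hypothetical sketch of the fusion this abstract describes: classifying a relation from a sentence encoding concatenated with pretrained knowledge base embeddings of the two entities. The module, dimensions, and relation count below are invented stand-ins, not the paper's actual architecture.

import torch
import torch.nn as nn

class KBFusedRelationClassifier(nn.Module):
    def __init__(self, sent_dim=768, kb_dim=200, n_relations=42):
        super().__init__()
        # Classify from the concatenation of the sentence vector and the
        # two entities' KB graph embeddings.
        self.classifier = nn.Linear(sent_dim + 2 * kb_dim, n_relations)

    def forward(self, sentence_vec, head_kb_vec, tail_kb_vec):
        # sentence_vec: encoding of the sentence (e.g. a transformer's [CLS]).
        # head_kb_vec / tail_kb_vec: pretrained KB embeddings of the two
        # entities mentioned in the sentence.
        fused = torch.cat([sentence_vec, head_kb_vec, tail_kb_vec], dim=-1)
        return self.classifier(fused)

# Usage with random stand-in vectors:
model = KBFusedRelationClassifier()
logits = model(torch.randn(1, 768), torch.randn(1, 200), torch.randn(1, 200))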

Zero- and Few-Shots Knowledge Graph Triplet Extraction with Large Language Models

no code implementations 4 Dec 2023 Andrea Papaluca, Daniel Krefl, Sergio Mendez Rodriguez, Artem Lensky, Hanna Suominen

In this work, we tested the Triplet Extraction (TE) capabilities of a variety of Large Language Models (LLMs) of different sizes in the Zero- and Few-Shot settings.

Sentence
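To illustrate the few-shot setting this entry refers to, here is an invented example of a prompt asking an LLM for (head, relation, tail) triplets; the demonstrations, relation names, and format are illustrative only, not the paper's actual prompts.

# Invented few-shot prompt for knowledge graph triplet extraction.
few_shot_prompt = """Extract (head, relation, tail) triplets from the sentence.

Sentence: Marie Curie was born in Warsaw.
Triplets: (Marie Curie, place_of_birth, Warsaw)

Sentence: Amazon was founded by Jeff Bezos.
Triplets: (Amazon, founded_by, Jeff Bezos)

Sentence: {input_sentence}
Triplets:"""

print(few_shot_prompt.format(input_sentence="Alan Turing worked at Bletchley Park."))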

Product Jacobi-Theta Boltzmann machines with score matching

no code implementations 10 Mar 2023 Andrea Pasquale, Daniel Krefl, Stefano Carrazza, Frank Nielsen

The estimation of probability density functions is a non-trivial task that in recent years has been tackled with machine learning techniques.

Modelling conditional probabilities with Riemann-Theta Boltzmann Machines

1 code implementation 27 May 2019 Stefano Carrazza, Daniel Krefl, Andrea Papaluca

The probability density function for the visible sector of a Riemann-Theta Boltzmann machine can be taken conditional on a subset of the visible units.
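For context, a generic fact about densities rather than the paper's specific derivation: conditioning on a subset of units means fixing those units in the joint density and renormalizing by their marginal. Splitting the visible units as v = (v_a, v_b),

\[
p(v_a \mid v_b) \;=\; \frac{p(v_a, v_b)}{p(v_b)} \;=\; \frac{p(v_a, v_b)}{\int p(v_a, v_b)\,\mathrm{d}v_a},
\]

so the conditional density is the joint density with the conditioned units held fixed, divided by the marginal of those units.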

Sampling the Riemann-Theta Boltzmann Machine

no code implementations 20 Apr 2018 Stefano Carrazza, Daniel Krefl

We show that the visible sector probability density function of the Riemann-Theta Boltzmann machine corresponds to a Gaussian mixture model consisting of an infinite number of component multivariate Gaussians.
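A minimal sketch of the sampling strategy such a mixture structure permits, using invented stand-in parameters for a finite K-component truncation; in the RTBM itself the (infinitely many) weights, means, and covariances would follow from the machine's parameters instead. Draw a component index, then sample that component's multivariate Gaussian.

import numpy as np

rng = np.random.default_rng(0)

# Invented parameters for a K-component, d-dimensional mixture.
K, d = 5, 2
weights = rng.dirichlet(np.ones(K))   # mixture weights, sum to 1
means = rng.normal(size=(K, d))       # component means
covs = np.stack([np.eye(d) * rng.uniform(0.5, 2.0) for _ in range(K)])

def sample_mixture(n):
    # Draw a component index per sample, then sample that Gaussian.
    comps = rng.choice(K, size=n, p=weights)
    return np.array([rng.multivariate_normal(means[k], covs[k]) for k in comps])

samples = sample_mixture(1000)
print(samples.mean(axis=0))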

Riemann-Theta Boltzmann Machine

1 code implementation 20 Dec 2017 Daniel Krefl, Stefano Carrazza, Babak Haghighat, Jens Kahlen

Both the Boltzmann machine and the derived feedforward neural network can be successfully trained via standard gradient- and non-gradient-based optimization techniques.
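To illustrate the two optimizer families the abstract mentions, here is a toy comparison on an invented quadratic loss, not the RTBM objective: plain gradient descent versus the derivative-free Nelder-Mead method.

import numpy as np
from scipy.optimize import minimize

# Toy loss standing in for a model's training objective.
target = np.array([1.0, -2.0])

def loss(theta):
    return np.sum((theta - target) ** 2)

def grad(theta):
    return 2.0 * (theta - target)

theta0 = np.zeros(2)

# Gradient-based: plain gradient descent.
theta = theta0.copy()
for _ in range(100):
    theta -= 0.1 * grad(theta)
print("gradient descent:", theta)

# Non-gradient-based: Nelder-Mead simplex search (derivative-free).
result = minimize(loss, theta0, method="Nelder-Mead")
print("Nelder-Mead:", result.x)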
