2 code implementations • 27 Mar 2024 • Lisa Raithel, Hui-Syuan Yeh, Shuntaro Yada, Cyril Grouin, Thomas Lavergne, Aurélie Névéol, Patrick Paroubek, Philippe Thomas, Tomohiro Nishiyama, Sebastian Möller, Eiji Aramaki, Yuji Matsumoto, Roland Roller, Pierre Zweigenbaum
User-generated data sources have gained significance in uncovering Adverse Drug Reactions (ADRs), with an increasing number of discussions occurring in the digital world.
1 code implementation • 12 Dec 2022 • Tomohiro Nishiyama
For two arbitrary probability measures on real d-space with given means and variances (covariance matrices), we provide lower bounds on their total variation distance.
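The paper's bounds themselves are not reproduced here, but the quantity they constrain can be computed directly. The sketch below (an illustration, not the paper's method) numerically evaluates the total variation distance TV(P, Q) = ½∫|p(x) − q(x)|dx for two one-dimensional Gaussians, the kind of distributions whose means and variances would feed such a bound.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def tv_distance(mu1, s1, mu2, s2, n=200000, width=10.0):
    """Total variation distance between two 1-D Gaussians.

    TV(P, Q) = (1/2) * integral |p(x) - q(x)| dx, approximated by a
    midpoint Riemann sum over a window wide enough to capture both tails.
    """
    lo = min(mu1 - width * s1, mu2 - width * s2)
    hi = max(mu1 + width * s1, mu2 + width * s2)
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += abs(gaussian_pdf(x, mu1, s1) - gaussian_pdf(x, mu2, s2)) * dx
    return 0.5 * total

print(tv_distance(0.0, 1.0, 0.0, 1.0))  # identical distributions -> 0
print(tv_distance(0.0, 1.0, 1.0, 1.0))  # mean shifted by 1 -> strictly positive
```

For unit-variance Gaussians whose means differ by δ, the exact value is 2Φ(δ/2) − 1, which this numerical integral recovers closely.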
no code implementations • 16 Feb 2020 • Tomohiro Nishiyama
In information theory, some optimization problems result in convex optimization problems on strictly convex functionals of probability densities.
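As a toy illustration of this setting (not taken from the paper): minimizing the negative entropy, a strictly convex functional of the density, subject to a mean constraint yields an exponential-family solution via Lagrange duality. The sketch below solves that problem on the finite support {0, 1, 2, 3} by bisection on the dual parameter.

```python
import math

# Toy example, not from the paper: minimize the strictly convex functional
# -H(p) over distributions on {0, 1, 2, 3} subject to a mean constraint.
# Lagrange duality gives p_i proportional to exp(theta * i); bisection on
# theta matches the target mean (the mean is increasing in theta).

VALUES = [0, 1, 2, 3]

def dist(theta):
    w = [math.exp(theta * v) for v in VALUES]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p):
    return sum(v * pi for v, pi in zip(VALUES, p))

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def max_entropy_dist(target_mean, lo=-50.0, hi=50.0, iters=200):
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(dist(mid)) < target_mean:
            lo = mid
        else:
            hi = mid
    return dist(0.5 * (lo + hi))

p = max_entropy_dist(1.0)
print([round(pi, 4) for pi in p])  # mean-1.0 distribution of maximal entropy
```

Strict convexity of −H guarantees this maximizer is unique, which is what makes such problems well-posed.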
1 code implementation • 29 Jun 2019 • Tomohiro Nishiyama
Using the relation between the KL-divergence and the chi-square divergence, we derive a lower bound for the KL-divergence that depends only on the expectation value and the variance of an arbitrarily chosen function.
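The paper's own inequality is not reproduced here, but its ingredients rest on two standard facts that can be checked numerically: the Hammersley–Chapman–Robbins bound χ²(P‖Q) ≥ (E_P[f] − E_Q[f])²/Var_Q[f] for any test function f, and the relation KL(P‖Q) ≤ log(1 + χ²(P‖Q)). A minimal check on discrete distributions:

```python
import math

# Numerical check of two standard facts (the paper's own bound is not
# reproduced here):
#   1. Hammersley-Chapman-Robbins: chi2(P||Q) >= (E_P[f] - E_Q[f])^2 / Var_Q[f]
#   2. KL(P||Q) <= log(1 + chi2(P||Q))

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2(p, q):
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

P = [0.5, 0.3, 0.2]
Q = [0.2, 0.5, 0.3]
f = [1.0, 4.0, 9.0]  # arbitrary test function on the three outcomes

ep = sum(pi * fi for pi, fi in zip(P, f))
eq = sum(qi * fi for qi, fi in zip(Q, f))
varq = sum(qi * (fi - eq) ** 2 for qi, fi in zip(Q, f))

hcr = (ep - eq) ** 2 / varq
print(hcr, chi2(P, Q))                      # HCR quantity lower-bounds chi-square
print(kl(P, Q), math.log(1 + chi2(P, Q)))   # KL upper-bounded via chi-square
```

Taking the supremum of the HCR quantity over f attains the chi-square divergence, which is what makes mean/variance information a natural handle on divergence bounds.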
no code implementations • 30 Oct 2018 • Tomohiro Nishiyama
In this paper, we introduce directed networks called 'divergence networks' in order to perform graphical calculations of divergence functions.
no code implementations • 3 Oct 2018 • Tomohiro Nishiyama
Divergence functions play a key role in measuring the discrepancy between two points in machine learning, statistics, and signal processing.
no code implementations • 19 Aug 2018 • Tomohiro Nishiyama
In this paper, we introduce new classes of divergences by extending the definitions of the Bregman divergence and the skew Jensen divergence.
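The paper's extended classes are not reproduced here, but the two base definitions it starts from are standard. For a smooth convex F, the Bregman divergence is B_F(x, y) = F(x) − F(y) − F′(y)(x − y), and the skew Jensen divergence is J_F^α(x, y) = αF(x) + (1 − α)F(y) − F(αx + (1 − α)y). A one-dimensional sketch with F(x) = x log x:

```python
import math

# Standard (non-extended) definitions that the paper generalizes, sketched
# for a smooth convex F on the positive reals: F(x) = x * log(x).

def F(x):
    return x * math.log(x)

def dF(x):
    """Derivative of F: F'(x) = log(x) + 1."""
    return math.log(x) + 1.0

def bregman(x, y):
    """B_F(x, y) = F(x) - F(y) - F'(y) * (x - y); nonnegative since F is convex."""
    return F(x) - F(y) - dF(y) * (x - y)

def skew_jensen(x, y, alpha):
    """J_F^alpha(x, y) = alpha*F(x) + (1-alpha)*F(y) - F(alpha*x + (1-alpha)*y)."""
    return alpha * F(x) + (1 - alpha) * F(y) - F(alpha * x + (1 - alpha) * y)

print(bregman(2.0, 1.0))           # 2*log(2) - 1: a generalized-KL-type term
print(skew_jensen(2.0, 1.0, 0.5))  # symmetric (alpha = 1/2) Jensen divergence
```

A useful sanity check on the two definitions: as α → 1, J_F^α(x, y)/(1 − α) converges to B_F(y, x), so the Bregman divergence arises as a limiting case of the scaled skew Jensen divergence.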