no code implementations • 21 Mar 2024 • Soroush Ghandi, Benjamin Quost, Cassio de Campos
This paper focuses on LearnSPN, the main algorithm for training probabilistic circuits (PCs) and a gold standard due to its efficiency, performance, and ease of use, particularly for tabular data.
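At a high level, LearnSPN builds a circuit recursively: it clusters rows to create sum (mixture) nodes, splits columns into approximately independent groups to create product nodes, and bottoms out in univariate leaves. The following is a minimal sketch of that recursion, with a naive correlation threshold standing in for the statistical independence tests and clustering used in practice; node representations and parameter names are illustrative, not the paper's implementation:

```python
import numpy as np

def learn_spn(data, cols, min_rows=8):
    """Illustrative LearnSPN-style recursion (a sketch, not the real algorithm).

    Nodes are nested tuples: ('leaf', col), ('prod', children),
    ('sum', weights, children)."""
    n = len(data)
    if len(cols) == 1:
        return ('leaf', cols[0])                      # fit a univariate leaf here
    if n <= min_rows:
        return ('prod', [('leaf', c) for c in cols])  # too few rows: fully factorize
    # 1) Column split: group columns correlated with the first column
    #    (a naive stand-in for pairwise independence tests).
    sub = data[:, cols]
    with np.errstate(invalid='ignore'):
        corr = np.nan_to_num(np.corrcoef(sub, rowvar=False))
    related = [cols[j] for j in range(len(cols)) if abs(corr[0, j]) > 0.3]
    rest = [c for c in cols if c not in related]
    if rest:  # independent groups found: build a product node
        return ('prod', [learn_spn(data, related, min_rows),
                         learn_spn(data, rest, min_rows)])
    # 2) No column split: cluster rows (here, a crude median split) -> sum node.
    mask = sub[:, 0] <= np.median(sub[:, 0])
    parts = [data[mask], data[~mask]]
    if min(len(p) for p in parts) == 0:               # degenerate split: bail out
        return ('prod', [('leaf', c) for c in cols])
    weights = [len(p) / n for p in parts]
    return ('sum', weights, [learn_spn(p, cols, min_rows) for p in parts])

# Example: columns 0 and 1 are identical, column 2 is nearly uncorrelated,
# so the root becomes a product node separating {0, 1} from {2}.
x0 = np.linspace(-1.0, 1.0, 64)
data = np.column_stack([x0, x0, np.tile([1.0, -1.0], 32)])
root = learn_spn(data, [0, 1, 2])  # root[0] == 'prod'
```

The key design choice LearnSPN embodies is this alternation: product nodes are only introduced when a column split passes the independence check, and sum nodes otherwise, so each recursive call strictly shrinks the data along rows or columns.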
no code implementations • 19 Mar 2024 • Soroush Ghandi, Benjamin Quost, Cassio de Campos
This work addresses integrating probabilistic propositional logic constraints into the distribution encoded by a probabilistic circuit (PC).
no code implementations • 25 Oct 2023 • Gennaro Gala, Cassio de Campos, Robert Peharz, Antonio Vergari, Erik Quaeghebeur
In contrast, probabilistic circuits (PCs) are hierarchical discrete mixtures represented as computational graphs composed of input, sum and product units.
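As a concrete picture of such a computational graph, the three unit types can be rendered in a few lines. The class names and dict-based input distributions below are a hypothetical minimal sketch, not an actual library API:

```python
class Input:
    """Input unit: a univariate distribution over one variable (dict: value -> prob)."""
    def __init__(self, var, dist):
        self.var, self.dist = var, dist
    def value(self, x):
        return self.dist[x[self.var]]

class Product:
    """Product unit: factorizes over children with disjoint scopes."""
    def __init__(self, children):
        self.children = children
    def value(self, x):
        out = 1.0
        for c in self.children:
            out *= c.value(x)
        return out

class Sum:
    """Sum unit: a mixture of children sharing the same scope."""
    def __init__(self, weights, children):  # weights sum to 1
        self.weights, self.children = weights, children
    def value(self, x):
        return sum(w * c.value(x) for w, c in zip(self.weights, self.children))

# A two-component mixture over two binary variables X0, X1.
pc = Sum([0.4, 0.6], [
    Product([Input(0, {0: 0.8, 1: 0.2}), Input(1, {0: 0.5, 1: 0.5})]),
    Product([Input(0, {0: 0.1, 1: 0.9}), Input(1, {0: 0.3, 1: 0.7})]),
])
p11 = pc.value((1, 1))  # 0.4*0.2*0.5 + 0.6*0.9*0.7 ≈ 0.418
```

Because every unit is evaluated by a single pass over the graph, likelihoods (and, with small variations, marginals) are computed exactly in time linear in the circuit size — the tractability property the excerpt refers to.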
1 code implementation • 10 Jun 2023 • Vu-Linh Nguyen, Yang Yang, Cassio de Campos
We propose a formal framework for probabilistic MDC in which learning an optimal multi-dimensional classifier can be decomposed, without loss of generality, into learning a set of (smaller) single-variable multi-class probabilistic classifiers and a directed acyclic graph.
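The decomposition can be illustrated with two binary class variables and a DAG Y1 → Y2: the joint conditional factorizes as P(Y1 | x) · P(Y2 | Y1, x), each factor a small single-variable classifier. The toy classifiers and their probabilities below are hypothetical stand-ins, not the paper's models:

```python
def p_y1(x):
    """Toy single-variable classifier for Y1 (hypothetical probabilities)."""
    return {0: 0.7, 1: 0.3} if x > 0 else {0: 0.2, 1: 0.8}

def p_y2_given_y1(x, y1):
    """Toy classifier for Y2 conditioned on its DAG parent Y1."""
    return {0: 0.6, 1: 0.4} if y1 == 0 else {0: 0.1, 1: 0.9}

def joint(x, y1, y2):
    # Chain rule along the DAG Y1 -> Y2: P(y1, y2 | x) = P(y1 | x) * P(y2 | y1, x)
    return p_y1(x)[y1] * p_y2_given_y1(x, y1)[y2]

# Joint MAP prediction by enumerating the (small) space of assignments.
assignments = [(y1, y2) for y1 in (0, 1) for y2 in (0, 1)]
best = max(assignments, key=lambda ys: joint(1.0, *ys))  # (0, 0) for x = 1.0
```

Note that the joint MAP can differ from predicting each Yi independently; that gap between joint and marginal prediction is precisely why the DAG structure matters in MDC.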
1 code implementation • 21 Sep 2022 • Alvaro H. C. Correia, Gennaro Gala, Erik Quaeghebeur, Cassio de Campos, Robert Peharz
Tractable probabilistic models such as probabilistic circuits (PCs) can be understood as hierarchical discrete mixture models, and are thus capable of efficient exact inference, but they often show subpar performance compared to continuous latent-space models.
no code implementations • 9 May 2021 • Alessio Benavoli, Cassio de Campos
A fundamental task in AI is to assess (in)dependence between mixed-type variables (text, image, sound).
1 code implementation • 11 Jul 2020 • Alvaro H. C. Correia, Robert Peharz, Cassio de Campos
Decision Trees and Random Forests are among the most widely used machine learning models, and often achieve state-of-the-art performance in tabular, domain-agnostic datasets.
1 code implementation • NeurIPS 2020 • Alvaro H. C. Correia, Robert Peharz, Cassio de Campos
Decision Trees (DTs) and Random Forests (RFs) are powerful discriminative learners and tools of central importance to the everyday machine learning practitioner and data scientist.
1 code implementation • 23 May 2019 • Alvaro H. C. Correia, James Cussens, Cassio de Campos
Many algorithms for score-based Bayesian network structure learning (BNSL), in particular exact ones, take as input a collection of potentially optimal parent sets for each variable in the data.
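That input can be pictured as, for each variable, a scored list of candidate parent sets; an exact search then assigns each variable its best-scoring parent set while keeping the overall graph acyclic. The scores and the brute-force search over variable orderings below are a hypothetical sketch, feasible only for tiny problems (exact BNSL systems use far more sophisticated search):

```python
from itertools import permutations

# Hypothetical local scores (higher is better, e.g. BIC/BDeu values):
# for each variable, candidate parent sets paired with their scores.
scores = {
    'A': [(frozenset(), -10.2), (frozenset({'B'}), -9.1)],
    'B': [(frozenset(), -8.4), (frozenset({'A'}), -8.9), (frozenset({'A', 'C'}), -7.5)],
    'C': [(frozenset(), -5.0)],
}

def best_network(scores):
    """Brute-force exact BNSL over variable orderings (tiny problems only).

    For each total order, every variable takes its best-scoring candidate
    parent set drawn from its predecessors; acyclicity then holds by
    construction."""
    best_total, best_net = float('-inf'), None
    for order in permutations(scores):
        total, net, preceding = 0.0, {}, set()
        for v in order:
            ps, s = max(((ps, s) for ps, s in scores[v] if ps <= preceding),
                        key=lambda t: t[1])
            total += s
            net[v] = ps
            preceding.add(v)
        if total > best_total:
            best_total, best_net = total, net
    return best_total, best_net

total, net = best_network(scores)  # total ≈ -22.5; A gets parent {B}, B and C get none
```

Pruning this candidate list (e.g. discarding a parent set whenever a subset scores at least as well) is exactly the kind of preprocessing that exact BNSL algorithms depend on, since the number of potential parent sets grows exponentially with the number of variables.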