Search Results for author: Junior Barrera

Found 11 papers, 4 papers with code

An Algorithm to Train Unrestricted Sequential Discrete Morphological Neural Networks

1 code implementation • 6 Oct 2023 • Diego Marcondes, Mariana Feldman, Junior Barrera

We also proposed a stochastic lattice descent algorithm (SLDA) to learn the parameters of Canonical Discrete Morphological Neural Networks (CDMNN), whose architecture is composed only of operators that can be decomposed as the supremum, infimum, and complement of erosions and dilations.
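
To make the canonical decomposition concrete, here is a minimal sketch in Python/NumPy of the elementary lattice operations and a toy operator in that form, assuming a fixed 3x3 cross structuring element (in a CDMNN the windows and the composition are part of the learned architecture, not hand-picked as here):

import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

# Illustrative 3x3 cross structuring element (an assumption, not from the paper).
se = np.array([[0, 1, 0],
               [1, 1, 1],
               [0, 1, 0]], dtype=bool)

def sup(a, b):   # supremum of binary images = pixelwise OR
    return np.logical_or(a, b)

def inf(a, b):   # infimum of binary images = pixelwise AND
    return np.logical_and(a, b)

def comp(a):     # complement = pixelwise NOT
    return np.logical_not(a)

def toy_canonical_operator(x):
    # A toy operator written as the supremum of an erosion and the
    # complement of a dilation, i.e. in the canonical form described above.
    return sup(binary_erosion(x, se), comp(binary_dilation(x, se)))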

Discrete Morphological Neural Networks

1 code implementation • 1 Sep 2023 • Diego Marcondes, Junior Barrera

We propose the Discrete Morphological Neural Networks (DMNN) for binary image analysis to represent W-operators and estimate them via machine learning.
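
A W-operator is a translation-invariant set operator that is locally defined within a finite window W: the output at each pixel is a Boolean function of the input pixels seen through W. A minimal sketch, assuming a 3x3 cross window and a majority-vote Boolean function, both purely illustrative (in the DMNN the Boolean functions are realized by the network and estimated from data):

import numpy as np

def apply_w_operator(image, window_offsets, boolean_fn):
    # Output at (i, j) is boolean_fn applied to the pixels of `image` seen
    # through the window centered at (i, j); zero padding at the border.
    # Padding of 1 assumes the window fits in a 3x3 neighborhood.
    padded = np.pad(image.astype(bool), 1, constant_values=False)
    out = np.zeros(image.shape, dtype=bool)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            patch = tuple(padded[i + 1 + di, j + 1 + dj] for di, dj in window_offsets)
            out[i, j] = boolean_fn(patch)
    return out

# Hypothetical example: 3x3 cross window and a majority-vote local function.
cross = [(-1, 0), (0, -1), (0, 0), (0, 1), (1, 0)]
majority = lambda patch: sum(patch) >= 3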

The role of prior information and computational power in Machine Learning

no code implementations • 31 Oct 2022 • Diego Marcondes, Adilson Simonis, Junior Barrera

Science consists of conceiving hypotheses, confronting them with empirical evidence, and keeping only those hypotheses which have not yet been falsified.

Learning the hypotheses space from data through a U-curve algorithm

no code implementations • 8 Sep 2021 • Diego Marcondes, Adilson Simonis, Junior Barrera

This paper proposes a data-driven, systematic, consistent and non-exhaustive approach to Model Selection, which is an extension of the classical agnostic PAC learning model.

Model Selection • PAC learning

Learning the Hypotheses Space from data Part II: Convergence and Feasibility

no code implementations • 30 Jan 2020 • Diego Marcondes, Adilson Simonis, Junior Barrera

In this paper, we carry our agenda further by showing the consistency of a model selection framework based on Learning Spaces, in which one selects from data the Hypotheses Space on which to learn.

Model Selection

Learning the Hypotheses Space from data: Learning Space and U-curve Property

no code implementations • 26 Jan 2020 • Diego Marcondes, Adilson Simonis, Junior Barrera

A remarkable, formally proved consequence of this approach is a set of conditions on $\mathbb{L}(\mathcal{H})$ and on the loss function under which the estimated out-of-sample error surfaces are true U-curves on $\mathbb{L}(\mathcal{H})$ chains, enabling a more efficient search on $\mathbb{L}(\mathcal{H})$.

Model Selection • Neural Architecture Search • +1
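
Under such conditions the estimated error has a single valley along each chain of $\mathbb{L}(\mathcal{H})$, so a search along a chain may stop at the first increase. A minimal sketch of that pruning idea, assuming a hypothetical estimated_error callable (this is not the full U-curve algorithm over the whole Learning Space):

def u_curve_chain_search(chain, estimated_error):
    # Walk a chain of candidate hypothesis spaces; since the estimated error
    # is assumed U-shaped along the chain, stop at the first increase.
    best_model, best_err = chain[0], estimated_error(chain[0])
    for model in chain[1:]:
        err = estimated_error(model)
        if err > best_err:   # the valley has been passed; no need to go further
            break
        best_model, best_err = model, err
    return best_model, best_err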

Feature Selection based on the Local Lift Dependence Scale

no code implementations • 11 Nov 2017 • Diego Marcondes, Adilson Simonis, Junior Barrera

The main contribution of this paper is to define and apply this local measure, which permits the analysis of local properties of joint distributions that are neglected by Shannon's classical global measure.

feature selection
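
The local measure referred to above is the lift of a joint distribution, $L(x, y) = P(x, y) / (P(x) P(y))$, inspected pointwise rather than averaged away as in Shannon's mutual information. A minimal computational sketch, assuming the joint distribution is given as a matrix (the example numbers are hypothetical):

import numpy as np

def local_lift(joint):
    # Pointwise lift of a discrete joint distribution: values above 1 indicate
    # local positive dependence, values below 1 local negative dependence.
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y
    return joint / (px * py)

# Hypothetical 2x2 joint distribution.
print(local_lift([[0.4, 0.1],
                  [0.1, 0.4]]))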

The U-curve optimization problem: improvements on the original algorithm and time complexity analysis

no code implementations • 22 Jul 2014 • Marcelo S. Reis, Carlos E. Ferreira, Junior Barrera

The U-curve optimization problem is characterized by a cost function that decomposes into U-shaped curves over the chains of a Boolean lattice.

feature selection
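
In the feature-selection setting, each node of the Boolean lattice is a subset of features and each chain is a sequence of nested subsets; since the cost is assumed U-shaped along every chain, the stop-at-first-increase rule sketched above applies chain by chain. A minimal sketch of building one maximal chain with subsets encoded as bitmasks (the insertion order of features is an arbitrary illustration):

def maximal_chain(n_features, order=None):
    # One maximal chain of the Boolean lattice of feature subsets,
    # from the empty set to the full set, encoded as bitmasks.
    order = range(n_features) if order is None else order
    chain, subset = [0], 0
    for f in order:
        subset |= 1 << f   # add one feature at a time, keeping subsets nested
        chain.append(subset)
    return chain

# Example: the chain {} ⊂ {0} ⊂ {0,1} ⊂ {0,1,2} ⊂ {0,1,2,3} as bitmasks.
print([format(s, "04b") for s in maximal_chain(4)])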
