no code implementations • 17 Feb 2021 • Haik Manukian, Massimiliano Di Ventra
The deep extension of the restricted Boltzmann machine (RBM), known as the deep Boltzmann machine (DBM), is an expressive family of machine learning models that can serve as compact representations of complex probability distributions.
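For background (standard definitions from the literature, not taken from this abstract): an RBM assigns an energy to each joint configuration of visible units $\mathbf{v}$ and hidden units $\mathbf{h}$, and a DBM stacks further hidden layers with couplings only between adjacent layers.

```latex
% RBM energy over visible units v and hidden units h
E(\mathbf{v}, \mathbf{h}) = -\mathbf{a}^{\top}\mathbf{v} - \mathbf{b}^{\top}\mathbf{h} - \mathbf{v}^{\top} W \mathbf{h}

% Two-layer DBM: couplings only between adjacent layers
E(\mathbf{v}, \mathbf{h}^{(1)}, \mathbf{h}^{(2)})
  = -\mathbf{v}^{\top} W^{(1)} \mathbf{h}^{(1)}
    - \mathbf{h}^{(1)\top} W^{(2)} \mathbf{h}^{(2)}
    - \mathbf{a}^{\top}\mathbf{v}
    - \mathbf{b}^{(1)\top}\mathbf{h}^{(1)}
    - \mathbf{b}^{(2)\top}\mathbf{h}^{(2)}

% Model distribution: marginalize the hidden units
P(\mathbf{v}) \propto \sum_{\mathbf{h}} e^{-E(\mathbf{v}, \mathbf{h})}
```

The sum over all hidden configurations in $P(\mathbf{v})$ is what makes these models compact yet expressive representations of distributions.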
no code implementations • 15 Jan 2020 • Haik Manukian, Yan Ru Pei, Sean R. B. Bearden, Massimiliano Di Ventra
Restricted Boltzmann machines (RBMs) are a powerful class of generative models, but their training requires computing a gradient that, unlike supervised backpropagation on typical loss functions, is notoriously difficult even to approximate.
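To make the difficulty concrete: the log-likelihood gradient of an RBM contains an expectation under the model distribution, which is intractable and is usually approximated by Gibbs sampling, e.g. contrastive divergence. Below is a minimal NumPy sketch of one CD-1 step for a binary RBM; it illustrates the standard approximation being referred to, not the method proposed in the paper, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_gradient(v0, W, a, b):
    """One contrastive-divergence (CD-1) step for a binary RBM.

    Returns approximate log-likelihood gradients w.r.t. the
    weights W, visible biases a, and hidden biases b.
    """
    # Positive phase: hidden probabilities clamped to the data.
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: a single Gibbs step back to visibles, then hiddens.
    pv1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)
    # Gradient estimate: data statistics minus (approximate) model statistics.
    dW = np.outer(v0, ph0) - np.outer(v1, ph1)
    da = v0 - v1
    db = ph0 - ph1
    return dW, da, db

# Toy usage: 4 visible units, 3 hidden units.
W = rng.normal(scale=0.1, size=(4, 3))
a = np.zeros(4)
b = np.zeros(3)
v = np.array([1.0, 0.0, 1.0, 1.0])
dW, da, db = cd1_gradient(v, W, a, b)
```

The single Gibbs step is where the approximation enters: the true gradient needs samples from the equilibrium model distribution, which CD-k truncates after k steps.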
1 code implementation • 6 Nov 2019 • Judith Clymo, Haik Manukian, Nathanaël Fijalkow, Adrià Gascón, Brooks Paige
A particular challenge lies in generating meaningful sets of inputs and outputs that characterize a given program well and accurately demonstrate its behavior.
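A toy sketch of why this is hard (illustrative only, not the paper's approach): naively sampling uniform random inputs and recording the program's outputs rarely exercises the edge cases, such as empty or duplicate-heavy inputs, that best reveal a program's behavior.

```python
import random

random.seed(0)

def target_program(xs):
    """Toy program to characterize: keep the even numbers, sorted."""
    return sorted(x for x in xs if x % 2 == 0)

# Naive baseline: uniformly random input lists. Informative edge cases
# (empty list, all-odd list, duplicates) appear only by chance, which
# is exactly the challenge the sentence above describes.
examples = []
for _ in range(5):
    xs = [random.randint(-5, 5) for _ in range(random.randint(0, 6))]
    examples.append((xs, target_program(xs)))
```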
1 code implementation • 14 May 2019 • Yan Ru Pei, Haik Manukian, Massimiliano Di Ventra
Many optimization problems can be cast into the maximum satisfiability (MAX-SAT) form, and many solvers have been developed for tackling such problems.
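As a reminder of the problem format: weighted MAX-SAT asks for a truth assignment maximizing the total weight of satisfied clauses. The brute-force sketch below fixes ideas only; the clause set is made up for illustration and has nothing to do with the solver developed in the paper.

```python
from itertools import product

# Clauses are (literals, weight) pairs; positive k means variable k,
# negative k its negation. This instance is illustrative only.
clauses = [
    ((1, 2), 3),    # x1 OR x2,          weight 3
    ((-1, 3), 2),   # NOT x1 OR x3,      weight 2
    ((-2, -3), 1),  # NOT x2 OR NOT x3,  weight 1
]
n_vars = 3

def satisfied_weight(assignment, clauses):
    """Total weight of clauses satisfied by a tuple of booleans (x1..xn)."""
    total = 0
    for lits, w in clauses:
        if any((lit > 0) == assignment[abs(lit) - 1] for lit in lits):
            total += w
    return total

# Exhaustive search over all 2^n assignments (fine for tiny n).
best = max(product([False, True], repeat=n_vars),
           key=lambda a: satisfied_weight(a, clauses))
print(best, satisfied_weight(best, clauses))
```

Real instances make this exponential search infeasible, which is why dedicated MAX-SAT solvers, like the one discussed in the paper, exist.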
no code implementations • 1 Jan 2018 • Haik Manukian, Fabio L. Traversa, Massimiliano Di Ventra
In fact, the acceleration of pretraining achieved by simulating DMMs is comparable, in number of iterations, to the recently reported hardware application of the quantum annealing method on the same network and data set.
no code implementations • 13 Dec 2016 • Haik Manukian, Fabio L. Traversa, Massimiliano Di Ventra
We propose to use Digital Memcomputing Machines (DMMs), implemented with self-organizing logic gates (SOLGs), to solve the problem of numerical inversion.
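In miniature, "numerical inversion" asks: given a forward map f and a target output y, find an input x with f(x) = y. SOLGs address this by letting the logic circuit self-organize toward a consistent configuration; the sketch below merely brute-forces a 4-bit example to illustrate the problem statement, not the DMM/SOLG method itself.

```python
# Forward function on 4-bit inputs: squaring modulo 16.
def f(x):
    return (x * x) % 16

# Inversion: recover all preimages of a target output y.
# Exhaustive search stands in for the self-organizing circuit here.
y = 9
solutions = [x for x in range(16) if f(x) == y]
print(solutions)
```

For wide bit-widths exhaustive search is exponential in the input size, which is the regime where a hardware approach such as SOLG-based DMMs is intended to help.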