Search Results for author: Mihailo Stojnic

Found 13 papers, 0 papers with code

Exact objectives of random linear programs and mean widths of random polyhedrons

no code implementations · 6 Mar 2024 · Mihailo Stojnic

We consider \emph{random linear programs} (rlps) as a subclass of \emph{random optimization problems} (rops) and study their typical behavior.
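The paper characterizes typical objectives analytically; the flavor of "typical behavior" can be illustrated numerically by solving many random LP instances and observing that the optimal objective concentrates. This is a minimal sketch under assumptions of my own choosing (Gaussian data, unit right-hand side, box constraints), not the ensemble or scaling used in the paper:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m = 20, 40  # dimension and number of random constraints (illustrative sizes)

vals = []
for _ in range(50):
    # Gaussian random LP instance:  min c^T x  s.t.  A x <= 1,  -1 <= x <= 1
    c = rng.standard_normal(n)
    A = rng.standard_normal((m, n))
    res = linprog(c, A_ub=A, b_ub=np.ones(m), bounds=[(-1, 1)] * n)
    vals.append(res.fun)

# Across instances the optimal value clusters around a typical (mean) value
print(f"mean optimum {np.mean(vals):.3f}, std {np.std(vals):.3f}")
```

Since x = 0 is always feasible here, every optimal value is at most zero; the small spread relative to the mean is the concentration phenomenon that the paper pins down exactly.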

Capacity of the Hebbian-Hopfield network associative memory

no code implementations · 4 Mar 2024 · Mihailo Stojnic

In \cite{Hop82}, Hopfield introduced a \emph{Hebbian} learning rule based neural network model and suggested how it can efficiently operate as an associative memory.
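The classical Hebbian rule and associative recall referenced here are standard; as a reminder of the mechanism (not the capacity analysis of this paper), the following sketch stores a few binary patterns via the outer-product Hebbian rule and recovers one from a corrupted probe. Sizes, seed, and iteration count are illustrative choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 5  # n neurons, m stored patterns (load m/n well below capacity)

# Hebbian learning rule: W = (1/n) * sum_k xi_k xi_k^T, with zero self-coupling
patterns = rng.choice([-1, 1], size=(m, n))
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0)

# Associative recall: corrupt a stored pattern, then iterate sign updates
probe = patterns[0].copy()
flip = rng.choice(n, size=5, replace=False)
probe[flip] *= -1  # flip 5 of 64 bits

for _ in range(20):  # synchronous updates toward a fixed point
    probe = np.sign(W @ probe)
    probe[probe == 0] = 1

overlap = probe @ patterns[0] / n  # +1 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

At low load the stored patterns act as attractors of the update dynamics; how large m/n can be before this breaks down is exactly the capacity question the paper addresses.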

Exact capacity of the \emph{wide} hidden layer treelike neural networks with generic activations

no code implementations · 8 Feb 2024 · Mihailo Stojnic

We obtain explicit, closed form, capacity characterizations for a very generic class of the hidden layer activations.

Fixed width treelike neural networks capacity analysis -- generic activations

no code implementations · 8 Feb 2024 · Mihailo Stojnic

In more concrete terms, for each of these activations, we obtain both the RDT and pl RDT based memory capacities upper bound characterization for \emph{any} given (even) number of the hidden layer neurons, $d$.

Fl RDT based ultimate lowering of the negative spherical perceptron capacity

no code implementations · 27 Dec 2023 · Mihailo Stojnic

We here first show that the \emph{negative spherical perceptrons} can be fitted into the frame of the fl RDT and then employ the whole fl RDT machinery to characterize the capacity.

\emph{Lifted} RDT based capacity analysis of the 1-hidden layer treelike \emph{sign} perceptrons neural networks

no code implementations · 13 Dec 2023 · Mihailo Stojnic

Moreover, for particular \emph{treelike committee machine} (TCM) architectures with $d\leq 5$ neurons in the hidden layer, \cite{Stojnictcmspnncaprdt23} made the first mathematically rigorous progress in over 30 years by lowering the previously best known capacity bounds of \cite{MitchDurb89}.

Memorization

Capacity of the treelike sign perceptrons neural networks with one hidden layer -- RDT based upper bounds

no code implementations · 13 Dec 2023 · Mihailo Stojnic

We study the capacity of \emph{sign} perceptrons neural networks (SPNN) and particularly focus on 1-hidden layer \emph{treelike committee machine} (TCM) architectures.

Binary perceptrons capacity via fully lifted random duality theory

no code implementations · 29 Nov 2023 · Mihailo Stojnic

In particular, we rely on \emph{fully lifted} random duality theory (fl RDT) established in \cite{Stojnicflrdt23} to create a general framework for studying the perceptrons' capacities.

Causal Inference (C-inf) -- asymmetric scenario of typical phase transitions

no code implementations · 2 Jan 2023 · Agostino Capponi, Mihailo Stojnic

In this paper, we revisit and further explore a mathematically rigorous connection between Causal inference (C-inf) and Low-rank recovery (LRR) established in [10].

Causal Inference

Causal Inference (C-inf) -- closed form worst case typical phase transitions

no code implementations · 2 Jan 2023 · Agostino Capponi, Mihailo Stojnic

In this paper we establish a mathematically rigorous connection between Causal inference (C-inf) and the low-rank recovery (LRR).

Causal Inference

Discrete perceptrons

no code implementations · 17 Jun 2013 · Mihailo Stojnic

An introductory statistical mechanics treatment of such perceptrons was given in \cite{GutSte90}.

Spherical perceptron as a storage memory with limited errors

no code implementations · 17 Jun 2013 · Mihailo Stojnic

In Gardner's original work the statistical mechanics predictions in this direction were presented as well.

A problem dependent analysis of SOCP algorithms in noisy compressed sensing

no code implementations · 29 Mar 2013 · Mihailo Stojnic

In our recent work \cite{StojnicGenSocp10} we established an alternative framework that can be used for statistical performance analysis of the SOCP algorithms.
