Search Results for author: Michael Bukatin

Found 7 papers, 6 papers with code

Dataflow Matrix Machines and V-values: a Bridge between Programs and Neural Nets

1 code implementation · 20 Dec 2017 · Michael Bukatin, Jon Anthony

Dataflow matrix machines (DMMs) generalize neural nets by replacing streams of numbers with linear streams (streams supporting linear combinations). This allows arbitrary input and output arities for activation functions, countable-sized networks with a finite, dynamically changeable active part capable of unbounded growth, and a very expressive self-referential mechanism.
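The core update of a generalized recurrent net of this kind can be sketched in a few lines. This is a minimal illustration, not code from the paper: streams are simplified to per-tick float values, activations to unary functions, and `dmm_step` and its parameter names are hypothetical.

```python
import math

def dmm_step(W, y, activations):
    # "down" movement: each neuron input is a linear combination of
    # the current neuron outputs, taken with the matrix W
    x = [sum(W[i][j] * y[j] for j in range(len(y))) for i in range(len(W))]
    # "up" movement: each neuron applies its own activation function
    return [f(xi) for f, xi in zip(activations, x)]

W = [[0.0, 1.0],
     [0.5, 0.0]]
y0 = [1.0, 2.0]
acts = [math.tanh, lambda v: v]   # heterogeneous activation functions
y1 = dmm_step(W, y0, acts)
```

Here replacing floats with richer linear-stream values (e.g. probability distributions or images) is what the papers' generalization amounts to; the two-phase step itself stays the same.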

Dataflow Matrix Machines as a Model of Computations with Linear Streams

1 code implementation · 3 May 2017 · Michael Bukatin, Jon Anthony

We overview dataflow matrix machines as a Turing complete generalization of recurrent neural networks and as a programming platform.

Notes on Pure Dataflow Matrix Machines: Programming with Self-referential Matrix Transformations

1 code implementation · 4 Oct 2016 · Michael Bukatin, Steve Matthews, Andrey Radul

Dataflow matrix machines are self-referential generalized recurrent neural nets.

Programming Languages
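Self-reference here means the network matrix is itself one of the streams, so the net can rewrite its own connectivity on every tick. A minimal sketch of that idea, with an assumed helper `self_ref_step` (not from the paper) that applies an additive matrix-valued update produced by the network on the previous tick:

```python
def self_ref_step(W, delta):
    # The matrix W is treated as a stream value: the network emits a
    # matrix-valued update `delta`, which is added elementwise to W,
    # changing the network's own connectivity for the next tick.
    return [[w + d for w, d in zip(row_w, row_d)]
            for row_w, row_d in zip(W, delta)]

W = [[1.0, 0.0],
     [0.0, 1.0]]
delta = [[0.0, 0.5],    # the net opens a new connection 0 -> 1
         [0.0, 0.0]]
W_next = self_ref_step(W, delta)
```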

Programming Patterns in Dataflow Matrix Machines and Generalized Recurrent Neural Nets

1 code implementation · 30 Jun 2016 · Michael Bukatin, Steve Matthews, Andrey Radul

Dataflow matrix machines arise naturally in the context of synchronous dataflow programming with linear streams.

Linear Models of Computation and Program Learning

no code implementations · 15 Dec 2015 · Michael Bukatin, Steve Matthews

We consider two classes of computations which admit taking linear combinations of execution runs: probabilistic sampling and generalized animation.

Probabilistic Programming
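One standard way a linear combination of sampling computations can be realized is as a mixture: with nonnegative weights summing to one, drawing from the mixture samples from w1·P1 + w2·P2. This is a generic sketch of that construction, not code from the paper; `mix` and its arguments are assumed names.

```python
import random

def mix(samplers, weights):
    # Returns a sampler for the mixture distribution
    # sum_i weights[i] * P_i, where P_i is the distribution
    # sampled by samplers[i]; weights must be >= 0 and sum to 1.
    def sample():
        r = random.random()
        acc = 0.0
        for s, w in zip(samplers, weights):
            acc += w
            if r < acc:
                return s()
        return samplers[-1]()  # guard against float rounding
    return sample

coin = mix([lambda: 0, lambda: 1], [0.5, 0.5])   # fair coin as a mixture
```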
