Search Results for author: Luke R Lloyd-Jones

Found 1 paper, 0 papers with code

A Universal Approximation Theorem for Mixture of Experts Models

no code implementations · 11 Feb 2016 · Hien D. Nguyen, Luke R Lloyd-Jones, Geoffrey J. McLachlan

The mixture of experts (MoE) model is a popular neural network architecture for nonlinear regression and classification.

General Classification · Regression
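
To make the architecture described in the abstract concrete, here is a minimal sketch of an MoE regression model: a softmax gate mixes the outputs of several simple experts. All names and parameter choices (K experts, linear experts, the `moe_predict` helper) are illustrative assumptions, and this shows only the model's functional form, not the estimation procedure or the approximation result from the paper.

```python
import numpy as np

# Hypothetical MoE sketch for 1-D regression: K linear experts
# combined by a softmax gate. Parameters are random placeholders.
rng = np.random.default_rng(0)
K = 3                             # number of experts (assumption)
gate_w = rng.normal(size=K)       # gating weights
gate_b = rng.normal(size=K)       # gating biases
expert_w = rng.normal(size=K)     # expert slopes
expert_b = rng.normal(size=K)     # expert intercepts

def moe_predict(x):
    """Mean output f(x) = sum_k g_k(x) * m_k(x) for scalar inputs x."""
    x = np.asarray(x, dtype=float)[..., None]   # shape (..., 1)
    logits = gate_w * x + gate_b                # gate scores, shape (..., K)
    gates = np.exp(logits - logits.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)  # softmax gating g_k(x)
    experts = expert_w * x + expert_b           # expert means m_k(x)
    return (gates * experts).sum(axis=-1)       # convex combination of experts

print(moe_predict([0.0, 0.5, 1.0]))
```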
