11 Feb 2016 • Hien D. Nguyen, Luke R. Lloyd-Jones, Geoffrey J. McLachlan
The mixture of experts (MoE) model is a popular neural network architecture for nonlinear regression and classification.
General Classification • Regression
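As a rough illustration of the MoE idea mentioned in the abstract (not code from the paper), the following is a minimal sketch assuming linear experts and a softmax gating network; the class name, shapes, and single-layer setup are all illustrative assumptions.

```python
# Minimal mixture-of-experts (MoE) regressor sketch: linear experts
# combined by a softmax gate. All names and shapes are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # stabilize exponentials
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MoERegressor:
    def __init__(self, n_experts, n_features):
        # Each expert is a linear model; the gate is also linear in x.
        self.W_exp = rng.normal(size=(n_experts, n_features))   # expert weights
        self.b_exp = np.zeros(n_experts)                        # expert biases
        self.W_gate = rng.normal(size=(n_features, n_experts))  # gating weights

    def predict(self, X):
        # Gate probabilities: each expert's contribution per input.
        gates = softmax(X @ self.W_gate)        # shape (n, k)
        # Every expert's prediction for every input.
        preds = X @ self.W_exp.T + self.b_exp   # shape (n, k)
        # MoE output: gate-weighted mixture of expert outputs.
        return (gates * preds).sum(axis=1)      # shape (n,)

# Usage: predictions from 3 (untrained) experts over 2 features.
X = rng.normal(size=(5, 2))
model = MoERegressor(n_experts=3, n_features=2)
print(model.predict(X))
```

Here the weights are random placeholders; in practice they would be fit, e.g. by maximum likelihood via an EM-type or gradient-based procedure.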