Search Results for author: Michael S. Gashler

Found 8 papers, 1 paper with code

Leveraging Product as an Activation Function in Deep Networks

no code implementations · 19 Oct 2018 · Luke B. Godfrey, Michael S. Gashler

We present windowed product unit neural networks (WPUNNs), a simple method of leveraging product as a nonlinearity in a neural network.
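
The sketch below is only an illustration of the general idea of using a product over a window of unit inputs as a nonlinearity; it is an assumption about how such a layer might look and not the exact WPUNN formulation from the paper. The window size and weight shapes are arbitrary.

    import numpy as np

    def windowed_product(x, window=2):
        # Product of the values inside each sliding window along the last axis.
        n = x.shape[-1] - window + 1
        return np.stack([np.prod(x[..., i:i + window], axis=-1) for i in range(n)], axis=-1)

    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(8, 4))    # hypothetical layer weights
    x = rng.normal(size=(3, 4))               # a small batch of inputs
    h = windowed_product(x @ W.T, window=2)   # linear transform, then product nonlinearity
    print(h.shape)                            # (3, 7)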

Neural Decomposition of Time-Series Data for Effective Generalization

no code implementations · 25 May 2017 · Luke B. Godfrey, Michael S. Gashler

We present a neural network technique for the analysis and extrapolation of time-series data called Neural Decomposition (ND).

Time Series · Time Series Forecasting
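
As a rough illustration of the decomposition idea (a sum of sinusoids plus a trend, fit on the observed window and then extrapolated), here is a NumPy sketch. Neural Decomposition itself learns these components, including the frequencies, by backpropagation; the fixed Fourier basis and least-squares fit below are simplifications, not the paper's algorithm.

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(100)
    y = 0.05 * t + np.sin(2 * np.pi * t / 25) + 0.1 * rng.normal(size=t.size)

    k = np.arange(1, 11)                           # candidate frequencies over the training window
    def design(ts):
        cols = [np.ones(ts.size), ts.astype(float)]            # bias and linear trend
        cols += [np.sin(2 * np.pi * f * ts / 100.0) for f in k]
        cols += [np.cos(2 * np.pi * f * ts / 100.0) for f in k]
        return np.stack(cols, axis=1)

    coef, *_ = np.linalg.lstsq(design(t), y, rcond=None)
    forecast = design(np.arange(100, 150)) @ coef  # extrapolate past the training range
    print(forecast[:5])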

A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks

1 code implementation · 3 Feb 2016 · Luke B. Godfrey, Michael S. Gashler

We present the soft exponential activation function for artificial neural networks that continuously interpolates between logarithmic, linear, and exponential functions.
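
A small NumPy sketch of the soft exponential activation in its commonly cited piecewise form, where α < 0 gives a logarithmic regime, α = 0 is the identity, and α > 0 gives an exponential regime; the exact statement should be checked against the paper itself.

    import numpy as np

    def soft_exponential(x, alpha):
        # alpha < 0: logarithmic regime; alpha == 0: identity; alpha > 0: exponential regime.
        if alpha < 0:
            return -np.log(1.0 - alpha * (x + alpha)) / alpha
        if alpha == 0:
            return x
        return (np.exp(alpha * x) - 1.0) / alpha + alpha

    x = np.linspace(-1.0, 1.0, 5)
    for a in (-0.5, 0.0, 0.5):
        print(a, soft_exponential(x, a))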

A Minimal Architecture for General Cognition

no code implementations · 31 Jul 2015 · Michael S. Gashler, Zachariah Kindle, Michael R. Smith

From this perspective, MANIC offers an alternate approach to a long-standing objective of artificial intelligence.

Training Deep Fourier Neural Networks To Fit Time-Series Data

no code implementations · 9 May 2014 · Michael S. Gashler, Stephen C. Ashmore

We present a method for training a deep neural network containing sinusoidal activation functions to fit to time-series data.

Time Series · Time Series Analysis
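
The PyTorch sketch below (not the authors' implementation, and shallow rather than deep) shows the basic ingredient: hidden units with sinusoidal activations fit to a toy series by gradient descent, with the frequencies living in the learned input weights.

    import torch

    torch.manual_seed(0)
    t = torch.linspace(0, 1, 200).unsqueeze(1)          # time axis
    y = torch.sin(12.0 * t) + 0.3 * t                   # toy series: periodic component plus a trend

    w1 = torch.nn.Parameter(torch.randn(1, 32))         # hidden layer with sine activations
    b1 = torch.nn.Parameter(torch.rand(32) * 6.28)      # random phases
    w2 = torch.nn.Parameter(torch.randn(32, 1) * 0.1)   # linear output layer
    b2 = torch.nn.Parameter(torch.zeros(1))
    opt = torch.optim.Adam([w1, b1, w2, b2], lr=0.01)

    for step in range(2000):
        pred = torch.sin(t @ w1 + b1) @ w2 + b2
        loss = torch.mean((pred - y) ** 2)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(loss.item())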

Missing Value Imputation With Unsupervised Backpropagation

no code implementations · 19 Dec 2013 · Michael S. Gashler, Michael R. Smith, Richard Morris, Tony Martinez

We evaluate UBP (unsupervised backpropagation) with the task of imputing missing values in datasets, and show that UBP is able to predict missing values with significantly lower sum-squared error than other collaborative filtering and imputation techniques.

Collaborative Filtering · General Classification · +1
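
The general idea behind this style of imputation can be sketched as learning a latent vector for each row jointly with a decoder network, training only on the observed cells and reading predictions off for the missing ones. The PyTorch code below is an illustration of that idea under those assumptions, not the UBP algorithm as described in the paper, whose training details are not reproduced here.

    import torch

    torch.manual_seed(0)
    X = torch.rand(50, 8)                                # a toy data matrix
    mask = torch.rand_like(X) > 0.2                      # True where a value is observed

    latents = torch.nn.Parameter(torch.randn(50, 3) * 0.1)   # one latent vector per row
    decoder = torch.nn.Sequential(torch.nn.Linear(3, 16), torch.nn.Tanh(), torch.nn.Linear(16, 8))
    opt = torch.optim.Adam([latents, *decoder.parameters()], lr=0.01)

    for step in range(2000):
        pred = decoder(latents)
        loss = ((pred - X)[mask] ** 2).mean()            # sum-squared-style error on observed cells only
        opt.zero_grad()
        loss.backward()
        opt.step()

    imputed = torch.where(mask, X, decoder(latents).detach())   # fill in the missing cells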
