no code implementations • 13 May 2022 • Orsolya Csiszár, Luca Sára Pusztaházi, Lehel Dénes-Fazakas, Michael S. Gashler, Vladik Kreinovich, Gábor Csiszár
We present a deep learning model for finding human-understandable connections between input features.
no code implementations • 19 Oct 2018 • Luke B. Godfrey, Michael S. Gashler
We present windowed product unit neural networks (WPUNNs), a simple method that leverages multiplication as a nonlinearity in a neural network.
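For context, a classical product unit (in the sense of Durbin and Rumelhart) computes a weighted product of its inputs rather than a weighted sum. The sketch below shows only that classical unit, which WPUNNs build on; the windowing mechanism itself is not reproduced here.

```python
import numpy as np

def product_unit(x, w):
    # Classical product unit: prod_i x_i ** w_i,
    # computed in log space for numerical stability; assumes x > 0.
    return np.exp(np.sum(w * np.log(x)))

# Example: 2**1 * 4**0.5 = 4
value = product_unit(np.array([2.0, 4.0]), np.array([1.0, 0.5]))
```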
no code implementations • 28 Aug 2017 • Luke B. Godfrey, Michael S. Gashler
We present a deep learning architecture for learning fuzzy logic expressions.
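As a rough illustration of the kind of differentiable fuzzy operators such an architecture can compose (the paper's specific operator choices are not reproduced here), the product t-norm family gives smooth, trainable versions of AND, OR, and NOT:

```python
def fuzzy_and(a, b):
    # Product t-norm: a common differentiable fuzzy AND
    return a * b

def fuzzy_or(a, b):
    # Probabilistic sum, the t-conorm dual to the product t-norm
    return a + b - a * b

def fuzzy_not(a):
    # Standard fuzzy negation
    return 1.0 - a
```

Because each operator is smooth in its arguments, gradients flow through an expression tree built from them, which is what makes learning fuzzy logic expressions by backpropagation possible.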
no code implementations • 25 May 2017 • Luke B. Godfrey, Michael S. Gashler
We present a neural network technique for the analysis and extrapolation of time-series data called Neural Decomposition (ND).
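A model in the spirit of ND, a sum of sinusoids plus a non-periodic "augmentation" term that permits extrapolation beyond the training range, can be sketched as follows. All parameter values here are invented for illustration; in the paper they are learned from data.

```python
import numpy as np

# Illustrative parameters (made up): two sinusoidal components
# plus a linear trend term.
amps = np.array([1.0, 0.5])
freqs = np.array([2 * np.pi, 4 * np.pi])
phases = np.array([0.0, np.pi / 2])
slope, intercept = 0.3, 0.1

def nd_model(t):
    # Sum of sinusoids (periodic part) plus a linear augmentation term.
    periodic = np.sum(
        amps[:, None] * np.sin(freqs[:, None] * t + phases[:, None]), axis=0
    )
    return periodic + slope * t + intercept

# Because the model is an explicit function of t, it can be evaluated
# past the end of the training interval to extrapolate.
future = nd_model(np.linspace(1.0, 2.0, 100))
```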
1 code implementation • 3 Feb 2016 • Luke B. Godfrey, Michael S. Gashler
We present the soft exponential activation function for artificial neural networks, which continuously interpolates between logarithmic, linear, and exponential functions.
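The function has a simple piecewise closed form, parameterized by a single value alpha that selects where on the logarithm-to-exponential continuum the unit sits:

```python
import math

def soft_exponential(alpha, x):
    # Soft exponential activation: interpolates between
    # logarithm (alpha < 0), identity (alpha == 0), and
    # exponential (alpha > 0). Both outer branches approach
    # f(x) = x as alpha approaches 0, so the family is continuous.
    if alpha < 0:
        return -math.log(1 - alpha * (x + alpha)) / alpha
    if alpha == 0:
        return x
    return (math.exp(alpha * x) - 1) / alpha + alpha
```

At alpha = 1 the unit computes exp(x); at alpha = -1 it computes ln(x); in between it smoothly blends the two, and alpha can be trained by gradient descent like any other parameter.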
no code implementations • 31 Jul 2015 • Michael S. Gashler, Zachariah Kindle, Michael R. Smith
From this perspective, MANIC offers an alternate approach to a long-standing objective of artificial intelligence.
no code implementations • 9 May 2014 • Michael S. Gashler, Stephen C. Ashmore
We present a method for training a deep neural network containing sinusoidal activation functions to fit to time-series data.
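One way to sketch such a model is below. For brevity this sketch fixes random hidden sinusoids and solves only the linear output layer by least squares; the paper instead trains all parameters (including frequencies and phases) by backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy time series: one seasonal component plus a slow trend.
t = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * 3 * t) + 0.5 * t

# Hidden layer of sinusoidal units with fixed random frequencies
# and phases (sizes and ranges are illustrative, not from the paper).
W = rng.uniform(0, 40, size=64)        # hidden frequencies
b = rng.uniform(0, 2 * np.pi, size=64) # hidden phases
H = np.sin(t[:, None] * W + b)         # sinusoidal hidden activations

# Solve the linear output layer in closed form.
coef, *_ = np.linalg.lstsq(H, y, rcond=None)
fit = H @ coef
```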
no code implementations • 19 Dec 2013 • Michael S. Gashler, Michael R. Smith, Richard Morris, Tony Martinez
We evaluate UBP with the task of imputing missing values in datasets, and show that UBP is able to predict missing values with significantly lower sum-squared error than other collaborative filtering and imputation techniques.
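The idea behind unsupervised backpropagation (UBP) can be loosely illustrated as follows: jointly learn latent input vectors and model weights by gradient descent so that the model reconstructs the observed entries, then read off predictions for the missing ones. This toy uses a linear model, which reduces to matrix factorization; UBP itself uses a multilayer network.

```python
import numpy as np

rng = np.random.default_rng(1)

# Rank-1 toy data with one entry treated as missing.
X = np.outer(np.arange(1.0, 5.0), np.arange(1.0, 4.0))  # 4x3 matrix
mask = np.ones_like(X, dtype=bool)
mask[2, 1] = False  # pretend this value (true value 6.0) is unobserved

# Jointly learn latent inputs V and weights W on observed entries only.
k = 1
V = rng.normal(size=(4, k)) * 0.1
W = rng.normal(size=(k, 3)) * 0.1
lr = 0.02
for _ in range(5000):
    E = (V @ W - X) * mask  # reconstruction error, observed cells only
    V -= lr * E @ W.T       # gradient step on latent inputs
    W -= lr * V.T @ E       # gradient step on weights

imputed = (V @ W)[2, 1]     # model's prediction for the missing cell
```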