no code implementations • 22 Jan 2021 • Namig J. Guliyev, Vugar E. Ismailov
We algorithmically construct a two-hidden-layer feedforward neural network (TLFN) model whose weights are fixed as the unit coordinate vectors of the $d$-dimensional Euclidean space and which has $3d+2$ hidden neurons in total; this network can approximate any continuous $d$-variable function to arbitrary precision.
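A minimal sketch of such an architecture's shape, assuming the $3d+2$ neurons split as $3d$ in the first hidden layer (three per coordinate, each with a unit coordinate vector as its fixed weight) and $2$ in the second. The split, the plain logistic activation, and all parameter values here are illustrative assumptions; the paper's algorithmic construction of the weights and activation is not reproduced.

```python
import numpy as np

def sigma(t):
    # plain logistic sigmoid used for illustration; the paper constructs
    # a specific sigmoidal activation, which is not reproduced here
    return 1.0 / (1.0 + np.exp(-t))

def tlfn(x, theta1, W2, theta2, c):
    """Two-hidden-layer net with first-layer weights fixed to unit
    coordinate vectors (assumed layout: 3 neurons per coordinate)."""
    # first hidden layer: each neuron sees a single coordinate of x,
    # i.e. its weight vector is a unit coordinate vector e_i
    h1 = sigma(np.repeat(x, 3) - theta1)   # shape (3d,)
    # second hidden layer: 2 neurons with free weights
    h2 = sigma(W2 @ h1 - theta2)           # shape (2,)
    return c @ h2                          # scalar network output

d = 4
rng = np.random.default_rng(0)
theta1 = rng.normal(size=3 * d)            # first-layer thresholds
W2 = rng.normal(size=(2, 3 * d))           # second-layer weights
theta2 = rng.normal(size=2)                # second-layer thresholds
c = rng.normal(size=2)                     # output weights

y = tlfn(rng.normal(size=d), theta1, W2, theta2, c)
n_hidden = 3 * d + 2                       # total hidden neurons: 3d + 2
```

Only the biases, the second-layer weights, and the output weights are free parameters; the first-layer weight vectors stay fixed regardless of the target function.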
no code implementations • 21 Aug 2017 • Namig J. Guliyev, Vugar E. Ismailov
Feedforward neural networks have wide applicability in various disciplines of science due to their universal approximation property.
no code implementations • 31 Dec 2015 • Namig J. Guliyev, Vugar E. Ismailov
The possibility of approximating a continuous function on a compact subset of the real line by a feedforward single hidden layer neural network with a sigmoidal activation function has been studied in many papers.
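The kind of approximation described above can be illustrated with a minimal sketch: a single hidden layer of sigmoidal units on the compact set $[0,1]$, with the outer linear weights fitted by least squares. The target function, the shared slope, and the evenly spaced shifts are illustrative assumptions, not a construction from the papers cited.

```python
import numpy as np

def sigma(t):
    # logistic sigmoid, a standard sigmoidal activation
    return 1.0 / (1.0 + np.exp(-t))

# target continuous function on the compact set [0, 1] (assumed example)
def f(x):
    return np.sin(2 * np.pi * x) + 0.5 * x

n = 50                                     # number of hidden neurons
w = np.full(n, 40.0)                       # shared steep slope (assumed)
theta = 40.0 * np.linspace(0.0, 1.0, n)    # evenly spaced shifts

x = np.linspace(0.0, 1.0, 400)
# hidden-layer feature matrix: sigma(w_i * x - theta_i), plus a bias column
Phi = sigma(np.outer(x, w) - theta)
Phi = np.hstack([Phi, np.ones((x.size, 1))])

# fit only the outer linear weights by least squares
coef, *_ = np.linalg.lstsq(Phi, f(x), rcond=None)
err = np.max(np.abs(Phi @ coef - f(x)))    # uniform error on the grid
```

With enough hidden units the uniform error on the grid becomes small, which is the practical face of the universal approximation property studied in these papers.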