no code implementations • 6 Jul 2017 • Varun Kumar Ojha, Ajith Abraham, Vaclav Snasel
A comprehensive performance analysis of the underlying ACO parameters (selection strategy, distance metric, and pheromone evaporation rate) suggests that the Roulette Wheel Selection strategy enhances the performance of the ACO due to its ability to provide non-uniformity and adequate diversity in the selection of a solution.
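Roulette wheel selection picks a candidate with probability proportional to its fitness, so fitter solutions are favored while weaker ones retain a non-zero chance, which is the diversity property noted above. A minimal sketch (the function name and proportional-to-fitness setup are illustrative, not taken from the paper):

```python
import random

def roulette_wheel_select(solutions, fitness):
    """Pick one solution with probability proportional to its fitness.

    Fitter solutions are favored, but weaker ones keep a non-zero
    chance, preserving diversity in the selection.
    """
    total = sum(fitness)
    r = random.uniform(0.0, total)
    cumulative = 0.0
    for sol, f in zip(solutions, fitness):
        cumulative += f
        if r <= cumulative:
            return sol
    return solutions[-1]  # guard against floating-point rounding
```

In ACO, the fitness values would typically be pheromone-weighted desirabilities of candidate moves; any positive scores work with this sketch.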
no code implementations • 6 Jul 2017 • Varun Kumar Ojha, Ajith Abraham, Vaclav Snasel
Optimization of a neural network (NN) is significantly influenced by the transfer function used in its active nodes.
no code implementations • 6 Jul 2017 • Varun Kumar Ojha, Paramartha Dutta, Atal Chaudhuri, Hiranmay Saha
Cross-sensitivity is a crucial issue in this problem, which is viewed as a combined pattern recognition and noise reduction problem.
1 code implementation • 16 May 2017 • Varun Kumar Ojha, Ajith Abraham, Václav Snášel
Machine learning algorithms are inherently multiobjective in nature, where approximation-error minimization and model-complexity reduction are two conflicting objectives.
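With two conflicting objectives such as error and complexity, no single model is "best"; candidates are instead compared by Pareto dominance. A minimal sketch of that relation (the function name is illustrative; both objectives are assumed to be minimized):

```python
def dominates(a, b):
    """Return True if objective vector `a` Pareto-dominates `b`.

    With all objectives minimized, `a` dominates `b` when it is no
    worse in every objective and strictly better in at least one.
    """
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))
```

Models whose (error, complexity) vectors are dominated by no other candidate form the Pareto front, which is what a multiobjective optimizer approximates.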
no code implementations • 16 May 2017 • Varun Kumar Ojha, Ajith Abraham, Václav Snášel
FNN optimization is often viewed from various perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, learning environment, etc.
no code implementations • 16 May 2017 • Varun Kumar Ojha, Serena Schiano, Chuan-Yu Wu, Václav Snášel, Ajith Abraham
It was found that the FNT-based CI model (for both CV methods) performed much better than other CI models.
no code implementations • 16 May 2017 • Varun Kumar Ojha, Paramartha Dutta, Atal Chaudhuri
The primary goal of this work was to identify the hazardousness of sewer pipelines and thereby offer safe, non-hazardous access to sewer pipeline workers, so that human fatalities caused by toxic exposure to sewer gas components can be avoided.
1 code implementation • 16 May 2017 • Varun Kumar Ojha, Vaclav Snasel, Ajith Abraham
Hence, the proposed HFIT is an efficient and competitive alternative to the other FISs for function approximation and feature selection.