Search Results for author: AbdElRahman ElSaid

Found 8 papers, 1 paper with code

Colony-Enhanced Recurrent Neural Architecture Search: Collaborative Ant-Based Optimization

no code implementations • 30 Jan 2024 • AbdElRahman ElSaid

Crafting neural network architectures manually is a formidable challenge that often leads to suboptimal and inefficient structures.

Neural Architecture Search

Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search

1 code implementation • 11 May 2023 • AbdElRahman ElSaid, Karl Ricanek, Zeming Lyu, Alexander Ororbia, Travis Desell

Continuous Ant-based Topology Search (CANTS) is a previously introduced nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization (ACO).

Neural Architecture Search

Addressing Tactic Volatility in Self-Adaptive Systems Using Evolved Recurrent Neural Networks and Uncertainty Reduction Tactics

no code implementations • 21 Apr 2022 • Aizaz Ul Haq, Niranjana Deshpande, AbdElRahman ElSaid, Travis Desell, Daniel E. Krutz

Simulations using 52,106 tactic records demonstrate that: I) eRNN is an effective prediction mechanism, II) TVA-E improves on existing state-of-the-art processes in accounting for tactic volatility, and III) uncertainty reduction tactics are beneficial in accounting for tactic volatility.

Decision Making

Continuous Ant-Based Neural Topology Search

no code implementations • 21 Nov 2020 • AbdElRahman ElSaid, Joshua Karns, Zimeng Lyu, Alexander Ororbia, Travis Desell

This work introduces Continuous Ant-based Neural Topology Search (CANTS), a novel nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization. CANTS utilizes synthetic ants that move over a continuous search space according to the density and distribution of pheromones, strongly inspired by how ants move in the real world.

Neural Architecture Search • Time Series +1
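
As a rough illustration of the mechanism the abstract describes, the sketch below moves a synthetic ant through a continuous 2D space, biased toward regions of high pheromone density. It is a minimal toy: the Pheromone class, the Gaussian falloff, and the step and exploration parameters are invented for the example, not taken from the paper.

import math
import random

class Pheromone:
    def __init__(self, x, y, strength=1.0):
        self.x, self.y, self.strength = x, y, strength

    def influence(self, x, y, spread=0.5):
        # Gaussian falloff: nearby points feel a stronger pull.
        d2 = (self.x - x) ** 2 + (self.y - y) ** 2
        return self.strength * math.exp(-d2 / (2 * spread ** 2))

def move_ant(x, y, pheromones, step=0.1, explore=0.2):
    if not pheromones or random.random() < explore:
        angle = random.uniform(0, 2 * math.pi)  # pure exploration
        return x + step * math.cos(angle), y + step * math.sin(angle)
    # Exploitation: step toward the pheromone-weighted centroid.
    weights = [p.influence(x, y) for p in pheromones]
    total = sum(weights) or 1.0
    cx = sum(w * p.x for w, p in zip(weights, pheromones)) / total
    cy = sum(w * p.y for w, p in zip(weights, pheromones)) / total
    dx, dy = cx - x, cy - y
    norm = math.hypot(dx, dy) or 1.0
    return x + step * dx / norm, y + step * dy / norm

random.seed(0)
trail = [Pheromone(0.8, 0.8), Pheromone(0.6, 0.7, strength=0.5)]
x, y = 0.0, 0.0
for _ in range(25):
    x, y = move_ant(x, y, trail)
print(f"ant ended near ({x:.2f}, {y:.2f})")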

An Experimental Study of Weight Initialization and Weight Inheritance Effects on Neuroevolution

no code implementations • 21 Sep 2020 • Zimeng Lyu, AbdElRahman ElSaid, Joshua Karns, Mohamed Mkaouer, Travis Desell

Weight initialization is critical to successfully training artificial neural networks (ANNs), and even more so for recurrent neural networks (RNNs), which can easily suffer from vanishing and exploding gradients.

Evolutionary Algorithms • Neural Architecture Search
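
A minimal numpy sketch of why initialization matters for RNNs in particular: the spectral radius of the recurrent weight matrix governs how gradients grow or shrink across time steps. The Xavier-style scaling here is a standard recipe used for illustration, not necessarily one of the schemes compared in the paper.

import numpy as np

rng = np.random.default_rng(0)
hidden = 128

naive = rng.normal(0.0, 1.0, (hidden, hidden))                 # variance too large
xavier = rng.normal(0.0, np.sqrt(1.0 / hidden), (hidden, hidden))

for name, W in [("naive", naive), ("xavier", xavier)]:
    radius = max(abs(np.linalg.eigvals(W)))
    print(f"{name:>6}: spectral radius ~ {radius:.2f}")
# Backpropagation through T time steps multiplies by W repeatedly, so
# gradients scale roughly like radius**T: exploding when radius >> 1,
# vanishing when radius << 1, and stable only near 1.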

Neuroevolutionary Transfer Learning of Deep Recurrent Neural Networks through Network-Aware Adaptation

no code implementations • 4 Jun 2020 • AbdElRahman ElSaid, Joshua Karns, Alexander Ororbia II, Daniel Krutz, Zimeng Lyu, Travis Desell

Transfer learning entails taking an artificial neural network (ANN) that is trained on a source dataset and adapting it to a new target dataset.

Transfer Learning
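
The generic recipe the abstract refers to is sketched below with PyTorch: reuse a network trained on a source task, freeze its feature layers, and retrain a fresh head on the target data. The layer sizes and the 3-class target head are placeholders; the paper's actual contribution is a network-aware neuroevolutionary adaptation, not this freeze-and-retrain baseline.

import torch.nn as nn

source_model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),   # feature layers trained on the source task
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 10),              # source-task output head
)

# Freeze the transferred feature layers...
for param in source_model[:4].parameters():
    param.requires_grad = False

# ...and swap in a fresh head sized for the target task.
source_model[4] = nn.Linear(64, 3)

trainable = [n for n, p in source_model.named_parameters() if p.requires_grad]
print("parameters updated on the target dataset:", trainable)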

Improving Neuroevolution Using Island Extinction and Repopulation

no code implementations • 15 May 2020 • Zimeng Lyu, Joshua Karns, AbdElRahman ElSaid, Travis Desell

This island-based strategy is additionally compared to the speciation strategy of NEAT (NeuroEvolution of Augmenting Topologies).

Evolutionary Algorithms • Time Series +1
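
A toy sketch of the extinction-and-repopulation idea: the worst-performing island is periodically wiped out and reseeded with mutated copies of the best island's champion. The fitness function, mutation scales, and extinction interval are all invented for illustration; the paper evolves recurrent neural networks rather than the real-valued genomes used here.

import random

random.seed(0)

def fitness(genome):
    # Toy objective standing in for "train the RNN, measure validation error".
    return -sum((g - 0.5) ** 2 for g in genome)

islands = [[[random.random() for _ in range(4)] for _ in range(5)]
           for _ in range(4)]

def island_best(island):
    return max(island, key=fitness)

for generation in range(1, 51):
    # Ordinary evolution: nudge one gene per genome (toy stand-in).
    for island in islands:
        for genome in island:
            genome[random.randrange(len(genome))] += random.gauss(0, 0.02)
    if generation % 10 == 0:  # periodic extinction event
        islands.sort(key=lambda isl: fitness(island_best(isl)))
        elite = island_best(islands[-1])   # champion of the best island
        # Repopulate the worst island with mutated copies of the champion.
        islands[0] = [[g + random.gauss(0, 0.05) for g in elite]
                      for _ in range(len(islands[0]))]

champion = max((g for isl in islands for g in isl), key=fitness)
print("best fitness:", round(fitness(champion), 4))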

Optimizing Long Short-Term Memory Recurrent Neural Networks Using Ant Colony Optimization to Predict Turbine Engine Vibration

no code implementations • 10 Oct 2017 • AbdElRahman ElSaid, Travis Desell, Fatima El Jamiy, James Higgins, Brandon Wild

This research improves the performance of the most effective LSTM network design from the previous work by using a promising neuroevolution method based on ant colony optimization (ACO) to evolve and enhance the network's LSTM cell structure.
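
A loose sketch of the ACO loop such a method relies on: each candidate connection carries a pheromone value, ants sample structures in proportion to pheromone, the best-scoring structure reinforces its connections, and evaporation forgets poor choices. The candidate-edge encoding, evaporation rate, and toy fitness are assumptions for illustration; the paper evaluates real LSTM networks on engine-vibration data.

import random

random.seed(1)
candidates = [f"conn_{i}" for i in range(8)]   # hypothetical candidate edges
utility = {c: random.random() for c in candidates}
pheromone = {c: 1.0 for c in candidates}

def evaluate(structure):
    # Toy fitness standing in for training the LSTM and measuring its error.
    return sum(utility[c] for c in structure) - 0.3 * len(structure)

for iteration in range(30):
    high = max(pheromone.values())
    ants = []
    for _ in range(10):
        # Each ant includes a connection with probability tied to pheromone.
        picked = [c for c in candidates
                  if random.random() < 0.6 * pheromone[c] / high]
        ants.append(picked or [random.choice(candidates)])
    best = max(ants, key=evaluate)
    for c in pheromone:                 # evaporation forgets old paths
        pheromone[c] *= 0.9
    for c in best:                      # reinforcement rewards the best ant
        pheromone[c] += 1.0

print("strongest connections:", sorted(pheromone, key=pheromone.get)[-3:])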
