Search Results for author: Travis Desell

Found 25 papers, 4 papers with code

Transfer Learning Methods for Domain Adaptation in Technical Logbook Datasets

no code implementations • LREC 2022 • Farhad Akhbardeh, Marcos Zampieri, Cecilia Ovesdotter Alm, Travis Desell

Event identification in technical logbooks poses challenges given the limited logbook data available in specific technical domains, the large set of possible classes, and logbook entries typically being in short form and non-standard technical language.

Domain Adaptation • Transfer Learning

Neuro-mimetic Task-free Unsupervised Online Learning with Continual Self-Organizing Maps

no code implementations • 19 Feb 2024 • Hitesh Vaidya, Travis Desell, Ankur Mali, Alexander Ororbia

The major challenge that makes crafting such a system difficult is known as catastrophic forgetting - an agent, such as one based on artificial neural networks (ANNs), struggles to retain previously acquired knowledge when learning from new samples.

Minimally Supervised Topological Projections of Self-Organizing Maps for Phase of Flight Identification

no code implementations • 17 Feb 2024 • Zimeng Lyu, Pujan Thapa, Travis Desell

General aviation flight data for phase of flight identification is usually per-second data, comes on a large scale, and is class imbalanced.

Minimally Supervised Learning using Topological Projections in Self-Organizing Maps

no code implementations • 12 Jan 2024 • Zimeng Lyu, Alexander Ororbia, Rui Li, Travis Desell

In this work, we introduce a semi-supervised learning approach based on topological projections in self-organizing maps (SOMs), which significantly reduces the required number of labeled data points to perform parameter prediction, effectively exploiting information contained in large unlabeled datasets.

Decision Making • Parameter Prediction +1
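The approach described above can be made concrete with a minimal sketch: train a small SOM on unlabeled data, project a handful of labeled points onto their best-matching units (BMUs), and classify new points by their nearest labeled prototype. This is a simplified illustration under assumed toy data, not the paper's exact projection procedure.

```python
import math
import random

random.seed(0)

def dist(a, b):
    return math.dist(a, b)

# Toy 1-D data drawn from two clusters; only two points are labeled.
data = [(random.gauss(0.0, 0.3),) for _ in range(50)] + \
       [(random.gauss(5.0, 0.3),) for _ in range(50)]
labeled = [((0.0,), "A"), ((5.0,), "B")]  # tiny labeled set

# Train a small 1-D SOM (a line of prototype nodes) on the unlabeled data.
nodes = [(random.uniform(-1.0, 6.0),) for _ in range(8)]
for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)
    for x in data:
        bmu = min(range(len(nodes)), key=lambda i: dist(nodes[i], x))
        for i in range(len(nodes)):
            h = math.exp(-abs(i - bmu))  # neighborhood decay on the grid
            nodes[i] = tuple(n + lr * h * (xi - n)
                             for n, xi in zip(nodes[i], x))

# Project the few labeled points onto the map: each labels its BMU.
node_label = {}
for x, y in labeled:
    bmu = min(range(len(nodes)), key=lambda i: dist(nodes[i], x))
    node_label[bmu] = y

def predict(x):
    # Nearest labeled prototype decides the class.
    i = min(node_label, key=lambda i: dist(nodes[i], x))
    return node_label[i]

print(predict((0.2,)), predict((4.8,)))
```

With two labeled points and 100 unlabeled ones, the SOM's topology carries the label information to the rest of the input space.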

Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search

1 code implementation • 11 May 2023 • AbdElRahman ElSaid, Karl Ricanek, Zeming Lyu, Alexander Ororbia, Travis Desell

Continuous Ant-based Topology Search (CANTS) is a previously introduced novel nature-inspired neural architecture search (NAS) algorithm that is based on ant colony optimization (ACO).

Neural Architecture Search

Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting

no code implementations • 20 Feb 2023 • Zimeng Lyu, Alexander Ororbia, Travis Desell

Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods, including online linear regression, fixed long short-term memory (LSTM) and gated recurrent unit (GRU) models trained online, as well as state-of-the-art, online ARIMA strategies.

Neural Architecture Search • Time Series +1
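One of the baselines named above, an online linear (autoregressive) regression model, can be sketched with plain SGD: the model predicts one step ahead and is updated after every new observation. The lag count and learning rate here are illustrative choices, not values from the paper.

```python
import math

LAGS, LR = 3, 0.01
w = [0.0] * LAGS
b = 0.0

def predict(history):
    # Linear combination of the most recent LAGS values (newest first).
    return b + sum(wi * x for wi, x in zip(w, history[-LAGS:][::-1]))

def update(history, target):
    # One SGD step on squared one-step-ahead error.
    global b, w
    err = predict(history) - target
    lags = history[-LAGS:][::-1]
    b -= LR * err
    w = [wi - LR * err * x for wi, x in zip(w, lags)]

# Stream a simple series; train strictly online, one step ahead.
series = [math.sin(0.2 * t) for t in range(500)]
errs = []
for t in range(LAGS, len(series) - 1):
    errs.append(abs(predict(series[:t + 1]) - series[t + 1]))
    update(series[:t + 1], series[t + 1])

print(f"mean abs error, first 100 steps: {sum(errs[:100]) / 100:.4f}, "
      f"last 100 steps: {sum(errs[-100:]) / 100:.4f}")
```

The one-step error shrinks as the online updates accumulate, which is the behavior ONE-NAS is compared against.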

A Large-Scale Annotated Multivariate Time Series Aviation Maintenance Dataset from the NGAFID

1 code implementation • 13 Oct 2022 • Hong Yang, Travis Desell

To overcome this, we use the National General Aviation Flight Information Database (NGAFID), which contains flights recorded during regular operation of aircraft, and maintenance logs to construct a part failure dataset.

Management • Time Series +1
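The dataset construction described above can be illustrated in miniature: flights are joined to maintenance log entries by aircraft, and every flight flown within some window before a recorded part failure is labeled as a pre-failure example. The field names and the 2-day window here are illustrative assumptions, not the NGAFID schema.

```python
from datetime import date, timedelta

WINDOW = timedelta(days=2)  # assumed labeling window, for illustration only

flights = [  # (aircraft_id, flight_date)
    ("N123", date(2020, 5, 1)),
    ("N123", date(2020, 5, 3)),
    ("N456", date(2020, 5, 3)),
]
maintenance = [  # (aircraft_id, failure_date, part)
    ("N123", date(2020, 5, 4), "vacuum pump"),
]

def label(flight):
    tail, day = flight
    for m_tail, m_day, _part in maintenance:
        # Flight counts as pre-failure if the same aircraft had a part
        # failure recorded within WINDOW days after the flight.
        if tail == m_tail and timedelta(0) <= m_day - day <= WINDOW:
            return 1
    return 0

labels = [label(f) for f in flights]
print(labels)  # only N123's flight on 5/3 falls inside the 2-day window
```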

Addressing Tactic Volatility in Self-Adaptive Systems Using Evolved Recurrent Neural Networks and Uncertainty Reduction Tactics

no code implementations • 21 Apr 2022 • Aizaz Ul Haq, Niranjana Deshpande, AbdElRahman ElSaid, Travis Desell, Daniel E. Krutz

Simulations using 52,106 tactic records demonstrate that: I) eRNN is an effective prediction mechanism, II) TVA-E represents an improvement over existing state-of-the-art processes in accounting for tactic volatility, and III) uncertainty reduction tactics are beneficial in accounting for tactic volatility.

Decision Making

ONE-NAS: An Online NeuroEvolution based Neural Architecture Search for Time Series Forecasting

no code implementations • 27 Feb 2022 • Zimeng Lyu, Travis Desell

Time series forecasting (TSF) is one of the most important tasks in data science, as accurate time series (TS) predictions can drive and advance a wide variety of domains including finance, transportation, health care, and power systems.

Neural Architecture Search • Time Series +1

Robust Augmentation for Multivariate Time Series Classification

no code implementations • 27 Jan 2022 • Hong Yang, Travis Desell

We also show that augmentation improves accuracy for recurrent and self-attention-based architectures.

Classification • Time Series +2
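A minimal sketch of one common multivariate-time-series augmentation, Gaussian jitter: each channel at each timestep gets small additive noise, producing label-preserving variants for training. The noise scale is an illustrative assumption, not a value from the paper.

```python
import random

random.seed(1)

def jitter(series, sigma=0.05):
    # series: list of timesteps, each a list of channel values.
    # Returns a perturbed copy; the class label is assumed unchanged.
    return [[x + random.gauss(0.0, sigma) for x in step] for step in series]

original = [[1.0, 0.0], [0.9, 0.1], [0.8, 0.2]]  # 3 timesteps, 2 channels
augmented = jitter(original)
print(augmented)
```

The same shape is preserved, so the augmented series can be fed to any classifier that accepts the original.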

Reducing Catastrophic Forgetting in Self Organizing Maps with Internally-Induced Generative Replay

no code implementations • 9 Dec 2021 • Hitesh Vaidya, Travis Desell, Alexander Ororbia

A lifelong learning agent is able to continually learn from potentially infinite streams of sensory pattern data.

Dimensionality Reduction

Predictive Maintenance for General Aviation Using Convolutional Transformers

no code implementations • 7 Oct 2021 • Hong Yang, Aidan LaBella, Travis Desell

However, the development of such systems has been limited due to a lack of publicly labeled multivariate time series (MTS) sensor data.

Classification • Computational Efficiency +1

Handling Extreme Class Imbalance in Technical Logbook Datasets

no code implementations • ACL 2021 • Farhad Akhbardeh, Cecilia Ovesdotter Alm, Marcos Zampieri, Travis Desell

In this paper we focus on the problem of technical issue classification by considering logbook datasets from the automotive, aviation, and facilities maintenance domains.
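A generic illustration of one standard response to extreme class imbalance: inverse-frequency class weights, which scale each class's contribution to a training loss by how rare it is. This is a textbook technique shown for context, not necessarily the strategy evaluated in the paper above; the class names and counts are invented.

```python
from collections import Counter

# Hypothetical logbook labels with a 90/9/1 imbalance.
labels = ["engine"] * 90 + ["avionics"] * 9 + ["landing_gear"] * 1
counts = Counter(labels)
n, k = len(labels), len(counts)

# Weight each class by n / (k * count): rare classes get large weights,
# and the weighted class counts sum back to n.
weights = {c: n / (k * cnt) for c, cnt in counts.items()}
print(weights)
```

Multiplying each example's loss by its class weight makes the rare "landing_gear" entries count roughly as much in aggregate as the common "engine" ones.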

Continuous Ant-Based Neural Topology Search

no code implementations • 21 Nov 2020 • AbdElRahman ElSaid, Joshua Karns, Zimeng Lyu, Alexander Ororbia, Travis Desell

This work introduces Continuous Ant-based Neural Topology Search (CANTS), a novel nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization. CANTS utilizes synthetic ants that move over a continuous search space based on the density and distribution of pheromones, and is strongly inspired by how ants move in the real world.

Neural Architecture Search • Time Series +1
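The core mechanic described above, ants choosing points in a continuous space in proportion to pheromone density, can be sketched as follows. Modeling the density as a sum of Gaussian kernels around previously rewarded points, and sampling candidates uniformly before a weighted choice, are illustrative assumptions rather than the paper's exact procedure.

```python
import math
import random

random.seed(2)

# Locations that earlier "good" ants deposited pheromone at (assumed).
pheromone_points = [(0.2, 0.8), (0.7, 0.3)]

def density(p, width=0.1):
    # Pheromone density at point p: sum of Gaussian kernels.
    return sum(math.exp(-((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)
                        / (2 * width ** 2))
               for q in pheromone_points)

def ant_step(n_candidates=50):
    # Sample candidate points, then pick one with probability
    # proportional to the pheromone density at each candidate.
    cands = [(random.random(), random.random()) for _ in range(n_candidates)]
    weights = [density(c) for c in cands]
    return random.choices(cands, weights=weights, k=1)[0]

steps = [ant_step() for _ in range(200)]
near = sum(1 for p in steps if density(p) > density((0.5, 0.5)))
print(f"{near}/200 steps landed in denser regions than the centre point")
```

Because the choice is density-weighted, the ants concentrate their exploration around previously rewarded regions while still occasionally wandering elsewhere.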

An Experimental Study of Weight Initialization and Weight Inheritance Effects on Neuroevolution

no code implementations • 21 Sep 2020 • Zimeng Lyu, AbdElRahman ElSaid, Joshua Karns, Mohamed Mkaouer, Travis Desell

Weight initialization is critical in being able to successfully train artificial neural networks (ANNs), and even more so for recurrent neural networks (RNNs) which can easily suffer from vanishing and exploding gradients.

Evolutionary Algorithms • Neural Architecture Search
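Why initialization matters for recurrent computation can be shown in a few lines: iterating h ← tanh(W·h) with a badly scaled W lets the signal die out, while a Glorot/Xavier-style scale (std = sqrt(1/fan_in)) keeps activations alive across many timesteps. The sizes and step count are illustrative, not from the paper's experiments.

```python
import math
import random

random.seed(3)

N, STEPS = 64, 50

def run(std):
    # Random recurrent weight matrix with the given std, iterated STEPS times.
    W = [[random.gauss(0, std) for _ in range(N)] for _ in range(N)]
    h = [1.0] * N
    for _ in range(STEPS):
        h = [math.tanh(sum(W[i][j] * h[j] for j in range(N)))
             for i in range(N)]
    return sum(abs(x) for x in h) / N  # mean activation magnitude

tiny = run(0.001)                    # too small: signal vanishes
scaled = run(math.sqrt(1.0 / N))     # Glorot-style scaling: signal survives
print(f"tiny-init mean |h| = {tiny:.2e}, scaled-init mean |h| = {scaled:.3f}")
```

The same contraction that kills the forward signal also kills gradients in backpropagation through time, which is the vanishing-gradient problem the snippet above refers to.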

Neuroevolutionary Transfer Learning of Deep Recurrent Neural Networks through Network-Aware Adaptation

no code implementations • 4 Jun 2020 • AbdElRahman ElSaid, Joshua Karns, Alexander Ororbia II, Daniel Krutz, Zimeng Lyu, Travis Desell

Transfer learning entails taking an artificial neural network (ANN) that is trained on a source dataset and adapting it to a new target dataset.

Transfer Learning
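Transfer learning in its simplest form: copy the weights of a network trained on the source dataset, freeze the early feature-extracting layer, and fine-tune only the output layer on target data. This tiny two-layer model and all its numbers are illustrative assumptions for that generic recipe, not the network-aware adaptation proposed in the paper above.

```python
def hidden(w1, x):
    # Frozen first layer with ReLU activations.
    return [max(0.0, sum(wi * xi for wi, xi in zip(row, x))) for row in w1]

def forward(w1, w2, x):
    return sum(wi * hi for wi, hi in zip(w2, hidden(w1, x)))

# Pretend these first-layer weights came from training on the source dataset.
w1_source = [[0.6, -0.2, 0.1], [-0.4, 0.5, 0.3],
             [0.2, 0.2, -0.5], [0.1, -0.3, 0.4]]
w2 = [0.1, -0.2, 0.3, 0.0]  # output layer, copied then fine-tuned below

# Target task: only w2 is updated by SGD; w1_source stays frozen.
target = [([1.0, 0.5, -0.2], 2.0), ([-0.3, 0.8, 0.1], -1.0)]
for _ in range(200):
    for x, y in target:
        h = hidden(w1_source, x)
        err = forward(w1_source, w2, x) - y
        w2 = [wi - 0.1 * err * hi for wi, hi in zip(w2, h)]

print([round(forward(w1_source, w2, x), 3) for x, _ in target])
```

After fine-tuning, the frozen source features plus the retrained output layer fit the target examples, which is the baseline behavior any smarter adaptation scheme must improve on.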

MaintNet: A Collaborative Open-Source Library for Predictive Maintenance Language Resources

no code implementations • COLING 2020 • Farhad Akhbardeh, Travis Desell, Marcos Zampieri

Furthermore, it provides a way to encourage discussion on and sharing of new datasets and tools for logbook data analysis.


Improving Neuroevolution Using Island Extinction and Repopulation

no code implementations • 15 May 2020 • Zimeng Lyu, Joshua Karns, AbdElRahman ElSaid, Travis Desell

This island based strategy is additionally compared to NEAT's (NeuroEvolution of Augmenting Topologies) speciation strategy.

Evolutionary Algorithms • Time Series +1

Improving the Decision-Making Process of Self-Adaptive Systems by Accounting for Tactic Volatility

no code implementations • 23 Apr 2020 • Jeffrey Palmerino, Qi Yu, Travis Desell, Daniel E. Krutz

Unfortunately, current self-adaptive approaches do not account for tactic volatility in their decision-making processes, and merely assume that tactics do not experience volatility.

Decision Making • Time Series +1

Investigating Recurrent Neural Network Memory Structures using Neuro-Evolution

1 code implementation • 6 Feb 2019 • Alexander Ororbia, AbdElRahman ElSaid, Travis Desell

This paper presents a new algorithm, Evolutionary eXploration of Augmenting Memory Models (EXAMM), which is capable of evolving recurrent neural networks (RNNs) using a wide variety of memory structures, such as Delta-RNN, GRU, LSTM, MGU and UGRNN cells.

Time Series • Time Series Analysis
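One of the memory structures named above, the GRU, can be made concrete with a single scalar cell step: an update gate blends the old state with a candidate computed through a reset gate. The weight values here are illustrative numbers, not learned parameters from EXAMM.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(h, x, w):
    z = sigmoid(w["wz"] * x + w["uz"] * h + w["bz"])    # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h + w["br"])    # reset gate
    h_cand = math.tanh(w["wh"] * x + w["uh"] * (r * h) + w["bh"])
    return (1 - z) * h + z * h_cand                     # blend old/new state

# Illustrative weights; EXAMM would evolve both the cell choice and topology.
w = dict(wz=0.5, uz=0.1, bz=0.0, wr=0.5, ur=0.1, br=0.0,
         wh=1.0, uh=0.5, bh=0.0)
h = 0.0
for x in [1.0, 0.0, 0.0, 0.0]:  # a single input spike, then silence
    h = gru_step(h, x, w)
print(round(h, 4))  # the gated state decays gradually after the spike
```

The gating is what distinguishes these cells from a plain recurrent unit: the update gate lets the state persist across timesteps instead of being overwritten.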

Accelerating the Evolution of Convolutional Neural Networks with Node-Level Mutations and Epigenetic Weight Initialization

1 code implementation • 17 Nov 2018 • Travis Desell

This paper examines three generic strategies for improving the performance of neuro-evolution techniques aimed at evolving convolutional neural networks (CNNs).


Optimizing Long Short-Term Memory Recurrent Neural Networks Using Ant Colony Optimization to Predict Turbine Engine Vibration

no code implementations • 10 Oct 2017 • AbdElRahman ElSaid, Travis Desell, Fatima El Jamiy, James Higgins, Brandon Wild

This research improves the performance of the most effective LSTM network design proposed in the previous work by using a promising neuroevolution method based on ant colony optimization (ACO) to develop and enhance the LSTM cell structure of the network.

Large Scale Evolution of Convolutional Neural Networks Using Volunteer Computing

no code implementations • 15 Mar 2017 • Travis Desell

EXACT is in part modeled after the neuroevolution of augmenting topologies (NEAT) algorithm, with notable exceptions to allow it to scale to large scale distributed computing environments and evolve networks with convolutional filters.

Distributed Computing • L2 Regularization +1
