no code implementations • 18 Apr 2024 • Raz Lapid, Almog Dubin, Moshe Sipper
This paper presents RADAR (Robust Adversarial Detection via Adversarial Retraining), an approach designed to enhance the robustness of adversarial detectors against adaptive attacks while maintaining classifier performance.
no code implementations • 5 Mar 2024 • Ben Pinhasov, Raz Lapid, Rony Ohayon, Moshe Sipper, Yehudit Aperstein
Furthermore, this approach does not change the performance of the deepfake detector.
no code implementations • 1 Feb 2024 • Moshe Sipper, Jason H. Moore
The GPTP workshop series, which began in 2003, has served over the years as a focal meeting for genetic programming (GP) researchers.
1 code implementation • 20 Jan 2024 • Moshe Sipper, Jason H. Moore, Ryan J. Urbanowicz
We have recently developed OMNIREP, a coevolutionary algorithm to discover both a representation and an interpreter that solve a particular problem of interest.
no code implementations • 19 Jan 2024 • Moshe Sipper, Jason H. Moore, Ryan J. Urbanowicz
The simultaneous evolution of two or more species with coupled fitness -- coevolution -- has been put to good use in the field of evolutionary computation.
no code implementations • 3 Jan 2024 • Moshe Sipper
Explainability in deep networks has gained increased importance in recent years.
1 code implementation • 6 Sep 2023 • Itai Tzruia, Tomer Halperin, Moshe Sipper, Achiya Elyasaf
We present a novel approach to performing fitness approximation in genetic algorithms (GAs) using machine-learning (ML) models, focusing on evolutionary agents in Gymnasium (game) simulators -- where fitness computation is costly.
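The surrogate idea behind fitness approximation can be sketched briefly: keep an archive of individuals whose true (costly) fitness has already been computed, and estimate new individuals' fitness cheaply from their nearest archived neighbors. This is a minimal illustrative sketch under assumptions of my own, not the paper's actual ML models; `true_fitness` and the k-nearest-neighbor estimator are hypothetical stand-ins for a Gymnasium rollout and a learned model.

```python
import random

def true_fitness(genome):
    # Hypothetical stand-in for a costly simulator rollout:
    # fitness peaks (at 0) when every gene equals 0.5.
    return -sum((g - 0.5) ** 2 for g in genome)

class SurrogateFitness:
    """k-nearest-neighbor surrogate: estimate an individual's fitness
    from archived individuals whose true fitness is already known."""

    def __init__(self, k=3):
        self.archive = []  # list of (genome, true_fitness) pairs
        self.k = k

    def add(self, genome, fitness):
        self.archive.append((genome, fitness))

    def estimate(self, genome):
        # Sort archived individuals by squared distance to the query genome.
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(genome, g)), f)
            for g, f in self.archive
        )
        nearest = dists[:self.k]
        return sum(f for _, f in nearest) / len(nearest)

random.seed(0)
surrogate = SurrogateFitness(k=3)
for _ in range(50):  # pay the expensive evaluation only for the archive
    g = [random.random() for _ in range(4)]
    surrogate.add(g, true_fitness(g))

# Cheap estimate for a new individual, with no simulator call.
print(surrogate.estimate([0.5, 0.5, 0.5, 0.5]))
```

In a real GA the archive would be refreshed periodically with newly evaluated individuals, so the surrogate tracks the population as it moves through the search space.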
no code implementations • 4 Sep 2023 • Raz Lapid, Ron Langberg, Moshe Sipper
The GA attack works by optimizing a universal adversarial prompt that -- when combined with a user's query -- disrupts the attacked model's alignment, resulting in unintended and potentially harmful outputs.
no code implementations • 13 Jun 2023 • Raz Lapid, Moshe Sipper
Through experiments conducted on the ViT-GPT2 model, the most widely used image-to-text model on Hugging Face, and the Flickr30k dataset, we demonstrate that our proposed attack successfully generates visually similar adversarial examples, with both untargeted and targeted captions.
no code implementations • 8 Jun 2023 • Moshe Sipper, Achiya Elyasaf, Tomer Halperin, Zvika Haramaty, Raz Lapid, Eyal Segal, Itai Tzruia, Snir Vitrack Tamam
We survey eight recent works by our group involving the successful blending of evolutionary algorithms with machine learning and deep learning.
no code implementations • 7 Mar 2023 • Raz Lapid, Eylon Mizrahi, Moshe Sipper
To our knowledge, this is the first and only method that performs black-box physical attacks directly on object-detection models, resulting in a model-agnostic attack.
1 code implementation • 21 Feb 2023 • Moshe Sipper
We present Classy Ensemble, a novel ensemble-generation algorithm for classification tasks, which aggregates models through a weighted combination of per-class accuracy.
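A simplified reading of that one-line description can be sketched as follows: each model votes for one class, and the vote is weighted by that model's validation accuracy on the class it predicts. This is an illustrative assumption of mine, not the published Classy Ensemble implementation; the models, data, and weights are made up.

```python
from collections import defaultdict

def per_class_accuracy(preds, labels, classes):
    """Validation accuracy of one model, computed separately per class."""
    correct, total = defaultdict(int), defaultdict(int)
    for p, y in zip(preds, labels):
        total[y] += 1
        if p == y:
            correct[y] += 1
    return {c: (correct[c] / total[c] if total[c] else 0.0) for c in classes}

def ensemble_predict(model_preds, model_weights, classes):
    # Each model votes for one class; the vote is weighted by that
    # model's per-class accuracy on the class it predicts.
    scores = {c: 0.0 for c in classes}
    for pred, weights in zip(model_preds, model_weights):
        scores[pred] += weights[pred]
    return max(scores, key=scores.get)

# Made-up example: model A is strong on class 0, model B on class 1.
classes = [0, 1]
acc_a = {0: 0.9, 1: 0.2}
acc_b = {0: 0.4, 1: 0.8}
print(ensemble_predict([0, 1], [acc_a, acc_b], classes))  # prints 0
```

When the two models disagree, the vote backed by the higher per-class accuracy wins, which is the intuition behind weighting by per-class rather than overall accuracy.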
1 code implementation • 27 Nov 2022 • Snir Vitrack Tamam, Raz Lapid, Moshe Sipper
Our novel algorithm, AttaXAI, a model-agnostic adversarial attack on XAI algorithms, requires access only to a classifier's output logits and to the explanation map; these weak assumptions render our approach highly useful where real-world models and data are concerned.
1 code implementation • 8 Sep 2022 • Eyal Segal, Moshe Sipper
To that end, Novelty Search (NS) has been shown to outperform gradient-following optimizers in some cases, while underperforming in others.
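A standard way to compute an NS novelty score (assumed here as an illustration; the paper's exact setup may differ) is the mean distance from an individual's behavior descriptor to its k nearest neighbors among previously seen behaviors, rewarding exploration rather than objective fitness:

```python
def novelty(behavior, archive, k=3):
    """Mean Euclidean distance to the k nearest behaviors seen so far."""
    dists = sorted(
        sum((a - b) ** 2 for a, b in zip(behavior, other)) ** 0.5
        for other in archive
    )
    nearest = dists[:k]
    return sum(nearest) / len(nearest)

# Hypothetical 2-D behavior descriptors (e.g., an agent's final position).
archive = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (1.0, 1.0)]

# A behavior far from everything in the archive scores as more novel
# than one inside the existing cluster.
print(novelty((2.0, 2.0), archive) > novelty((0.0, 0.05), archive))  # True
```

Selection then favors high-novelty individuals, which is what lets NS escape deceptive local optima that trap purely objective-driven search.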
no code implementations • 17 Aug 2022 • Raz Lapid, Zvika Haramaty, Moshe Sipper
Deep neural networks (DNNs) are sensitive to adversarial data in a variety of scenarios, including the black-box scenario, where the attacker is only allowed to query the trained model and receive an output.
2 code implementations • 21 Jul 2022 • Moshe Sipper, Tomer Halperin, Itai Tzruia, Achiya Elyasaf
EC-KitY is a comprehensive Python library for doing evolutionary computation (EC), licensed under the BSD 3-Clause License, and compatible with scikit-learn.
no code implementations • 13 Jul 2022 • Moshe Sipper
Hyperparameters in machine learning (ML) have received a fair amount of attention, and hyperparameter tuning has come to be regarded as an important step in the ML pipeline.
no code implementations • 8 Jul 2022 • Moshe Sipper
We present a comprehensive, stacking-based framework for combining deep learning with good old-fashioned machine learning, called Deep GOld.
no code implementations • 30 Jun 2022 • Moshe Sipper, Jason H. Moore, Ryan J. Urbanowicz
When seeking a predictive model in biomedical data, one often has more than a single objective in mind, e.g., attaining both high accuracy and low complexity (to promote interpretability).
no code implementations • 25 Jun 2022 • Moshe Sipper, Jason H. Moore, Ryan J. Urbanowicz
We have recently presented SAFE -- Solution And Fitness Evolution -- a commensalistic coevolutionary algorithm that maintains two coevolving populations: a population of candidate solutions and a population of candidate objective functions.
no code implementations • 25 Jun 2022 • Moshe Sipper, Jason H. Moore, Ryan J. Urbanowicz
We recently highlighted a fundamental problem recognized to confound algorithmic optimization, namely, conflating the objective with the objective function.
no code implementations • 25 Jun 2022 • Moshe Sipper
We present three evolutionary symbolic regression-based classification algorithms for binary and multinomial datasets: GPLearnClf, CartesianClf, and ClaSyCo.
no code implementations • 24 Jun 2022 • Raz Lapid, Moshe Sipper
Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel three-population coevolutionary algorithm to evolve activation functions (AFs), and compare it to four other methods, both evolutionary and non-evolutionary.
no code implementations • 24 Jun 2022 • Moshe Sipper, Jason H Moore
Modifying standard gradient boosting by replacing the embedded weak learner with a strong(er) one, we present SyRBo: Symbolic-Regression Boosting.
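The residual-fitting scaffold that SyRBo builds on can be sketched as follows. For brevity this sketch uses a depth-1 regression stump as the base learner where SyRBo would fit a symbolic-regression model at each stage, so it illustrates only the boosting loop, not the actual method:

```python
def fit_stump(xs, ys):
    """Depth-1 regression tree: best single threshold split by SSE."""
    best = None
    for t in sorted(set(xs))[:-1]:  # top value would leave the right side empty
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, stages=5, lr=0.5):
    """Gradient boosting for squared loss: each stage fits the residuals
    left over by the shrunken sum of all previous stages."""
    models, residuals = [], list(ys)
    for _ in range(stages):
        m = fit_stump(xs, residuals)
        models.append(m)
        residuals = [r - lr * m(x) for x, r in zip(xs, residuals)]
    return lambda x: sum(lr * m(x) for m in models)

predict = boost([0, 1, 2, 3], [0.0, 0.0, 1.0, 1.0])
print(predict(0), predict(3))
```

Swapping `fit_stump` for a stronger per-stage learner, as the snippet above describes, typically lets the ensemble reach a given accuracy in far fewer stages.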
no code implementations • 24 Jun 2022 • Moshe Sipper
Activation functions (AFs), which are pivotal to the success (or failure) of a neural network, have received increased attention in recent years, with researchers seeking to design novel AFs that improve some aspect of network performance.
1 code implementation • 9 Jan 2018 • Patryk Orzechowski, Moshe Sipper, Xiuzhen Huang, Jason H. Moore
In this paper, we introduce a novel biclustering algorithm based on artificial intelligence (AI).
no code implementations • 13 Jun 2017 • Moshe Sipper, Weixuan Fu, Karuna Ahuja, Jason H. Moore
The practice of evolutionary algorithms involves the tuning of many parameters.
2 code implementations • 1 May 2017 • Randal S. Olson, Moshe Sipper, William La Cava, Sharon Tartarone, Steven Vitale, Weixuan Fu, Patryk Orzechowski, Ryan J. Urbanowicz, John H. Holmes, Jason H. Moore
While artificial intelligence (AI) has become widespread, many commercial AI systems are not yet accessible to individual researchers or the general public, due to the deep knowledge of these systems required to use them.