Search Results for author: Zachariah Carmichael

Found 17 papers, 5 papers with code

How Well Do Feature-Additive Explainers Explain Feature-Additive Predictors?

no code implementations · 27 Oct 2023 · Zachariah Carmichael, Walter J. Scheirer

Surging interest in deep learning from high-stakes domains has precipitated concern over the inscrutable nature of black box neural networks.

Tasks: Additive models · Attribute · +1

Pixel-Grounded Prototypical Part Networks

no code implementations · 25 Sep 2023 · Zachariah Carmichael, Suhas Lohit, Anoop Cherian, Michael Jones, Walter Scheirer

Prototypical part neural networks (ProtoPartNNs), namely PROTOPNET and its derivatives, are an intrinsically interpretable approach to machine learning.


Unfooling Perturbation-Based Post Hoc Explainers

1 code implementation · 29 May 2022 · Zachariah Carmichael, Walter J. Scheirer

We propose algorithms for the detection (CAD-Detect) and defense (CAD-Defend) of these attacks, which are aided by our novel conditional anomaly detection approach, KNN-CAD.

Tasks: Adversarial Attack · Anomaly Detection · +2
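The defense above is aided by k-nearest-neighbor anomaly scoring. A minimal, unconditional k-NN score (a sketch of the base idea only; the paper's KNN-CAD is a conditional variant, and all names here are illustrative):

```python
import numpy as np

def knn_anomaly_score(train, query, k=3):
    """Mean Euclidean distance from query to its k nearest training points."""
    dists = np.linalg.norm(train - query, axis=1)
    return float(np.sort(dists)[:k].mean())

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, (200, 2))      # in-distribution reference set
inlier = np.array([0.1, -0.2])
outlier = np.array([6.0, 6.0])
# The far-away query receives a much larger anomaly score
assert knn_anomaly_score(train, outlier) > knn_anomaly_score(train, inlier)
```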

A Framework for Evaluating Post Hoc Feature-Additive Explainers

1 code implementation · 15 Jun 2021 · Zachariah Carmichael, Walter J. Scheirer

In this work, we propose a framework for the evaluation of post hoc explainers on ground truth that is directly derived from the additive structure of a model.
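The idea can be illustrated with a toy additive model whose per-feature terms are, by construction, exact ground-truth attributions (the functions below are illustrative, not from the paper):

```python
import math

# Hypothetical additive model f(x) = g1(x1) + g2(x2)
def g1(x1): return 2.0 * x1
def g2(x2): return math.sin(x2)

def f(x1, x2): return g1(x1) + g2(x2)

def ground_truth_attributions(x1, x2):
    # Each additive term is, by construction, that feature's exact contribution
    return {"x1": g1(x1), "x2": g2(x2)}

attrs = ground_truth_attributions(1.5, 0.0)
# A post hoc explainer's attributions for f at (1.5, 0.0) can be scored against attrs
```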

Adaptive Autonomy in Human-on-the-Loop Vision-Based Robotics Systems

no code implementations · 28 Mar 2021 · Sophia Abraham, Zachariah Carmichael, Sreya Banerjee, Rosaura VidalMata, Ankit Agrawal, Md Nafee Al Islam, Walter Scheirer, Jane Cleland-Huang

Computer vision approaches are widely used by autonomous robotic systems to sense the world around them and to guide their decision making as they perform diverse tasks such as collision avoidance, search and rescue, and object manipulation.

Tasks: Collision Avoidance · Decision Making

SIRNet: Understanding Social Distancing Measures with Hybrid Neural Network Model for COVID-19 Infectious Spread

2 code implementations · 22 Apr 2020 · Nicholas Soures, David Chambers, Zachariah Carmichael, Anurag Daram, Dimpy P. Shah, Kal Clark, Lloyd Potter, Dhireesha Kudithipudi

The SARS-CoV-2 infectious outbreak has rapidly spread across the globe and precipitated varying policies to effectuate physical distancing to ameliorate its impact.

Tasks: Populations and Evolution
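For reference, the classical SIR dynamics that SIRNet hybridizes with a neural network can be sketched as a discrete-time update (the β and γ constants below are illustrative; SIRNet learns time-varying rates from data):

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One discrete-time SIR update over population fractions."""
    new_inf = beta * s * i   # susceptibles infected this step
    new_rec = gamma * i      # infected recovering this step
    return s - new_inf, i + new_inf - new_rec, r + new_rec

s, i, r = 0.99, 0.01, 0.0    # initial susceptible/infected/recovered fractions
for _ in range(100):
    s, i, r = sir_step(s, i, r)
# s + i + r stays (numerically) at 1.0 throughout
```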

Cheetah: Mixed Low-Precision Hardware & Software Co-Design Framework for DNNs on the Edge

no code implementations · 6 Aug 2019 · Hamed F. Langroudi, Zachariah Carmichael, David Pastuch, Dhireesha Kudithipudi

Additionally, the framework is amenable to different quantization approaches and supports mixed-precision floating-point and fixed-point numerical formats.


Deep Learning Training on the Edge with Low-Precision Posits

no code implementations · 30 Jul 2019 · Hamed F. Langroudi, Zachariah Carmichael, Dhireesha Kudithipudi

Recently, the posit numerical format has shown promise for DNN data representation and compute with ultra-low precision ([5..8]-bit).
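A posit value can be decoded from its bit pattern (sign, regime run, exponent, fraction); the sketch below is illustrative, not the papers' implementation, and defaults to an 8-bit posit with es=0:

```python
def decode_posit(bits, n=8, es=0):
    """Decode an n-bit posit with es exponent bits into a float."""
    mask = (1 << n) - 1
    bits &= mask
    if bits == 0:
        return 0.0
    if bits == 1 << (n - 1):
        return float("nan")            # NaR (not a real)
    sign = bits >> (n - 1)
    if sign:
        bits = (-bits) & mask          # two's complement for negatives
    rest = [(bits >> i) & 1 for i in range(n - 2, -1, -1)]
    run = 1                            # regime: run of identical leading bits
    while run < len(rest) and rest[run] == rest[0]:
        run += 1
    k = run - 1 if rest[0] == 1 else -run
    tail = rest[run + 1:]              # skip the regime terminator bit
    e = int("".join(map(str, tail[:es])) or "0", 2)
    frac_bits = tail[es:]
    f = int("".join(map(str, frac_bits)) or "0", 2)
    frac = 1 + f / (1 << len(frac_bits))
    useed = 2 ** (2 ** es)             # regime scale factor
    return (-1) ** sign * useed ** k * 2 ** e * frac
```

For example, the 8-bit patterns 0b01000000 and 0b11000000 decode to +1.0 and -1.0 under this scheme.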

Analysis of Wide and Deep Echo State Networks for Multiscale Spatiotemporal Time Series Forecasting

no code implementations · 1 Jul 2019 · Zachariah Carmichael, Humza Syed, Dhireesha Kudithipudi

Echo state networks are computationally lightweight reservoir models inspired by the random projections observed in cortical circuitry.

Tasks: Time Series · Time Series Forecasting
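A minimal echo state network reservoir update shows the fixed random projections the abstract refers to (the sizes and scaling constants below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed random projections: neither matrix is ever trained
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale spectral radius below 1 for the echo state property
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def reservoir_step(x, u):
    # Nonlinear state update driven by the input projection
    return np.tanh(W @ x + W_in @ u)

x = np.zeros(n_res)
for t in range(50):
    x = reservoir_step(x, np.array([np.sin(0.2 * t)]))
# In a full ESN, only a linear readout on x is trained (e.g. ridge regression)
```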

Performance-Efficiency Trade-off of Low-Precision Numerical Formats in Deep Neural Networks

no code implementations · 25 Mar 2019 · Zachariah Carmichael, Hamed F. Langroudi, Char Khazanov, Jeffrey Lillie, John L. Gustafson, Dhireesha Kudithipudi

Our results indicate that posits are a natural fit for DNN inference, outperforming at ≤8-bit precision, and can be realized with competitive resource requirements relative to those of floating point.

Deep Positron: A Deep Neural Network Using the Posit Number System

no code implementations · 5 Dec 2018 · Zachariah Carmichael, Hamed F. Langroudi, Char Khazanov, Jeffrey Lillie, John L. Gustafson, Dhireesha Kudithipudi

We propose a precision-adaptable FPGA soft core for exact multiply-and-accumulate, enabling uniform comparison across three numerical formats: fixed-point, floating-point, and posit.

PositNN: Tapered Precision Deep Learning Inference for the Edge

no code implementations · 20 Oct 2018 · Hamed F. Langroudi, Zachariah Carmichael, John L. Gustafson, Dhireesha Kudithipudi

Conventional reduced-precision numerical formats, such as fixed-point and floating point, cannot accurately represent deep neural network parameters with a nonlinear distribution and small dynamic range.
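The dynamic-range problem can be seen by quantizing a near-zero weight to an 8-bit fixed-point format (the Q1.7 choice here is illustrative):

```python
def quantize_q1_7(w):
    """Round w to 8-bit fixed point: 1 integer bit, 7 fractional bits."""
    step = 2 ** -7                        # smallest nonzero magnitude: 0.0078125
    q = round(w / step) * step
    return max(-1.0, min(1.0 - step, q))  # saturate to the representable range

print(quantize_q1_7(0.005))  # 0.0078125: ~56% relative error on a small weight
print(quantize_q1_7(0.5))    # 0.5: exactly representable
```

Small weights near the format's step size incur large relative error, which is the distribution mismatch the tapered-precision posit format addresses.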

Mod-DeepESN: Modular Deep Echo State Network

no code implementations · 1 Aug 2018 · Zachariah Carmichael, Humza Syed, Stuart Burtner, Dhireesha Kudithipudi

Neuro-inspired recurrent neural network algorithms, such as echo state networks, are computationally lightweight and thereby map well onto untethered devices.

Tasks: Time Series · Time Series Prediction
