Search Results for author: Subutai Ahmad

Found 12 papers, 8 papers with code

Real-Time Anomaly Detection for Streaming Analytics

4 code implementations · 8 Jul 2016 · Subutai Ahmad, Scott Purdy

Much of the world's data is streaming, time-series data, where anomalies give significant information in critical situations.

Anomaly Detection · Time Series · +1
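
The paper's detector is HTM-based; purely as a sketch of the streaming constraints it operates under (one pass over the data, bounded memory, no lookahead), here is a toy rolling z-score detector in Python. The class name and thresholds are illustrative, not the paper's method:

```python
import numpy as np
from collections import deque

class RollingZScoreDetector:
    """Toy streaming detector: flag points far from a rolling mean.
    One pass, bounded memory, no lookahead -- the constraints the paper
    emphasizes -- but not the paper's HTM-based algorithm."""

    def __init__(self, window=100, threshold=4.0):
        self.history = deque(maxlen=window)  # bounded memory of recent values
        self.threshold = threshold

    def update(self, value):
        """Consume one value as it arrives; return True if it looks anomalous."""
        flagged = False
        if len(self.history) >= 10:          # wait for a minimal history
            mean = np.mean(self.history)
            std = np.std(self.history) + 1e-9  # avoid division by zero
            flagged = abs(value - mean) / std > self.threshold
        self.history.append(value)
        return flagged
```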

Evaluating Real-time Anomaly Detection Algorithms - the Numenta Anomaly Benchmark

3 code implementations · 12 Oct 2015 · Alexander Lavin, Subutai Ahmad

Here we propose the Numenta Anomaly Benchmark (NAB), which attempts to provide a controlled and repeatable environment of open-source tools to test and measure anomaly detection algorithms on streaming data.

Anomaly Detection · Time Series · +1
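
NAB scores detectors against labeled anomaly windows, rewarding early true positives and penalizing false positives and missed windows. The loop below is a simplified, illustrative sketch of that idea only; the official benchmark uses a scaled sigmoid and application profiles, and `nab_style_score` and its weights are assumptions made here:

```python
def nab_style_score(detections, windows, tp_weight=1.0, fp_weight=0.11, fn_weight=1.0):
    """Simplified scoring in the spirit of NAB (not the official code):
    credit the first detection inside each anomaly window, with earlier
    detections earning more; penalize stray detections and missed windows."""
    score = 0.0
    credited = set()
    for t in sorted(detections):
        window = next((w for w in windows if w[0] <= t <= w[1]), None)
        if window is None:
            score -= fp_weight                        # detection outside any window
        elif window not in credited:
            credited.add(window)
            start, end = window
            earliness = 1.0 - (t - start) / max(end - start, 1)
            score += tp_weight * earliness            # earlier detections earn more
    score -= fn_weight * (len(windows) - len(credited))  # missed windows
    return score

# e.g. one window detected early, one missed, one false positive:
print(nab_style_score([12, 80], windows=[(10, 30), (50, 70)]))
```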

Why Neurons Have Thousands of Synapses, A Theory of Sequence Memory in Neocortex

1 code implementation · 31 Oct 2015 · Jeff Hawkins, Subutai Ahmad

Given the similarity of excitatory neurons throughout the neocortex and the importance of sequence memory in inference and behavior, we propose that this form of sequence memory is a universal property of neocortical tissue.

How Can We Be So Dense? The Benefits of Using Highly Sparse Representations

3 code implementations · 27 Mar 2019 · Subutai Ahmad, Luiz Scheinkman

Most artificial networks today rely on dense representations, whereas biological networks rely on sparse representations.

Computational Efficiency
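
The paper's layers enforce sparsity with a k-winner-take-all activation on top of sparse weights. A minimal numpy sketch of the k-winners step (the function name and example sizes are illustrative; the paper's version also adds a boosting term to balance unit duty cycles):

```python
import numpy as np

def k_winners(x, k):
    """Zero all but the k largest activations in each row -- the
    k-winner-take-all step that fixes the activation sparsity."""
    x = np.atleast_2d(x)
    out = np.zeros_like(x)
    top = np.argpartition(x, -k, axis=1)[:, -k:]   # k largest per row, unordered
    rows = np.arange(x.shape[0])[:, None]
    out[rows, top] = x[rows, top]
    return out

# A 1000-unit layer held at 2% activity:
dense = np.random.randn(4, 1000)
sparse = k_winners(dense, k=20)
assert (sparse != 0).sum(axis=1).max() <= 20
```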

Grid Cell Path Integration For Movement-Based Visual Object Recognition

1 code implementation · 17 Feb 2021 · Niels Leadholm, Marcus Lewis, Subutai Ahmad

Grid cells enable the brain to model the physical space of the world and navigate effectively via path integration, updating self-position using information from self-movement.

Few-Shot Learning · Navigate · +1
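
As a toy illustration of path integration, a single grid-cell module can be modeled as a 2-D phase on a periodic tile that accumulates self-movement and wraps around. The class below is a hedged sketch under that simplification, not the paper's model, which couples many such modules to sensory features:

```python
import numpy as np

class GridModule:
    """One grid-cell module as a 2-D phase on a periodic tile: movement in
    the world shifts the phase, and the phase wraps, so position is encoded
    modulo the module's spatial scale."""

    def __init__(self, scale, orientation):
        c, s = np.cos(orientation), np.sin(orientation)
        self.to_phase = np.array([[c, -s], [s, c]]) / scale  # world -> phase coords
        self.phase = np.zeros(2)

    def move(self, displacement):
        # Path integration: accumulate self-movement, wrap into the unit tile
        self.phase = (self.phase + self.to_phase @ displacement) % 1.0
        return self.phase

module = GridModule(scale=30.0, orientation=0.5)
module.move(np.array([3.0, 1.0]))  # update self-position from self-movement alone
```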

Avoiding Catastrophe: Active Dendrites Enable Multi-Task Learning in Dynamic Environments

1 code implementation · 31 Dec 2021 · Abhiram Iyer, Karan Grewal, Akash Velu, Lucas Oliveira Souza, Jeremy Forest, Subutai Ahmad

Next, we study the performance of this architecture on two separate benchmarks requiring task-based adaptation: Meta-World, a multi-task reinforcement learning environment where a robotic agent must learn to solve a variety of manipulation tasks simultaneously; and a continual learning benchmark in which the model's prediction task changes throughout training.

Continual Learning · Multi-Task Learning
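
In the proposed architecture, each unit's feedforward response is modulated by its best-matching dendritic segment applied to a task-context vector, so different contexts activate different subnetworks. A rough numpy sketch of that gating step (shapes and names are illustrative; the paper additionally applies k-winner sparsity to the result):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def active_dendrite_units(x, context, W, b, segments):
    """Each unit owns several dendritic segments; the strongest segment
    response to the context gates that unit's feedforward output.
    x: (in_dim,), W: (in_dim, units), b: (units,),
    segments: (units, num_segments, context_dim), context: (context_dim,)."""
    feedforward = x @ W + b             # standard linear response, (units,)
    dendrites = segments @ context      # segment activations, (units, num_segments)
    best = dendrites.max(axis=1)        # winning segment per unit
    return feedforward * sigmoid(best)  # context-gated output

rng = np.random.default_rng(0)
out = active_dendrite_units(
    x=rng.standard_normal(16), context=rng.standard_normal(8),
    W=rng.standard_normal((16, 32)), b=np.zeros(32),
    segments=rng.standard_normal((32, 4, 8)),
)
```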

Long Distance Relationships without Time Travel: Boosting the Performance of a Sparse Predictive Autoencoder in Sequence Modeling

1 code implementation · 2 Dec 2019 · Jeremy Gordon, David Rawlinson, Subutai Ahmad

In sequence learning tasks such as language modelling, Recurrent Neural Networks must learn relationships between input features separated by time.

Biologically-plausible Training · Language Modelling

Continuous online sequence learning with an unsupervised neural network model

1 code implementation · 17 Dec 2015 · Yuwei Cui, Subutai Ahmad, Jeff Hawkins

In this paper, we analyze properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data.

Anomaly Detection · Temporal Sequences

How do neurons operate on sparse distributed representations? A mathematical theory of sparsity, neurons and active dendrites

no code implementations · 5 Jan 2016 · Subutai Ahmad, Jeff Hawkins

Our model is inspired by recent experimental findings on active dendritic processing and NMDA spikes in pyramidal neurons.

Porting HTM Models to the Heidelberg Neuromorphic Computing Platform

no code implementations · 8 May 2015 · Sebastian Billaudelle, Subutai Ahmad

Hierarchical Temporal Memory (HTM) is a computational theory of machine intelligence based on a detailed study of the neocortex.

Properties of Sparse Distributed Representations and their Application to Hierarchical Temporal Memory

no code implementations · 25 Mar 2015 · Subutai Ahmad, Jeff Hawkins

Empirical evidence demonstrates that every region of the neocortex represents information using sparse activity patterns.
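
A central property analyzed in this line of work is how unlikely two random sparse patterns are to match by chance. A short sketch of the standard hypergeometric counting argument (function name and example numbers are illustrative):

```python
from math import comb

def false_match_probability(n, a, s, theta):
    """Probability that a random pattern with a active bits out of n
    overlaps a stored pattern with s active bits in at least theta
    positions (a hypergeometric tail)."""
    total = comb(n, a)
    hits = sum(comb(s, b) * comb(n - s, a - b) for b in range(theta, min(a, s) + 1))
    return hits / total

# 2048-bit representations with 40 active bits: a false match at theta = 10
# is vanishingly unlikely, which is why sparse matching is noise-robust.
print(false_match_probability(n=2048, a=40, s=40, theta=10))
```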

Two Sparsities Are Better Than One: Unlocking the Performance Benefits of Sparse-Sparse Networks

no code implementations · 27 Dec 2021 · Kevin Lee Hunter, Lawrence Spracklen, Subutai Ahmad

In this article we introduce Complementary Sparsity, a novel technique that significantly improves the performance of dual sparse networks on existing hardware.
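
The core idea is that several sparse weight kernels whose nonzero positions do not overlap can be overlaid into one dense kernel, letting existing dense hardware evaluate all of them in a single pass. A toy numpy sketch of the packing step (names and sizes are assumptions; real kernels need index bookkeeping to route each filter's outputs):

```python
import numpy as np

def pack_complementary(filters):
    """Overlay sparse filters with disjoint nonzero positions into one
    dense filter, so a single dense operation covers all of them."""
    packed = np.zeros_like(filters[0])
    for f in filters:
        assert not np.any((packed != 0) & (f != 0)), "supports must not overlap"
        packed += f
    return packed

# Four 75%-sparse filters with complementary supports fill one dense filter:
size, k = 8, 4
filters = []
for i in range(k):
    f = np.zeros(size)
    f[i::k] = np.random.randn(size // k)  # each filter claims a different stripe
    filters.append(f)
dense = pack_complementary(filters)  # dense now has essentially no zero entries
```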
