Search Results for author: John Moon

Found 3 papers, 1 paper with code

RACE-IT: A Reconfigurable Analog CAM-Crossbar Engine for In-Memory Transformer Acceleration

no code implementations · 29 Nov 2023 · Lei Zhao, Luca Buonanno, Ron M. Roth, Sergey Serebryakov, Archit Gajjar, John Moon, Jim Ignowski, Giacomo Pedretti

Transformer models represent the cutting edge of Deep Neural Networks (DNNs) and excel in a wide range of machine learning tasks.

X-TIME: An in-memory engine for accelerating machine learning on tabular data with CAMs

1 code implementation · 3 Apr 2023 · Giacomo Pedretti, John Moon, Pedro Bruel, Sergey Serebryakov, Ron M. Roth, Luca Buonanno, Tobias Ziegler, Cong Xu, Martin Foltin, Paolo Faraboschi, Jim Ignowski, Catherine E. Graves

In this work, we focus on an overall analog-digital architecture implementing a novel increased-precision analog CAM and a programmable network-on-chip, allowing inference of state-of-the-art tree-based ML models such as XGBoost and CatBoost.
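
The core idea of CAM-based tree inference can be illustrated in software: each leaf of a decision tree becomes one "row" of per-feature intervals, and inference is a parallel match of the input against all rows. The sketch below is only an analogy of that matching step, not the paper's actual hardware mapping; the feature values, intervals, and leaf outputs are made up for illustration.

```python
# Software analogy of CAM-based decision-tree inference: each leaf is a
# row of per-feature [low, high) intervals, matched against the input in
# parallel. All values here are hypothetical.
import numpy as np

# Example tree with 2 features and 3 leaves.
LOW = np.array([[-np.inf, -np.inf],    # leaf 0: x0 < 0.5
                [0.5,     -np.inf],    # leaf 1: x0 >= 0.5, x1 < 1.0
                [0.5,      1.0   ]])   # leaf 2: x0 >= 0.5, x1 >= 1.0
HIGH = np.array([[0.5,     np.inf],
                 [np.inf,  1.0   ],
                 [np.inf,  np.inf]])
LEAF_VALUE = np.array([0.1, -0.3, 0.7])

def cam_like_inference(x):
    """Return the output of the single leaf whose intervals all match x."""
    # In an analog CAM this comparison happens in parallel across rows;
    # here it is emulated with a vectorized comparison.
    match = np.all((x >= LOW) & (x < HIGH), axis=1)
    return LEAF_VALUE[np.argmax(match)]

print(cam_like_inference(np.array([0.7, 0.2])))  # matches leaf 1 -> -0.3
```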

Hierarchical Architectures in Reservoir Computing Systems

no code implementations · 14 May 2021 · John Moon, Wei D. Lu

Analogous to deep neural networks, stacking sub-reservoirs in series is an efficient way to enhance the nonlinearity of the data transformation into a high-dimensional space and expand the diversity of temporal information captured by the reservoir.
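
A minimal software sketch of this serial stacking is shown below, using a generic echo state network formulation rather than the paper's physical (memristor-based) reservoirs; all sizes and parameters are arbitrary. The states of the first sub-reservoir serve as the input sequence of the second, deepening the temporal transformation.

```python
# Minimal sketch of serially stacked sub-reservoirs (echo state network
# style). Illustrative only: sizes, leak rate, and weight scaling are
# arbitrary and do not reflect the paper's hardware implementation.
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9):
    """Random input and recurrent weights with a scaled spectral radius."""
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(u_seq, W_in, W, leak=0.3):
    """Drive one sub-reservoir with an input sequence, return its states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in u_seq:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Two sub-reservoirs in series: states of the first drive the second.
u = rng.standard_normal((200, 1))            # scalar input sequence
W_in1, W1 = make_reservoir(1, 50)
W_in2, W2 = make_reservoir(50, 50)
states1 = run_reservoir(u, W_in1, W1)
states2 = run_reservoir(states1, W_in2, W2)  # second stage sees richer features
print(states2.shape)                         # (200, 50)
```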
