Search Results for author: Nathan Youngblood

Found 3 papers, 1 paper with code

Leveraging Continuously Differentiable Activation Functions for Learning in Quantized Noisy Environments

no code implementations • 4 Feb 2024 • Vivswan Shah, Nathan Youngblood

Real-world analog systems intrinsically suffer from noise that can impede model convergence and degrade accuracy across a variety of deep learning models.

Quantization
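
To make the setting concrete, here is a minimal PyTorch sketch of the problem the paper addresses: a forward pass through a quantized, noisy linear layer followed by a continuously differentiable activation such as GELU. This is not the authors' code; the layer name, bit width, and noise level are illustrative assumptions.

```python
import torch
import torch.nn as nn

class NoisyQuantLinear(nn.Module):
    """Illustrative linear layer whose output is clipped, quantized to
    2**bits levels, and perturbed with Gaussian noise, mimicking an
    analog accelerator. Not the authors' implementation."""
    def __init__(self, in_features, out_features, bits=4, noise_std=0.05):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.levels = 2 ** bits - 1
        self.noise_std = noise_std

    def forward(self, x):
        y = self.linear(x)
        # Straight-through fake quantization: round in the forward pass,
        # let gradients flow through unchanged in the backward pass.
        y_q = torch.round(y.clamp(-1, 1) * self.levels) / self.levels
        y = y + (y_q - y).detach()
        # Additive read-out noise.
        return y + self.noise_std * torch.randn_like(y)

# A continuously differentiable activation (GELU) in place of a piecewise
# one (ReLU) between quantized, noisy layers.
model = nn.Sequential(NoisyQuantLinear(16, 32), nn.GELU(), NoisyQuantLinear(32, 10))
out = model(torch.randn(8, 16))
```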

AnalogVNN: A fully modular framework for modeling and optimizing photonic neural networks

1 code implementation • 14 Oct 2022 • Vivswan Shah, Nathan Youngblood

AnalogVNN is a simulation framework built on PyTorch that can simulate the effects of optoelectronic noise, limited precision, and signal normalization present in photonic neural network accelerators.

BIG-bench Machine Learning • Hyperparameter Optimization +1
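
The sketch below does not reproduce AnalogVNN's own API; it only illustrates, in plain PyTorch, the three analog effects the abstract names (signal normalization, limited precision, and optoelectronic noise) as composable modules wrapped around a linear layer. Module names and default values are assumptions for illustration.

```python
import torch
import torch.nn as nn

class Clamp(nn.Module):
    """Signal normalization: keep activations inside the physical range."""
    def forward(self, x):
        return x.clamp(-1.0, 1.0)

class ReducePrecision(nn.Module):
    """Limited precision: snap values to a fixed grid, with a
    straight-through estimator so gradients still flow."""
    def __init__(self, precision=16):
        super().__init__()
        self.precision = precision
    def forward(self, x):
        q = torch.round(x * self.precision) / self.precision
        return x + (q - x).detach()

class GaussianNoise(nn.Module):
    """Optoelectronic noise: additive Gaussian perturbation during training."""
    def __init__(self, std=0.02):
        super().__init__()
        self.std = std
    def forward(self, x):
        return x + self.std * torch.randn_like(x) if self.training else x

# One "analog" layer: a linear transform wrapped with the three effects.
analog_layer = nn.Sequential(nn.Linear(64, 64), Clamp(), ReducePrecision(), GaussianNoise())
y = analog_layer(torch.randn(32, 64))
```

Keeping each effect as its own module is what makes such a pipeline fully modular: effects can be reordered, swapped, or removed per layer when searching for hardware-robust configurations.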

Monadic Pavlovian associative learning in a backpropagation-free photonic network

no code implementations • 30 Nov 2020 • James Y. S. Tan, Zengguang Cheng, Johannes Feldmann, Xuan Li, Nathan Youngblood, Utku E. Ali, C. David Wright, Wolfram H. P. Pernice, Harish Bhaskaran

Here we experimentally demonstrate a form of backpropagation-free learning using a single (or monadic) associative hardware element.
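
As a conceptual illustration only (the paper's associative element is a photonic hardware device, not software), a monadic Pavlovian update can be sketched as a single local weight potentiated on stimulus coincidence, with no backpropagated error signal. The update rule and learning rate below are assumptions for illustration.

```python
# Trials are (conditioned stimulus, unconditioned stimulus) pairs.
def pavlovian_step(w, conditioned, unconditioned, lr=0.25):
    """Local Hebbian-style rule: potentiate the weight only when the
    conditioned and unconditioned stimuli arrive together."""
    if conditioned and unconditioned:
        w = min(1.0, w + lr)  # strengthen the association
    return w

w = 0.0  # associative weight: 0 means no learned response yet
for cs, us in [(1, 1), (1, 1), (0, 1), (1, 1), (1, 0)]:
    w = pavlovian_step(w, cs, us)

# After repeated pairings, the conditioned stimulus alone evokes a response.
print(f"association strength {w:.2f}, responds to CS alone: {w > 0.5}")
```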
