11 Dec 2023 • Taesik Gong, Si Young Jang, Utku Günay Acer, Fahim Kawsar, Chulhong Min
The advent of tiny AI accelerators opens opportunities for deploying deep neural networks at the extreme edge, offering reduced latency, lower power consumption, and improved privacy for on-device ML inference.