1 code implementation • 5 Jan 2023 • Surat Teerapittayanon, Marcus Comiter, Brad McDanel, H. T. Kung
We then show that these fragments can be stitched together to create neural networks with accuracy comparable to that of traditionally trained networks, at a fraction of the computing resource and data requirements.
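As a toy illustration of the stitching idea (not the paper's actual method), the sketch below aligns the frozen features of one linear "fragment" to the input space another fragment expects, using a least-squares stitching map; all weights and dimensions here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen "fragments": two linear feature extractors and a head.
Wa = rng.normal(size=(4, 8))      # fragment A: 4-dim input -> 8-dim features
Wb_in = rng.normal(size=(4, 8))   # the extractor fragment B's head was trained with
Wb = rng.normal(size=(8, 3))      # fragment B's head: expects Wb_in-style features

X = rng.normal(size=(200, 4))
feats_a = X @ Wa      # what fragment A actually produces
feats_b = X @ Wb_in   # what fragment B's head expects to see

# Stitching layer: a linear map fit by least squares to align the two spaces.
S, *_ = np.linalg.lstsq(feats_a, feats_b, rcond=None)

stitched_out = (feats_a @ S) @ Wb
reference_out = feats_b @ Wb
err = np.abs(stitched_out - reference_out).max()
print(f"max stitching error: {err:.2e}")
```

In this linear toy the alignment is exact, so the stitched network reproduces the reference outputs; with nonlinear fragments the stitching layer would instead be trained on a small amount of data.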
no code implementations • Findings of the Association for Computational Linguistics 2020 • Wen Tai, H. T. Kung, Xin Dong, Marcus Comiter, Chang-Fu Kuo
We introduce exBERT, a training method to extend BERT pre-trained models from a general domain to a new pre-trained model for a specific domain with a new additive vocabulary under constrained training resources (i.e., constrained computation and data).
no code implementations • 17 Jun 2019 • Marcus Comiter, Surat Teerapittayanon, H. T. Kung
CheckNet is like a checksum for neural network inference: it verifies the integrity of the inference computation performed by untrusted devices to 1) ensure the inference has actually been performed, and 2) ensure the inference has not been manipulated by an attacker.
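To make the checksum analogy concrete, here is one generic way to check untrusted inference: hide known "probe" inputs, whose outputs were precomputed locally, inside the batch sent to the device, then verify the returned probe outputs. This is a sketch of the general idea only; CheckNet's actual mechanism differs, and the model and batch sizes here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# A stand-in model whose inference is outsourced to an untrusted device.
W = rng.normal(size=(8, 3))
model = lambda x: np.tanh(x @ W)

real_inputs = rng.normal(size=(5, 8))
probes = rng.normal(size=(2, 8))
expected_probe_out = model(probes)   # computed locally, in advance

# Shuffle probes into the batch so the device cannot tell which rows are checks.
batch = np.vstack([real_inputs, probes])
perm = rng.permutation(len(batch))
device_out = model(batch[perm])      # an honest device runs the real model

# Undo the shuffle and compare the probe rows against the precomputed answers.
inv = np.argsort(perm)
probe_out = device_out[inv][len(real_inputs):]
verified = np.allclose(probe_out, expected_probe_out)
print("inference verified:", verified)
```

A device that skips or manipulates the computation would fail the probe check with high probability, since it cannot distinguish probe rows from real ones.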