Search Results for author: Marcus Comiter

Found 3 papers, 1 paper with code

StitchNet: Composing Neural Networks from Pre-Trained Fragments

1 code implementation · 5 Jan 2023 · Surat Teerapittayanon, Marcus Comiter, Brad McDanel, H. T. Kung

We then show that these fragments can be stitched together to create neural networks with accuracy comparable to that of traditionally trained networks, at a fraction of the computing resource and data requirements.
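The stitching idea can be illustrated with a minimal numpy sketch: take an early layer from one pre-trained network and a later layer from another, then fit a linear "stitching" map so the first fragment's activations land in the input space the second fragment expects. All weights, sizes, and the least-squares fit below are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(x, 0.0)

# Two hypothetical pre-trained networks (random weights stand in
# for trained ones purely for illustration).
net1_l1 = rng.normal(size=(8, 5))   # net1: 8 -> 5 -> 2
net1_l2 = rng.normal(size=(5, 2))
net2_l1 = rng.normal(size=(8, 6))   # net2: 8 -> 6 -> 3
net2_l2 = rng.normal(size=(6, 3))

# Fragments to compose: net1's first layer, then net2's second layer.
x = rng.normal(size=(100, 8))        # a small batch of sample inputs
act_a = relu(x @ net1_l1)            # fragment A's output activations (100 x 5)
act_b_in = relu(x @ net2_l1)         # what fragment B normally receives (100 x 6)

# Stitching layer: a least-squares map from A's activation space into
# the space fragment B was trained on (one simple way to align fragments).
stitch, *_ = np.linalg.lstsq(act_a, act_b_in, rcond=None)

# The composed "stitched" network: fragment A -> stitch -> fragment B.
y = (act_a @ stitch) @ net2_l2
print(y.shape)  # (100, 3)
```

The key point the sketch conveys is that no fragment is retrained; only the small stitching transform is computed from sample activations.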

exBERT: Extending Pre-trained Models with Domain-specific Vocabulary Under Constrained Training Resources

no code implementations · Findings of the Association for Computational Linguistics 2020 · Wen Tai, H. T. Kung, Xin Dong, Marcus Comiter, Chang-Fu Kuo

We introduce exBERT, a training method to extend BERT pre-trained models from a general domain to a new pre-trained model for a specific domain with a new additive vocabulary under constrained training resources (i.e., constrained computation and data).
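One way to picture an additive vocabulary is a second, small embedding table for the new domain-specific tokens, kept alongside the frozen general-domain table. The routing scheme and all sizes below are assumptions for illustration; exBERT's actual design also adds a small extension module next to the frozen encoder.

```python
import numpy as np

rng = np.random.default_rng(2)

ORIG_VOCAB, EXT_VOCAB, DIM = 100, 20, 8  # illustrative sizes

orig_emb = rng.normal(size=(ORIG_VOCAB, DIM))  # frozen general-domain table
ext_emb = np.zeros((EXT_VOCAB, DIM))           # new table, trained on domain data

def embed(token_ids):
    # Route ids below ORIG_VOCAB to the original table and the rest
    # to the extension table, so old tokens keep their embeddings.
    token_ids = np.asarray(token_ids)
    out = np.empty((len(token_ids), DIM))
    is_new = token_ids >= ORIG_VOCAB
    out[~is_new] = orig_emb[token_ids[~is_new]]
    out[is_new] = ext_emb[token_ids[is_new] - ORIG_VOCAB]
    return out

vecs = embed([3, 101, 57])  # mixed general and domain-specific ids
print(vecs.shape)           # (3, 8)
```

Only the extension table (and any extension module) would be updated during domain training, which is what keeps the resource cost low.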

CheckNet: Secure Inference on Untrusted Devices

no code implementations · 17 Jun 2019 · Marcus Comiter, Surat Teerapittayanon, H. T. Kung

CheckNet is like a checksum for neural network inference: it verifies the integrity of the inference computation performed by untrusted devices to 1) ensure the inference has actually been performed, and 2) ensure the inference has not been manipulated by an attacker.
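The checksum analogy can be conveyed with a toy spot-check sketch: the verifier recomputes a few randomly chosen output coordinates locally and compares them against what the untrusted device returned. This is only the general verification idea under illustrative assumptions; CheckNet itself embeds the checks into the network rather than recomputing outputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy linear "model" whose inference is outsourced
# (weights and sizes are illustrative).
W = rng.normal(size=(16, 10))
x = rng.normal(size=(16,))

def untrusted_inference(x, tamper=False):
    y = x @ W
    if tamper:            # an attacker manipulates one output coordinate
        y = y.copy()
        y[0] += 5.0
    return y

def verify(x, y, n_checks=3):
    # Recompute a few randomly chosen output coordinates locally;
    # a mismatch reveals manipulation.
    idx = rng.choice(len(y), size=n_checks, replace=False)
    return np.allclose(y[idx], x @ W[:, idx])

honest = untrusted_inference(x)
print(verify(x, honest))  # True: honest inference passes

tampered = untrusted_inference(x, tamper=True)
print(verify(x, tampered, n_checks=10))  # False when every coordinate is checked
```

With fewer checks than outputs, detection is probabilistic, which is the usual trade-off between verification cost and assurance.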
