Search Results for author: James Lin

Found 4 papers, 2 papers with code

FTL: A universal framework for training low-bit DNNs via Feature Transfer

no code implementations ECCV 2020 Kunyuan Du, Ya Zhang, Haibing Guan, Qi Tian, Shenggan Cheng, James Lin

Compared with low-bit models trained directly, the proposed framework brings 0.5% to 3.4% accuracy gains to three different quantization schemes.

Quantization Transfer Learning

ParaFold: Paralleling AlphaFold for Large-Scale Predictions

2 code implementations 11 Nov 2021 Bozitao Zhong, Xiaoming Su, Minhua Wen, Sichen Zuo, Liang Hong, James Lin

We evaluated the accuracy and efficiency of optimizations on CPUs and GPUs, and showed the large-scale prediction capability by running ParaFold inferences of 19,704 small proteins in five hours on one NVIDIA DGX-2.

Protein Folding

Exploiting News Article Structure for Automatic Corpus Generation of Entailment Datasets

1 code implementation 22 Oct 2020 Jan Christian Blaise Cruz, Jose Kristian Resabal, James Lin, Dan John Velasco, Charibeth Cheng

Lastly, we perform analyses on transfer learning techniques to shed light on their true performance when operating in low-data domains through the use of degradation tests.

Benchmarking Natural Language Inference +2

Training Keyword Spotters with Limited and Synthesized Speech Data

no code implementations 31 Jan 2020 James Lin, Kevin Kilgour, Dominik Roblek, Matthew Sharifi

With the rise of low power speech-enabled devices, there is a growing demand to quickly produce models for recognizing arbitrary sets of keywords.

Ranked #10 on Keyword Spotting on Google Speech Commands (Google Speech Commands V2 12 metric)

Keyword Spotting
