no code implementations • 12 Dec 2022 • Jason W. Anderson, Marcin Ziolkowski, Ken Kennedy, Amy W. Apon
Finally, we found that in use cases that benefit from incremental training or model specialization, pretraining a base model on synthetic images provided a sizeable reduction in the training cost of transfer learning, allowing up to 90% of the model training to be front-loaded.
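The front-loading idea described above can be loosely illustrated in code: pretrain a feature extractor on cheap synthetic data, then freeze it and fit only a small head on the target task, so most parameters are learned up front. This is a hedged NumPy sketch under assumptions of ours (a linear extractor and least-squares fits), not the paper's actual pipeline or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretraining" phase: learn a feature extractor W on plentiful
# synthetic data (hypothetical stand-in for synthetic images).
X_syn = rng.normal(size=(500, 20))
Y_syn = rng.normal(size=(500, 10))
W, *_ = np.linalg.lstsq(X_syn, Y_syn, rcond=None)  # 20x10 extractor

# "Transfer" phase: freeze W; fit only a small head on scarce target data.
X_real = rng.normal(size=(50, 20))
y_real = rng.normal(size=(50, 1))
feats = X_real @ W                                     # frozen features
head, *_ = np.linalg.lstsq(feats, y_real, rcond=None)  # 10x1 head

# Most parameters were fit during pretraining, i.e. front-loaded.
frozen, trained = W.size, head.size
print(f"front-loaded parameters: {frozen / (frozen + trained):.0%}")
```

Here the fraction of front-loaded parameters is a proxy for front-loaded training cost; in the paper's deep-learning setting the saving comes from reusing the pretrained base rather than retraining it.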
no code implementations • 25 Oct 2016 • Chris Gropp, Alexander Herzog, Ilya Safro, Paul W. Wilson, Amy W. Apon
In this paper, we introduce and empirically analyze Clustered Latent Dirichlet Allocation (CLDA), a method for extracting dynamic latent topics from a collection of documents.
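A pipeline in the spirit of CLDA can be sketched as follows: fit a separate LDA model on each time slice of the corpus, then cluster the resulting topic-word distributions to link local topics into dynamic topics. This is only an illustrative sketch using scikit-learn; the slice boundaries, component counts, and clustering choice here are our assumptions, not the paper's exact method.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.cluster import KMeans

# Toy corpus split into two "time slices" (hypothetical data).
slices = [
    ["apples oranges fruit market", "fruit apples bananas", "market fruit price"],
    ["stock market price trading", "trading shares price", "market shares stock"],
]

# Shared vocabulary so topic-word vectors are comparable across slices.
vectorizer = CountVectorizer()
vectorizer.fit([doc for sl in slices for doc in sl])

# Fit a local LDA per slice and collect normalized topic-word distributions.
local_topics = []
for docs in slices:
    X = vectorizer.transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
    topics = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
    local_topics.append(topics)

# Cluster all local topics into global "dynamic" topics.
all_topics = np.vstack(local_topics)  # shape: (4, vocab_size)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(all_topics)
print(km.labels_)  # cluster id assigned to each local topic
```

Clustering topic-word distributions rather than documents is what lets topics found independently in different slices be identified as the same evolving topic.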