no code implementations • 1 Feb 2024 • Aaron Mullen, Samuel E. Armstrong, Jasmine Perdeh, Bjorn Bauer, Jeffrey Talbert, V. K. Cody Bumgardner
This article proposes a system that combines results from several types of models, each trained on a different data signal.
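The abstract does not specify how the per-model results are fused; a minimal sketch of one common approach, soft voting (averaging class probabilities across models), with hypothetical stand-in models for the different data signals:

```python
import numpy as np

# Hypothetical stand-ins for models trained on different data signals;
# each returns class probabilities for one input. The real models and
# signals are assumptions, not taken from the paper.
def signal_a_model(x):
    return np.array([0.7, 0.3])

def signal_b_model(x):
    return np.array([0.6, 0.4])

def signal_c_model(x):
    return np.array([0.4, 0.6])

def combine(models, x, weights=None):
    """Average class probabilities across models (soft voting)."""
    probs = np.stack([m(x) for m in models])
    return np.average(probs, axis=0, weights=weights)

models = [signal_a_model, signal_b_model, signal_c_model]
fused = combine(models, x=None)
print(fused)                 # averaged class probabilities
print(int(fused.argmax()))   # final class decision
```

The optional `weights` argument allows models with stronger validation performance to contribute more to the combined result.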
1 code implementation • 3 Aug 2023 • V. K. Cody Bumgardner, Aaron Mullen, Sam Armstrong, Caylin Hickey, Jeff Talbert
This paper introduces an approach that combines the language reasoning capabilities of large language models (LLMs) with the benefits of local training to tackle complex, domain-specific tasks.