no code implementations • 29 Nov 2023 • Liya Wang, Jason Chou, Xin Zhou, Alex Tien, Diane M Baumgartner
The advent of ChatGPT and GPT-4 has captivated the world with large language models (LLMs), which demonstrate exceptional performance in question answering, summarization, and content generation.
no code implementations • 16 May 2023 • Liya Wang, Jason Chou, Dave Rouck, Alex Tien, Diane M Baumgartner
Learning effective sentence representations is crucial for many Natural Language Processing (NLP) tasks, including semantic search, semantic textual similarity (STS), and clustering.
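A minimal sketch of how sentence representations support semantic similarity tasks: mean-pool word vectors into a sentence vector and compare sentences by cosine similarity. The word vectors below are hypothetical toy values for illustration only; a real system would use embeddings learned by a trained model.

```python
import numpy as np

# Hypothetical toy word vectors (a trained model would supply these).
word_vecs = {
    "cat":   np.array([0.9, 0.1, 0.0]),
    "dog":   np.array([0.8, 0.2, 0.0]),
    "car":   np.array([0.0, 0.1, 0.9]),
    "sat":   np.array([0.3, 0.7, 0.1]),
    "drove": np.array([0.1, 0.2, 0.8]),
}

def sentence_vec(sentence):
    """Mean-pool word vectors into a single sentence representation."""
    vecs = [word_vecs[w] for w in sentence.split() if w in word_vecs]
    return np.mean(vecs, axis=0)

def cosine(a, b):
    """Cosine similarity between two sentence vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

s1 = sentence_vec("cat sat")
s2 = sentence_vec("dog sat")
s3 = sentence_vec("car drove")

# Semantically close sentences get a higher similarity score.
print(cosine(s1, s2) > cosine(s1, s3))
```

The same pattern underlies semantic search and clustering: once sentences are mapped to vectors, both reduce to nearest-neighbor queries in the embedding space.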
no code implementations • 4 Nov 2021 • Liya Wang, Alex Tien, Jason Chou
Traffic, demand, weather, and traffic management actions are all critical inputs to any prediction model.
no code implementations • 24 Apr 2019 • Jason Chou
The variational autoencoder (VAE) framework is a popular option for training unsupervised generative models, featuring ease of training and a latent representation of the data.
1 code implementation • 23 Apr 2019 • Jason Chou, Gautam Hathi
The variational autoencoder (VAE) framework remains a popular option for training unsupervised generative models, especially for discrete data, where generative adversarial networks (GANs) require workarounds to provide gradients for the generator.
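A minimal numpy sketch of the two pieces that make VAE training straightforward: the single-sample ELBO for a Bernoulli decoder (reconstruction log-likelihood minus a closed-form KL term) and the reparameterization trick, which keeps the latent sample differentiable with respect to the encoder parameters. All shapes and values are hypothetical, and a real implementation would use an autodiff framework rather than raw numpy.

```python
import numpy as np

rng = np.random.default_rng(0)

def elbo(x, mu, log_var, recon_logits):
    """Single-sample ELBO for a Bernoulli VAE:
    reconstruction log-likelihood minus KL(q(z|x) || N(0, I))."""
    # Bernoulli log-likelihood from the decoder's logits.
    p = 1.0 / (1.0 + np.exp(-recon_logits))
    recon_ll = np.sum(x * np.log(p + 1e-9) + (1 - x) * np.log(1 - p + 1e-9))
    # Closed-form KL between N(mu, diag(exp(log_var))) and N(0, I).
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)
    return recon_ll - kl

# Reparameterization trick: z = mu + sigma * eps. The randomness lives
# in eps, so gradients flow through mu and log_var -- the step that GANs
# lack for discrete data, where sampling blocks the generator's gradient.
mu, log_var = np.zeros(2), np.zeros(2)
eps = rng.standard_normal(2)
z = mu + np.exp(0.5 * log_var) * eps
```

With a standard-normal posterior (`mu = 0`, `log_var = 0`) the KL term vanishes, so the ELBO reduces to the reconstruction likelihood; maximizing it trades reconstruction quality against staying close to the prior.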