Search Results for author: Winnie Xu

Found 10 papers, 7 papers with code

KTO: Model Alignment as Prospect Theoretic Optimization

1 code implementation • 2 Feb 2024 • Kawin Ethayarajh, Winnie Xu, Niklas Muennighoff, Dan Jurafsky, Douwe Kiela

Kahneman & Tversky's $\textit{prospect theory}$ tells us that humans perceive random variables in a biased but well-defined manner; for example, humans are famously loss-averse.
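The loss aversion the abstract mentions can be illustrated with the classic Kahneman-Tversky value function (a sketch of the underlying prospect theory, not the KTO training objective itself); the parameter values below are the standard 1992 estimates.

```python
# Kahneman-Tversky value function from prospect theory (illustrative only,
# not KTO's alignment loss). alpha = beta = 0.88 and lam = 2.25 are the
# classic Tversky & Kahneman (1992) estimates.

def kt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Perceived value of a gain/loss x relative to a reference point of 0."""
    if x >= 0:
        return x ** alpha          # gains are perceived concavely
    return -lam * (-x) ** beta     # losses loom larger (loss aversion)

# Loss aversion: a loss of 10 feels worse than a gain of 10 feels good.
print(kt_value(10))   # ~7.59
print(kt_value(-10))  # ~-17.07
```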

Attribute

Deep Latent State Space Models for Time-Series Generation

1 code implementation • 24 Dec 2022 • Linqi Zhou, Michael Poli, Winnie Xu, Stefano Massaroli, Stefano Ermon

Methods based on ordinary differential equations (ODEs) are widely used to build generative models of time-series.
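The general recipe behind ODE-based time-series generation can be sketched as: sample an initial latent state, roll out a vector field with a numerical solver, and decode each state to an observation. The vector field and decoder below are fixed random linear maps standing in for learned networks; this is a toy Euler discretization, not the paper's model.

```python
import numpy as np

# Toy ODE-based generative model for time series: z(0) ~ prior, then
# Euler steps of dz/dt = f(z), decoding each latent state. A (vector
# field) and W (decoder) are random stand-ins for learned components.
rng = np.random.default_rng(1)
d = 4                                  # latent dimension (illustrative)
A = 0.1 * rng.standard_normal((d, d))  # stand-in for a learned vector field
W = rng.standard_normal((1, d))        # stand-in for a learned decoder

def generate_series(T=50, dt=0.1):
    z = rng.standard_normal(d)         # sample initial latent state
    series = []
    for _ in range(T):
        z = z + dt * (A @ z)           # Euler step of dz/dt = f(z)
        series.append(float(W @ z))    # decode latent to observation
    return series

series = generate_series()
```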

Time Series • Time Series Analysis • +1

Multi-Game Decision Transformers

1 code implementation • 30 May 2022 • Kuang-Huei Lee, Ofir Nachum, Mengjiao Yang, Lisa Lee, Daniel Freeman, Winnie Xu, Sergio Guadarrama, Ian Fischer, Eric Jang, Henryk Michalewski, Igor Mordatch

Specifically, we show that a single transformer-based model - with a single set of weights - trained purely offline can play a suite of up to 46 Atari games simultaneously at close-to-human performance.

Atari Games • Offline RL

Self-Similarity Priors: Neural Collages as Differentiable Fractal Representations

no code implementations • 15 Apr 2022 • Michael Poli, Winnie Xu, Stefano Massaroli, Chenlin Meng, Kuno Kim, Stefano Ermon

We investigate how to leverage the representations produced by Neural Collages in various tasks, including data compression and generation.

Data Compression

NoisyMix: Boosting Model Robustness to Common Corruptions

no code implementations • 2 Feb 2022 • N. Benjamin Erichson, Soon Hoe Lim, Winnie Xu, Francisco Utrera, Ziang Cao, Michael W. Mahoney

For many real-world applications, obtaining stable and robust statistical performance is more important than simply achieving state-of-the-art predictive test accuracy, and thus robustness of neural networks is an increasingly important topic.

Data Augmentation

Noisy Feature Mixup

2 code implementations • ICLR 2022 • Soon Hoe Lim, N. Benjamin Erichson, Francisco Utrera, Winnie Xu, Michael W. Mahoney

We introduce Noisy Feature Mixup (NFM), an inexpensive yet effective data augmentation method that combines the best of interpolation-based training and noise injection schemes.
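The combination of interpolation and noise injection can be sketched as follows: mix a pair of examples as in standard mixup, then perturb the mixed features with additive and multiplicative noise. This is a minimal input-level sketch (the method also applies to hidden-layer features), and the noise scales here are illustrative, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_feature_mixup(x1, y1, x2, y2, alpha=1.0,
                        sigma_add=0.1, sigma_mult=0.1):
    """Mixup interpolation followed by noise injection (illustrative)."""
    lam = rng.beta(alpha, alpha)                 # mixup coefficient
    x = lam * x1 + (1 - lam) * x2                # interpolate features
    y = lam * y1 + (1 - lam) * y2                # interpolate labels
    add = sigma_add * rng.standard_normal(x.shape)
    mult = sigma_mult * rng.standard_normal(x.shape)
    return (1 + mult) * x + add, y               # inject mult./add. noise

x_aug, y_aug = noisy_feature_mixup(np.zeros(3), 0.0, np.ones(3), 1.0)
```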

Data Augmentation
