1 code implementation • 23 May 2022 • Mirac Suzgun, Luke Melas-Kyriazi, Dan Jurafsky
We propose a method for arbitrary textual style transfer (TST), the task of transforming a text into any given style, utilizing general-purpose pre-trained language models.
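A minimal sketch of the zero-shot prompting idea behind this line of work, assuming a simple quote-delimited template and `gpt2-large`; the template, model choice, and sampling settings are illustrative assumptions, and the paper's candidate-reranking step is omitted here.

```python
# Sketch: zero-shot style transfer by prompting a general-purpose
# pre-trained LM and sampling several candidate rewrites.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2-large")
model = AutoModelForCausalLM.from_pretrained("gpt2-large")

def transfer_style(text: str, target_style: str, num_candidates: int = 4):
    """Rewrite `text` in `target_style` by sampling LM continuations."""
    # Hypothetical prompt template; not necessarily the paper's exact prompt.
    prompt = (f'Here is some text: "{text}". Here is a rewrite of the text, '
              f'which is more {target_style}: "')
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        do_sample=True,
        top_p=0.9,
        max_new_tokens=60,
        num_return_sequences=num_candidates,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only newly generated tokens, truncated at the closing quote.
    new_tokens = outputs[:, inputs["input_ids"].shape[1]:]
    return [tokenizer.decode(t, skip_special_tokens=True).split('"')[0]
            for t in new_tokens]

print(transfer_style("u gotta see this movie", "formal"))
```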
1 code implementation • LTEDI (ACL) 2022 • Kyle Swanson, Joy Hsu, Mirac Suzgun
Using a dataset of Reddit posts that exhibit stress, we demonstrate the ability of our MCTS algorithm to identify interpretable explanations for a person's stress in both a context-dependent and a context-independent manner.
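A minimal sketch of how Monte Carlo tree search can select a token subset as an interpretable explanation; the toy `stress_score` lexicon stands in for a trained stress classifier, and the node structure, reward (predictiveness minus a length penalty), and UCB constant are all illustrative assumptions rather than the paper's method.

```python
# Sketch: MCTS over token subsets; each tree edge drops one token,
# and the search favors small subsets that remain predictive of stress.
import math, random

def stress_score(tokens):
    # Hypothetical stand-in for a trained classifier.
    lexicon = {"deadline", "overwhelmed", "panic"}
    return sum(t in lexicon for t in tokens) / max(len(tokens), 1)

class Node:
    def __init__(self, kept, parent=None):
        self.kept = kept                  # indices of tokens kept
        self.parent, self.children = parent, []
        self.visits, self.value = 0, 0.0

def ucb(node, c=1.4):
    if node.visits == 0:
        return float("inf")
    return node.value / node.visits + c * math.sqrt(
        math.log(node.parent.visits) / node.visits)

def mcts_explain(tokens, iters=200):
    root = Node(kept=tuple(range(len(tokens))))
    for _ in range(iters):
        node = root
        while node.children:              # selection via UCB
            node = max(node.children, key=ucb)
        if len(node.kept) > 1:            # expansion: drop one more token
            node.children = [Node(tuple(k for k in node.kept if k != i), node)
                             for i in node.kept]
            node = random.choice(node.children)
        kept = [tokens[i] for i in node.kept]
        reward = stress_score(kept) - 0.01 * len(node.kept)
        while node:                       # backpropagation
            node.visits += 1
            node.value += reward
            node = node.parent
    node = root                           # read out most-visited path
    while node.children:
        node = max(node.children, key=lambda n: n.visits)
    return [tokens[i] for i in node.kept]

print(mcts_explain("i am so overwhelmed by this deadline".split()))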
2 code implementations • 8 Nov 2019 • Mirac Suzgun, Sebastian Gehrmann, Yonatan Belinkov, Stuart M. Shieber
We introduce three memory-augmented Recurrent Neural Networks (MARNNs) and explore their capabilities on a series of simple language modeling tasks whose solutions require stack-based mechanisms.
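A minimal sketch of one memory-augmented design in this spirit: an RNN cell that reads the stack top and emits soft push/pop weights to update a differentiable stack. The dimensions, gating, and single-step stack update below are illustrative assumptions, not the paper's exact MARNN architectures.

```python
# Sketch: RNN cell coupled to a differentiable stack via soft push/pop.
import torch
import torch.nn as nn

class StackRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size, stack_depth=10, stack_dim=8):
        super().__init__()
        self.rnn = nn.RNNCell(input_size + stack_dim, hidden_size)
        self.action = nn.Linear(hidden_size, 2)     # soft push vs. pop
        self.to_stack = nn.Linear(hidden_size, stack_dim)

    def forward(self, x, h, stack):
        # Condition the hidden update on the current top of the stack.
        h = self.rnn(torch.cat([x, stack[:, 0]], dim=-1), h)
        push, pop = torch.softmax(self.action(h), dim=-1).unbind(-1)
        new_top = torch.tanh(self.to_stack(h))
        # Soft update: push shifts entries down, pop shifts them up.
        pushed = torch.cat([new_top.unsqueeze(1), stack[:, :-1]], dim=1)
        popped = torch.cat([stack[:, 1:], torch.zeros_like(stack[:, :1])], dim=1)
        stack = push[:, None, None] * pushed + pop[:, None, None] * popped
        return h, stack

cell = StackRNNCell(input_size=4, hidden_size=16)
h, stack = torch.zeros(2, 16), torch.zeros(2, 10, 8)
h, stack = cell(torch.randn(2, 4), h, stack)
```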
no code implementations • WS 2019 • Mirac Suzgun, Sebastian Gehrmann, Yonatan Belinkov, Stuart M. Shieber
In this paper, we systematically assess the ability of standard recurrent networks to perform dynamic counting and to encode hierarchical representations.
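A minimal sketch of a dynamic-counting probe of the kind such assessments use: an LSTM reads a Dyck-1-style bracket string and predicts at each step whether the running counter is positive (i.e., whether a closing bracket is currently legal). The task framing, data generator, and hyperparameters are illustrative assumptions.

```python
# Sketch: train an LSTM on per-step counting judgments over bracket strings.
import random
import torch
import torch.nn as nn

def dyck1(max_len=20):
    """Sample a bracket prefix and per-step 'counter > 0' labels."""
    s, depth, labels = [], 0, []
    for _ in range(random.randint(2, max_len)):
        if depth == 0 or random.random() < 0.5:
            s.append(0); depth += 1       # '('
        else:
            s.append(1); depth -= 1       # ')'
        labels.append(1.0 if depth > 0 else 0.0)
    return torch.tensor(s), torch.tensor(labels)

class Counter(nn.Module):
    def __init__(self, hidden=8):
        super().__init__()
        self.emb = nn.Embedding(2, 4)
        self.lstm = nn.LSTM(4, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.lstm(self.emb(x).unsqueeze(0))
        return self.out(h).squeeze(0).squeeze(-1)

model, loss_fn = Counter(), nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(500):
    x, y = dyck1()
    loss = loss_fn(model(x), y)
    opt.zero_grad(); loss.backward(); opt.step()
```

Generalization would then be tested on strings longer than any seen in training, which is where counting ability (rather than memorization) shows up.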
1 code implementation • WS 2019 • Mirac Suzgun, Yonatan Belinkov, Stuart M. Shieber
Recurrent Neural Networks (RNNs) are theoretically Turing-complete and have established themselves as a dominant model for language processing.