1 code implementation • 10 Oct 2023 • Aditya R. Vaidya, Javier Turek, Alexander G. Huth
In contrast with these findings, we present a scenario in which the performance of humans and LMs diverges.
no code implementations • 18 Apr 2023 • Shantanu Mandal, Adhrik Chethan, Vahid Janfaza, S M Farabi Mahmud, Todd A Anderson, Javier Turek, Jesmin Jahan Tithi, Abdullah Muzahid
As software systems grow in complexity and scale, the number of configurations and associated specifications required to ensure correct operation can become prohibitively large and difficult to manage manually.
no code implementations • 2 Nov 2022 • Shantanu Mandal, Todd A. Anderson, Javier Turek, Justin Gottschlich, Abdullah Muzahid
In this paper, we present a novel formulation of program synthesis as a continuous optimization problem and use a state-of-the-art evolutionary approach, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), to solve it.
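A minimal sketch of the core idea — encoding a program as a continuous parameter vector and searching that space with an evolution strategy. This toy uses a simple (mu, lambda) ES rather than full CMA-ES, and the polynomial "program" encoding is an illustrative assumption, not the paper's representation:

```python
import numpy as np

# Toy example (assumed setup): "synthesize" a program computing
# f(x) = 3*x + 2 from input/output examples, by encoding the program
# as a continuous vector of polynomial coefficients and optimizing it
# with a simple (mu, lambda) evolution strategy (a stand-in for CMA-ES).

rng = np.random.default_rng(0)
examples = [(x, 3 * x + 2) for x in range(-5, 6)]  # target behavior

def run_program(theta, x):
    # Decode the continuous vector as coefficients of a degree-2 polynomial.
    return theta[0] + theta[1] * x + theta[2] * x ** 2

def fitness(theta):
    # Negative squared error over the examples (higher is better).
    return -sum((run_program(theta, x) - y) ** 2 for x, y in examples)

mu, lam, sigma = 5, 20, 0.5
mean = np.zeros(3)
for gen in range(200):
    pop = mean + sigma * rng.standard_normal((lam, 3))  # sample offspring
    pop = sorted(pop, key=fitness, reverse=True)        # rank by fitness
    mean = np.mean(pop[:mu], axis=0)                    # recombine best mu
    sigma *= 0.99                                       # anneal step size

print(np.round(mean, 2))  # should approach the true coefficients [2, 3, 0]
```

CMA-ES improves on this sketch by additionally adapting a full covariance matrix for the sampling distribution, which is what makes it effective on ill-conditioned search landscapes.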
no code implementations • ACL 2021 • Richard Antonello, Nicole Beckage, Javier Turek, Alexander Huth
Here we present a general fine-tuning method that we call information gain filtration for improving the overall training efficiency and final performance of language model fine-tuning.
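The filtration idea can be sketched as scoring candidate fine-tuning examples and keeping only the highest-scoring subset. This is a hedged illustration, not the authors' method: in the paper the information gain of an example is estimated by a learned predictor, whereas the scorer below stands in with a hypothetical per-example loss heuristic:

```python
import numpy as np

# Hedged sketch of data filtration for fine-tuning (not the authors'
# exact pipeline): rank candidate training examples by an estimate of
# how informative they are, then fine-tune only on the top fraction.

rng = np.random.default_rng(1)

def estimated_info_gain(example, loss_fn):
    # Hypothetical scorer: the paper learns a predictor of information
    # gain; here we stand in with the example's current loss, on the
    # heuristic that high-loss examples are more informative.
    return loss_fn(example)

def filter_examples(examples, loss_fn, keep_fraction=0.25):
    scores = [estimated_info_gain(ex, loss_fn) for ex in examples]
    k = max(1, int(len(examples) * keep_fraction))
    top = np.argsort(scores)[::-1][:k]  # indices of highest-scoring examples
    return [examples[i] for i in top]

# Toy demo: each "example" is just a number standing in for its loss.
examples = list(rng.uniform(0, 1, size=20))
kept = filter_examples(examples, loss_fn=lambda ex: ex, keep_fraction=0.25)
print(len(kept))  # 5 of 20 examples survive the filter
```

Filtering before fine-tuning trades a one-time scoring pass for fewer gradient updates, which is where the claimed efficiency gain comes from.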
1 code implementation • NeurIPS 2021 • Richard Antonello, Javier Turek, Vy Vo, Alexander Huth
We find that this representation embedding can predict how well each individual feature space maps to human brain responses to natural language stimuli recorded using fMRI.
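The underlying encoding-model setup can be illustrated with synthetic data: fit a ridge regression from a stimulus feature space to voxel responses, then score the feature space by how well held-out predictions correlate with the measured response. All dimensions and the data here are simulated assumptions for illustration, not the paper's fMRI pipeline:

```python
import numpy as np

# Hedged sketch (simulated data, assumed setup): an encoding model maps
# a stimulus feature space to voxel responses via ridge regression; a
# feature space is "better" for a voxel when its cross-validated
# predictions correlate more strongly with the measured response.

rng = np.random.default_rng(2)
n_time, n_feat, n_vox = 200, 10, 5

X = rng.standard_normal((n_time, n_feat))               # stimulus features
W = rng.standard_normal((n_feat, n_vox))                # hidden true mapping
Y = X @ W + 0.5 * rng.standard_normal((n_time, n_vox))  # noisy "voxel" data

def ridge_fit(X, Y, alpha=1.0):
    # Closed-form ridge solution: (X'X + alpha*I)^{-1} X'Y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

# Train/test split; score = per-voxel correlation on held-out data.
Xtr, Xte, Ytr, Yte = X[:150], X[150:], Y[:150], Y[150:]
B = ridge_fit(Xtr, Ytr)
pred = Xte @ B
score = [np.corrcoef(pred[:, v], Yte[:, v])[0, 1] for v in range(n_vox)]
print(np.round(score, 2))
```

Repeating this scoring across many candidate feature spaces yields a performance profile per space, which is the quantity a representation embedding could then be trained to predict.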
1 code implementation • NeurIPS 2019 • Mejbah Alam, Justin Gottschlich, Nesime Tatbul, Javier Turek, Timothy Mattson, Abdullah Muzahid
This is, in part, due to the emergence of a wide range of novel techniques in machine learning.