no code implementations • 10 Oct 2024 • Mathis Pink, Vy A. Vo, Qinyuan Wu, Jianing Mu, Javier S. Turek, Uri Hasson, Kenneth A. Norman, Sebastian Michelmann, Alexander Huth, Mariya Toneva
To address the gap in evaluating memory in LLMs, we introduce Sequence Order Recall Tasks (SORT), which we adapt from tasks used to study episodic memory in cognitive psychology.
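Concretely, a SORT-style item presents two segments drawn from the same text and asks which occurred first. The sketch below shows one way such items could be constructed and scored; the function names and the `predict` interface are illustrative assumptions, not the paper's actual benchmark code.

```python
import random

def make_sort_items(segments, n_items, seed=0):
    """Build order-recall items: each item is a pair of segments from the
    same source plus a label saying which of the pair appeared earlier."""
    rng = random.Random(seed)
    items = []
    for _ in range(n_items):
        i, j = sorted(rng.sample(range(len(segments)), 2))  # i < j in source order
        if rng.random() < 0.5:
            items.append((segments[i], segments[j], "first"))
        else:
            items.append((segments[j], segments[i], "second"))
    return items

def score(items, predict):
    """Accuracy of a predictor that answers 'first' or 'second' for
    which of the two presented segments came earlier in the source."""
    return sum(predict(a, b) == label for a, b, label in items) / len(items)
```

Chance performance is 50%, so accuracy above that indicates the model retains some order information about the source text.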
no code implementations • 12 May 2021 • Hsiang-Yun Sherry Chien, Javier S. Turek, Nicole Beckage, Vy A. Vo, Christopher J. Honey, Ted L. Willke
Altogether, we found that an LSTM with the proposed forget gate can learn long-term dependencies, outperforming other recurrent networks in multiple domains; such a gating mechanism can be integrated into other architectures to improve the learning of long-timescale information in recurrent neural networks.
no code implementations • NeurIPS 2020 • Shailee Jain, Vy Vo, Shivangi Mahto, Amanda LeBel, Javier S. Turek, Alexander Huth
To understand how the human brain represents this information, one approach is to build encoding models that predict fMRI responses to natural language using representations extracted from neural network language models (LMs).
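In its simplest form, such an encoding model is a regularized linear map from language-model features to voxel responses. Below is a generic closed-form ridge-regression sketch of that idea; it illustrates the standard technique, not the authors' specific pipeline.

```python
import numpy as np

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression: learn weights mapping stimulus
    features X (time x features) to brain responses Y (time x voxels)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

def predict(X, W):
    """Predicted voxel responses for new stimulus features."""
    return X @ W
```

In practice, encoding-model studies typically select `alpha` per voxel by cross-validation and evaluate by correlating predicted and held-out responses.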
no code implementations • ICLR 2021 • Shivangi Mahto, Vy A. Vo, Javier S. Turek, Alexander G. Huth
Earlier work has demonstrated that dependencies in natural language tend to decay with distance between words according to a power law.
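A power-law decay y ≈ c * d^(-alpha) is a straight line in log-log space, so the exponent can be estimated with an ordinary line fit. The sketch below illustrates that generic estimation trick; it is not the paper's own analysis code.

```python
import numpy as np

def fit_power_law(d, y):
    """Estimate alpha and c in y ≈ c * d**(-alpha) by fitting a line
    to (log d, log y); the slope gives -alpha, the intercept gives log c."""
    slope, intercept = np.polyfit(np.log(d), np.log(y), 1)
    return -slope, np.exp(intercept)
```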
1 code implementation • ICML 2020 • Javier S. Turek, Shailee Jain, Vy Vo, Mihai Capota, Alexander G. Huth, Theodore L. Willke
In this work, we explore the delayed-RNN, a single-layer RNN with a delay between its input and output.
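One way to realize such a delay is to run the network for `delay` extra steps (feeding zeros once the sequence ends) and align the output at step t with the input at step t - delay. Below is a minimal tanh-RNN sketch of that idea; the paper's exact architecture and training setup may differ.

```python
import numpy as np

def delayed_rnn(xs, Wx, Wh, delay):
    """Single-layer tanh RNN that emits its readout `delay` steps after
    each input: run len(xs) + delay steps, padding with zero inputs at
    the end, and discard the first `delay` hidden states."""
    h = np.zeros(Wh.shape[0])
    pad = np.zeros_like(xs[0])
    states = []
    for t in range(len(xs) + delay):
        x = xs[t] if t < len(xs) else pad
        h = np.tanh(Wx @ x + Wh @ h)
        states.append(h)
    return np.stack(states[delay:])  # row t now corresponds to input t
```

With delay = 0 this reduces to an ordinary single-layer RNN; a positive delay gives the network extra steps of recurrent computation before each output is read out.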
no code implementations • 22 Aug 2019 • Shantanu Mandal, Todd A. Anderson, Javier S. Turek, Justin Gottschlich, Shengtian Zhou, Abdullah Muzahid
The problem of automatic software generation is known as Machine Programming.
no code implementations • 11 Sep 2018 • Michael J. Anderson, Jonathan I. Tamir, Javier S. Turek, Marcus T. Alley, Theodore L. Willke, Shreyas S. Vasanawala, Michael Lustig
Our improvements to the pipeline on a single machine provide a 3x overall reconstruction speedup, which allowed us to add algorithmic changes that improve image quality.
1 code implementation • CVPR 2018 • Javier S. Turek, Alexander Huth
Thus, for large point sets, it is common to use a low-rank approximation to the distance matrix, which fits in memory and can be analyzed efficiently using methods such as multidimensional scaling (MDS).
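One standard route to such low-rank structure is a landmark (Nystrom-style) scheme: store only the distances from every point to a small landmark subset, an n x m matrix instead of the full n x n one. The sketch below illustrates that generic scheme and is not necessarily the specific algorithm proposed in the paper.

```python
import numpy as np

def landmark_distances(points, n_landmarks, seed=0):
    """Low-memory alternative to the full pairwise distance matrix:
    compute Euclidean distances from every point to a random landmark
    subset, yielding an (n_points x n_landmarks) matrix."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(points), n_landmarks, replace=False)
    landmarks = points[idx]
    diff = points[:, None, :] - landmarks[None, :, :]
    return np.sqrt((diff ** 2).sum(-1)), idx
```

Memory drops from O(n^2) to O(nm), which is what makes MDS-style analyses feasible when n is in the millions.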
no code implementations • 29 Sep 2016 • Hejia Zhang, Po-Hsuan Chen, Janice Chen, Xia Zhu, Javier S. Turek, Theodore L. Willke, Uri Hasson, Peter J. Ramadge
In this work, we examine a searchlight based shared response model to identify shared information in small contiguous regions (searchlights) across the whole brain.
no code implementations • 17 Aug 2016 • Po-Hsuan Chen, Xia Zhu, Hejia Zhang, Javier S. Turek, Janice Chen, Theodore L. Willke, Uri Hasson, Peter J. Ramadge
We examine two ways to combine the ideas of a factor model and a searchlight based analysis to aggregate multi-subject fMRI data while preserving spatial locality.
no code implementations • 16 Aug 2016 • Michael J. Anderson, Mihai Capotă, Javier S. Turek, Xia Zhu, Theodore L. Willke, Yida Wang, Po-Hsuan Chen, Jeremy R. Manning, Peter J. Ramadge, Kenneth A. Norman
The scale of functional magnetic resonance imaging (fMRI) data is rapidly increasing as large multi-subject datasets become widely available and high-resolution scanners are adopted.
no code implementations • 1 Jul 2016 • Eran Treister, Javier S. Turek, Irad Yavneh
A multilevel framework is presented for efficiently solving such l1-regularized sparse optimization problems.
no code implementations • NeurIPS 2014 • Eran Treister, Javier S. Turek
Numerical experiments on both synthetic and real gene-expression data demonstrate that our approach outperforms existing state-of-the-art methods, especially for large-scale problems.