Search Results for author: Cory Shain

Found 13 papers, 5 papers with code

SentSpace: Large-Scale Benchmarking and Evaluation of Text using Cognitively Motivated Lexical, Syntactic, and Semantic Features

no code implementations NAACL (ACL) 2022 Greta Tuckute, Aalok Sathe, Mingye Wang, Harley Yoder, Cory Shain, Evelina Fedorenko

The modular design of SentSpace allows researchers to easily integrate their own feature computation into the pipeline while benefiting from a common framework for evaluation and visualization.

Benchmarking · Sentence

A Deep Learning Approach to Analyzing Continuous-Time Systems

2 code implementations 25 Sep 2022 Cory Shain, William Schuler

Scientists often use observational time series data to study complex natural processes, but regression analyses often assume simplistic dynamics.

Time Series · Time Series Analysis

CDRNN: Discovering Complex Dynamics in Human Language Processing

1 code implementation ACL 2021 Cory Shain

The human mind is a dynamical system, yet many analysis techniques used to study it are limited in their ability to capture the complex dynamics that may characterize mental processes.

regression

Coreference information guides human expectations during natural reading

no code implementations COLING 2020 Evan Jaffe, Cory Shain, William Schuler

Models of human sentence processing effort tend to focus on costs associated with retrieving structures and discourse referents from memory (memory-based) and/or on costs associated with anticipating upcoming words and structures based on contextual cues (expectation-based) (Levy, 2008).

Retrieval · Sentence

Acquiring language from speech by learning to remember and predict

1 code implementation CONLL 2020 Cory Shain, Micha Elsner

Classical accounts of child language learning invoke memory limits as a pressure to discover sparse, language-like representations of speech, while more recent proposals stress the importance of prediction for language learning.

Measuring the perceptual availability of phonological features during language acquisition using unsupervised binary stochastic autoencoders

no code implementations NAACL 2019 Cory Shain, Micha Elsner

In this paper, we deploy binary stochastic neural autoencoder networks as models of infant language learning in two typologically unrelated languages (Xitsonga and English).

Language Acquisition
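In outline, a binary stochastic autoencoder of the kind the paper describes maps each input frame to Bernoulli-sampled binary units and reconstructs the input from that code; the learned bits can then be inspected as candidate phonological features. A forward-pass-only numpy sketch of the sampling step (all names, dimensions, and the linear decoder are illustrative, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_stochastic_encode(x, W_enc, rng):
    """Map input frames to stochastic binary codes:
    sigmoid logits -> Bernoulli sample (illustrative forward pass only)."""
    probs = 1.0 / (1.0 + np.exp(-x @ W_enc))
    z = (rng.random(probs.shape) < probs).astype(float)
    return z, probs

def decode(z, W_dec):
    """Toy linear reconstruction from the binary code."""
    return z @ W_dec

# Toy "acoustic" frames: 10 frames of 8 features, compressed to a 4-bit code.
x = rng.standard_normal((10, 8))
W_enc = rng.standard_normal((8, 4))
W_dec = rng.standard_normal((4, 8))

z, probs = binary_stochastic_encode(x, W_enc, rng)
x_hat = decode(z, W_dec)
```

Because the latent units are hard 0/1 samples, training such a model in practice requires a gradient workaround (e.g. a straight-through or score-function estimator); the sketch above shows only the sampling bottleneck itself.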

A large-scale study of the effects of word frequency and predictability in naturalistic reading

1 code implementation NAACL 2019 Cory Shain

A number of psycholinguistic studies have factorially manipulated words' contextual predictabilities and corpus frequencies and shown separable effects of each on measures of human sentence processing, a pattern which has been used to support distinct mechanisms underlying prediction on the one hand and lexical retrieval on the other.

Retrieval · Sentence

Deconvolutional Time Series Regression: A Technique for Modeling Temporally Diffuse Effects

1 code implementation EMNLP 2018 Cory Shain, William Schuler

Researchers in computational psycholinguistics frequently use linear models to study time series data generated by human subjects.

regression · Time Series +1
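The core idea named in the title, modeling temporally diffuse effects by convolving each predictor with a fitted impulse response function rather than assuming an instantaneous linear effect, can be sketched as follows. The gamma-shaped kernel and all parameter values here are illustrative toy choices, not the paper's fitted model:

```python
import numpy as np

def gamma_irf(t, shape=2.0, rate=1.0):
    """Toy gamma-shaped impulse response: zero for non-positive offsets,
    rising then decaying for positive ones (parameters are illustrative)."""
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    pos = t > 0
    out[pos] = (rate ** shape) * t[pos] ** (shape - 1.0) * np.exp(-rate * t[pos])
    return out

def convolve_predictor(event_times, event_values, query_times):
    """Predicted response at each query time: every past event contributes
    its predictor value weighted by the IRF at its temporal offset."""
    offsets = query_times[:, None] - event_times[None, :]  # (queries, events)
    return gamma_irf(offsets) @ event_values               # (queries,)

# Word-onset times and a per-word predictor (e.g. surprisal), toy values.
events = np.array([0.0, 0.3, 0.7, 1.2])
values = np.array([1.0, 0.5, 2.0, 1.0])
queries = np.linspace(0.0, 2.0, 5)
y = convolve_predictor(events, values, queries)
```

In the deconvolutional setting the kernel parameters are estimated from data jointly with the regression weights, which is what lets the model recover how far an effect is smeared forward in time.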

Speech segmentation with a neural encoder model of working memory

no code implementations EMNLP 2017 Micha Elsner, Cory Shain

We present the first unsupervised LSTM speech segmenter as a cognitive model of the acquisition of words from unsegmented input.

Memory-Bounded Left-Corner Unsupervised Grammar Induction on Child-Directed Input

no code implementations COLING 2016 Cory Shain, William Bryce, Lifeng Jin, Victoria Krakovna, Finale Doshi-Velez, Timothy Miller, William Schuler, Lane Schwartz

This paper presents a new memory-bounded left-corner parsing model for unsupervised raw-text syntax induction, using unsupervised hierarchical hidden Markov models (UHHMM).

Language Acquisition · Sentence

Memory access during incremental sentence processing causes reading time latency

no code implementations WS 2016 Cory Shain, Marten Van Schijndel, Richard Futrell, Edward Gibson, William Schuler

Studies on the role of memory as a predictor of reading time latencies (1) differ in their predictions about when memory effects should occur in processing and (2) have had mixed results, with strong positive effects emerging from isolated constructed stimuli and weak or even negative effects emerging from naturally-occurring stimuli.

Sentence
