Search Results for author: Tim Nieradzik

Found 4 papers, 0 papers with code

How does BERT process disfluency?

no code implementations • SIGDIAL (ACL) 2021 • Ye Tian, Tim Nieradzik, Sepehr Jalali, Da-Shan Shiu

Analysis of the sentence embeddings of disfluent and fluent sentence pairs reveals that the deeper the layer, the more similar their representations become (exp2).

Sentence Embeddings +1
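The layer-wise comparison described in the snippet above can be sketched as follows. This is a minimal illustration, not the paper's code: it assumes per-layer sentence embeddings have already been extracted (random arrays stand in for BERT layer outputs here), and simply scores each layer by cosine similarity between the fluent and disfluent variants.

```python
import numpy as np

def cosine(u, v):
    # Standard cosine similarity between two vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def layerwise_similarity(fluent_layers, disfluent_layers):
    """One similarity score per encoder layer for a fluent/disfluent
    sentence pair; embeddings are assumed to be precomputed."""
    return [cosine(f, d) for f, d in zip(fluent_layers, disfluent_layers)]

# Toy stand-ins for per-layer sentence embeddings
# (e.g. 12 BERT layers x 768 dimensions).
rng = np.random.default_rng(0)
base = rng.normal(size=(12, 768))
noise = rng.normal(size=(12, 768))

# Simulate the reported trend: the disfluent representation drifts less
# from the fluent one in deeper layers (noise shrinks with depth).
scales = np.linspace(1.0, 0.1, 12)[:, None]
fluent = base
disfluent = base + scales * noise

sims = layerwise_similarity(fluent, disfluent)
```

With real models, `fluent_layers` and `disfluent_layers` would come from a BERT encoder's hidden states for the two sentences; the finding quoted above corresponds to `sims` increasing toward the deeper layers.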

Model agnostic meta-learning on trees

no code implementations • 1 Jan 2021 • Jezabel Garcia, Federica Freddi, Jamie McGowan, Tim Nieradzik, Da-Shan Shiu, Ye Tian, Alberto Bernacchia

In meta-learning, knowledge learned from previous tasks is transferred to new ones. This transfer works only when the tasks are related; sharing information between unrelated tasks can hurt performance.

Meta-Learning
