no code implementations • IJCNLP 2019 • Shashank Srivastava, Igor Labutov, Tom Mitchell
Natural language has recently been explored as a new medium of supervision for training machine learning models.
1 code implementation • IJCNLP 2019 • Zhichu Lu, Forough Arabshahi, Igor Labutov, Tom Mitchell
In this paper, we propose a semantic parser that generalizes to out-of-domain examples by learning a general parsing strategy: adapting the logical forms of seen utterances to parse an unseen utterance, rather than generating a logical form from scratch.
no code implementations • ACL 2018 • Igor Labutov, Bishan Yang, Anusha Prakash, Amos Azaria
Question Answering (QA), as a research field, has primarily focused on either knowledge bases (KBs) or free text as a source of knowledge.
no code implementations • EMNLP 2018 • Igor Labutov, Bishan Yang, Tom Mitchell
As humans, we often rely on language to learn language.
no code implementations • EMNLP 2018 • Igor Labutov, Shashank Srivastava, Tom Mitchell
We present LIA, an intelligent personal assistant that can be programmed using natural language.
no code implementations • ACL 2018 • Shashank Srivastava, Igor Labutov, Tom Mitchell
Humans can efficiently learn new concepts using language.
no code implementations • EMNLP 2017 • Shashank Srivastava, Igor Labutov, Tom Mitchell
Natural language constitutes a predominant medium for much of human learning and pedagogy.
1 code implementation • 23 Feb 2016 • Siddharth Reddy, Igor Labutov, Siddhartha Banerjee, Thorsten Joachims
We use a model of human memory to develop a stochastic model of spaced repetition systems.
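As a rough illustration of the kind of memory model spaced-repetition systems build on, here is a minimal sketch of an exponential forgetting curve with multiplicative strengthening on review. The functional form and parameter names are illustrative assumptions, not the paper's actual model:

```python
import math

def recall_probability(time_since_review, memory_strength):
    """Exponential forgetting curve: recall probability decays with
    elapsed time, more slowly for stronger memories. Illustrative only."""
    return math.exp(-time_since_review / memory_strength)

def review(memory_strength, boost=2.0):
    """A simplified, deterministic spacing effect: each review
    multiplicatively strengthens the memory (boost is hypothetical)."""
    return memory_strength * boost

# A spaced schedule: review at growing intervals as the memory strengthens.
strength = 1.0
for days_elapsed in [1, 2, 4]:
    p = recall_probability(days_elapsed, strength)
    strength = review(strength)
```

Under this toy dynamic, equal recall probability is maintained by doubling the interval between reviews, which is why practical schedules space reviews at increasing gaps.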
no code implementations • 23 Feb 2016 • Siddharth Reddy, Igor Labutov, Thorsten Joachims
In this paper, we present the Latent Skill Embedding (LSE), a probabilistic model of students and educational content that can be used to recommend personalized sequences of lessons with the goal of helping students prepare for specific assessments.
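To convey the idea of embedding students and content in a shared latent skill space, here is a toy sketch. The logistic pass model, the min-over-skills bottleneck, and the additive lesson gains are all illustrative assumptions, not the LSE's actual probabilistic formulation:

```python
import math

def pass_probability(student_skills, assessment_reqs):
    """Toy latent-skill model: the student passes with probability given
    by a logistic of (skill - requirement), taking the weakest skill
    dimension as the bottleneck. Illustrative only."""
    margin = min(s - r for s, r in zip(student_skills, assessment_reqs))
    return 1.0 / (1.0 + math.exp(-margin))

def take_lesson(student_skills, lesson_gains):
    """A lesson moves the student through skill space by its gain vector."""
    return [s + g for s, g in zip(student_skills, lesson_gains)]

# Personalization: recommend the lesson that most raises the student's
# pass probability on a target assessment (names are hypothetical).
student = [0.2, 0.5]
assessment = [1.0, 0.5]
lessons = {"algebra": [0.9, 0.0], "reading": [0.0, 0.9]}
best = max(lessons,
           key=lambda name: pass_probability(take_lesson(student, lessons[name]),
                                             assessment))
# Here "algebra" closes the student's largest skill gap, so it is chosen.
```

The design point this sketch captures is that placing students, lessons, and assessments in one latent space lets lesson sequences be chosen by simulating their effect on predicted assessment outcomes.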