Search Results for author: Ethan Prihar

Found 2 papers, 1 paper with code

MathBERT: A Pre-trained Language Model for General NLP Tasks in Mathematics Education

1 code implementation · 2 Jun 2021 · Jia Tracy Shen, Michiharu Yamashita, Ethan Prihar, Neil Heffernan, Xintao Wu, Ben Graff, Dongwon Lee

Due to the nature of mathematical texts, which often use domain specific vocabulary along with equations and math symbols, we posit that the development of a new BERT model for mathematics would be useful for many mathematical downstream tasks.

Tasks: Knowledge Tracing · Language Modelling · +2

Classifying Math KCs via Task-Adaptive Pre-Trained BERT

No code implementations · 24 May 2021 · Jia Tracy Shen, Michiharu Yamashita, Ethan Prihar, Neil Heffernan, Xintao Wu, Sean McGrew, Dongwon Lee

Educational content labeled with proper knowledge components (KCs) is particularly useful to teachers and content organizers.

Tasks: Math · Task 2
