Search Results for author: Joe Davison

Found 5 papers, 5 papers with code

Commonsense Knowledge Mining from Pretrained Models

1 code implementation • IJCNLP 2019 • Joshua Feldman, Joe Davison, Alexander M. Rush

Inferring commonsense knowledge is a key challenge in natural language processing, but due to the sparsity of training data, previous work has shown that supervised methods for commonsense knowledge mining underperform when evaluated on novel data.

Language Modelling
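
The paper's approach mines commonsense knowledge by scoring candidate triples with a pretrained language model rather than a supervised classifier. As a rough illustration of that idea (not the authors' exact templates or scoring rule), the sketch below computes a pseudo-log-likelihood for a triple rendered as a sentence using a BERT masked language model from the `transformers` library; the example sentences and the summed-mask scoring are assumptions.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def pseudo_log_likelihood(sentence: str) -> float:
    """Sum the log-probability of each token when it is masked in turn."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    with torch.no_grad():
        for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
            masked = ids.clone()
            masked[i] = tokenizer.mask_token_id
            logits = model(masked.unsqueeze(0)).logits[0, i]
            log_probs = torch.log_softmax(logits, dim=-1)
            total += log_probs[ids[i]].item()
    return total

# Candidate triples rendered as sentences (templates are an assumption);
# a higher score suggests a more plausible piece of commonsense knowledge.
print(pseudo_log_likelihood("A musician is capable of playing music."))
print(pseudo_log_likelihood("A musician is capable of eating a piano."))
```

Because the scorer is an off-the-shelf pretrained model, this kind of approach needs no task-specific training data, which is what makes it attractive when supervised mining methods fail to generalize to novel triples.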

Flexible and Scalable Deep Learning with MMLSpark

1 code implementation • 11 Apr 2018 • Mark Hamilton, Sudarshan Raghunathan, Akshaya Annavajhala, Danil Kirsanov, Eduardo de Leon, Eli Barzilay, Ilya Matiach, Joe Davison, Maureen Busch, Miruna Oprescu, Ratan Sur, Roope Astala, Tong Wen, ChangYoung Park

In this work we detail a novel open source library, called MMLSpark, that combines the flexible deep learning library Cognitive Toolkit with the distributed computing framework Apache Spark.

Distributed Computing
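
MMLSpark wraps deep-learning models as SparkML-style transformers so that scoring parallelizes across a cluster. The sketch below is not MMLSpark's API; it only illustrates the underlying pattern of embarrassingly parallel model scoring on Spark, using a plain PySpark pandas UDF with a placeholder linear function standing in for a real Cognitive Toolkit model.

```python
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf

spark = SparkSession.builder.appName("distributed-scoring").getOrCreate()

# Toy feature column; in practice this would be a large distributed dataset.
df = spark.createDataFrame([(0.1,), (0.7,), (1.3,)], ["x"])

@pandas_udf("double")
def score(x: pd.Series) -> pd.Series:
    # Placeholder scorer: a fixed linear function stands in for a deep model
    # that each executor would load once and apply to its partition of rows.
    return 2.0 * x + 1.0

df.withColumn("score", score("x")).show()
spark.stop()
```

The design point this illustrates is that once a model is expressed as a column-to-column transformation, Spark handles data distribution, scheduling, and fault tolerance, so the same scoring code scales from a laptop to a cluster.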
