| Training Techniques | AdaDelta |
| --- | --- |
| Architecture | Dropout, Linear Layer |
| Epochs | 500 |
A reimplementation of a deep BiLSTM sequence prediction model (Stanovsky et al., 2018).
Explore the live Open Information Extraction demo at AllenNLP.
```python
from allennlp_models.pretrained import load_predictor

predictor = load_predictor("structured-prediction-srl")
sentence = "John broke the window with a rock."
preds = predictor.predict(sentence)
print(preds["verbs"][0]["description"])
# prints:
# [ARG0: John] [V: broke] [ARG1: the window] [ARGM-MNR: with a rock] .
```
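Besides the formatted `description`, each extraction also carries a per-token BIO tag sequence aligned with the tokenized words. The helper below is a minimal sketch (not part of AllenNLP) showing how such a tag sequence can be decoded into labeled argument spans; the sample `words` and `tags` mirror the output shown above rather than a live model call.

```python
def tags_to_spans(words, tags):
    """Group a BIO tag sequence into (label, phrase) spans, skipping O tokens."""
    spans = []
    for word, tag in zip(words, tags):
        if tag == "O":
            continue
        prefix, label = tag.split("-", 1)  # e.g. "B-ARGM-MNR" -> ("B", "ARGM-MNR")
        if prefix == "B" or not spans or spans[-1][0] != label:
            spans.append((label, [word]))  # start a new span
        else:
            spans[-1][1].append(word)      # continue the current span
    return [(label, " ".join(tokens)) for label, tokens in spans]

# A prediction shaped like the example output above (hypothetical, hard-coded).
words = ["John", "broke", "the", "window", "with", "a", "rock", "."]
tags = ["B-ARG0", "B-V", "B-ARG1", "I-ARG1",
        "B-ARGM-MNR", "I-ARGM-MNR", "I-ARGM-MNR", "O"]
print(tags_to_spans(words, tags))
# [('ARG0', 'John'), ('V', 'broke'), ('ARG1', 'the window'), ('ARGM-MNR', 'with a rock')]
```

Decoding spans this way makes it easy to post-process extractions into (subject, relation, argument) tuples.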
You can also get predictions using the allennlp command-line interface:
```shell
echo '{"sentence": "John broke the window with a rock."}' | \
  allennlp predict https://storage.googleapis.com/allennlp-public-models/openie-model.2020.03.26.tar.gz -
```
To train this model, use the allennlp CLI tool with the configuration file srl.jsonnet:
```shell
allennlp train srl.jsonnet -s output_dir
```
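For orientation, an AllenNLP training config of this kind follows a standard shape: a dataset reader, data paths, a model definition, and a trainer. The fragment below is a hypothetical sketch of that structure only — the type names, paths, and encoder details are assumptions, and the actual srl.jsonnet shipped with allennlp-models is the authoritative version. The epoch count and optimizer match the metadata table above.

```jsonnet
// Hypothetical sketch of the config shape; see the real srl.jsonnet for details.
{
  "dataset_reader": { "type": "srl" },        // assumed reader type
  "train_data_path": "/path/to/train",        // placeholder paths
  "validation_data_path": "/path/to/dev",
  "model": {
    "type": "srl"                             // BiLSTM tagger, per the model description
  },
  "trainer": {
    "num_epochs": 500,                        // from the table above
    "optimizer": { "type": "adadelta" }       // from the table above
  }
}
```

The `-s output_dir` flag tells `allennlp train` where to serialize checkpoints, logs, and the final model archive.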
See the AllenNLP Training and prediction guide for more details.
```
@inproceedings{Stanovsky2018SupervisedOI,
  author    = {Gabriel Stanovsky and Julian Michael and Luke Zettlemoyer and Ido Dagan},
  booktitle = {NAACL-HLT},
  title     = {Supervised Open Information Extraction},
  year      = {2018}
}
```