Training Techniques | AdamW |
---|---|
Architecture | BERT, Dropout, Layer Normalization, Linear Layer, Tanh |
LR | 0.00005 |
An implementation of a BERT-based model (Shi et al., 2019) with some modifications: no additional parameters apart from a linear classification layer.
Explore a live Semantic Role Labeling demo at AllenNLP.
```python
from allennlp_models.pretrained import load_predictor

# Download and load the pretrained SRL BERT predictor
predictor = load_predictor("structured-prediction-srl-bert")

sentence = "John broke the window with a rock."
preds = predictor.predict(sentence)
print(preds["verbs"][0]["description"])
# prints:
# [ARG0: John] [V: broke] [ARG1: the window] [ARG2: with a rock] .
```
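Besides the human-readable `description`, each entry in `preds["verbs"]` carries per-token BIO tags aligned with `preds["words"]`. The sketch below shows one way to decode those tags back into bracketed spans yourself; the `preds` dict here is a hand-written example mirroring the predictor's output format for this sentence, not live model output.

```python
# Minimal sketch: turn per-token BIO tags into a "[LABEL: span]" string.
# The `preds` dict is a hand-written illustration of the output format
# (a "words" list plus one entry per verb with "tags"), not live output.

def describe(words, tags):
    """Join BIO-tagged tokens into bracketed labeled chunks."""
    out, chunk, label = [], [], None

    def flush():
        nonlocal chunk, label
        if chunk:
            out.append(f"[{label}: {' '.join(chunk)}]")
        chunk, label = [], None

    for word, tag in zip(words, tags):
        if tag.startswith("B-"):      # a new labeled span begins
            flush()
            label, chunk = tag[2:], [word]
        elif tag.startswith("I-"):    # continue the current span
            chunk.append(word)
        else:                         # "O": token outside any span
            flush()
            out.append(word)
    flush()
    return " ".join(out)

preds = {
    "words": ["John", "broke", "the", "window", "with", "a", "rock", "."],
    "verbs": [{"verb": "broke",
               "tags": ["B-ARG0", "B-V", "B-ARG1", "I-ARG1",
                        "B-ARG2", "I-ARG2", "I-ARG2", "O"]}],
}

print(describe(preds["words"], preds["verbs"][0]["tags"]))
# [ARG0: John] [V: broke] [ARG1: the window] [ARG2: with a rock] .
```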
You can also get predictions using the allennlp command-line interface:
```shell
echo '{"sentence": "John broke the window with a rock."}' | \
  allennlp predict https://storage.googleapis.com/allennlp-public-models/structured-prediction-srl-bert.2020.12.15.tar.gz -
```
To train this model you can use the allennlp CLI tool and the configuration file bert_base_srl.jsonnet:

```shell
allennlp train bert_base_srl.jsonnet -s output_dir
```
See the AllenNLP Training and prediction guide for more details.
```bibtex
@article{Shi2019SimpleBM,
  author  = {Peng Shi and Jimmy Lin},
  journal = {ArXiv},
  title   = {Simple BERT Models for Relation Extraction and Semantic Role Labeling},
  volume  = {abs/1904.05255},
  year    = {2019}
}
```
BENCHMARK | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK |
---|---|---|---|---|
OntoNotes | SRL BERT | F1 | 86.49 | #1 |
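The F1 reported above is span-level: a predicted argument counts as correct only if both its label and its exact token span match a gold argument. The sketch below illustrates that computation on hypothetical gold/predicted spans; it mirrors the idea behind the CoNLL evaluation but is not the official scoring script.

```python
# Illustrative span-level F1: spans are (label, start, end) tuples, and
# a prediction is correct only on an exact label-and-boundary match.
# This is a simplified sketch, not the official CoNLL evaluation script.

def span_f1(gold, pred):
    """Compute F1 over sets of (label, start, end) spans."""
    tp = len(gold & pred)                          # exact matches
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: one predicted span has the wrong right boundary.
gold = {("ARG0", 0, 0), ("V", 1, 1), ("ARG1", 2, 3), ("ARG2", 4, 6)}
pred = {("ARG0", 0, 0), ("V", 1, 1), ("ARG1", 2, 3), ("ARG2", 4, 5)}

print(round(span_f1(gold, pred), 2))
# 0.75
```

Note that a boundary error hurts twice, lowering both precision and recall, which is why exact-match span F1 is a strict metric.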