| Training Techniques | AdamW |
|---|---|
| Architecture | CRF, Dropout, Layer Normalization, Linear Layer, RoBERTa, Tanh |
| LR | 0.00001 |
Fine-grained NER model
```python
from allennlp_models.pretrained import load_predictor

predictor = load_predictor("tagging-fine-grained-transformer-crf-tagger")

sentence = "Jobs and Wozniak cofounded Apple in 1976."
preds = predictor.predict(sentence)
for word, tag in zip(preds["words"], preds["tags"]):
    print(word, tag)
# prints:
# Jobs U-PERSON
# and O
# Wozniak U-PERSON
# cofounded O
# Apple U-ORG
# in O
# 1976 U-DATE
# . O
```
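The model emits BIOUL tags: `U-` marks a single-token entity, while multi-token entities span `B-`, optional `I-`, and `L-` tags. A minimal sketch of collapsing such a tag sequence into entity spans (the helper `bioul_to_spans` is not part of AllenNLP; it assumes a well-formed tag sequence):

```python
def bioul_to_spans(words, tags):
    """Collapse a BIOUL tag sequence into (entity_text, label) pairs.

    Hypothetical helper, not an AllenNLP API. Assumes the sequence is
    well-formed: every B- is eventually closed by an L-, and O tokens
    carry no entity.
    """
    spans = []
    current, label = [], None  # tokens of the entity being built
    for word, tag in zip(words, tags):
        if tag.startswith("U-"):        # single-token entity
            spans.append((word, tag[2:]))
        elif tag.startswith("B-"):      # entity start
            current, label = [word], tag[2:]
        elif tag.startswith("I-"):      # entity continuation
            current.append(word)
        elif tag.startswith("L-"):      # entity end
            current.append(word)
            spans.append((" ".join(current), label))
            current, label = [], None
    return spans


# Applied to the example output above:
words = ["Jobs", "and", "Wozniak", "cofounded", "Apple", "in", "1976", "."]
tags = ["U-PERSON", "O", "U-PERSON", "O", "U-ORG", "O", "U-DATE", "O"]
print(bioul_to_spans(words, tags))
# [('Jobs', 'PERSON'), ('Wozniak', 'PERSON'), ('Apple', 'ORG'), ('1976', 'DATE')]
```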
You can also get predictions using the allennlp command-line interface:

```shell
echo '{"sentence": "Jobs and Wozniak cofounded Apple in 1976."}' | \
    allennlp predict https://storage.googleapis.com/allennlp-public-models/fgner-transformer.2021-02-11.tar.gz -
```
To train this model, use the allennlp CLI tool with the configuration file fgner_transformer.jsonnet:

```shell
allennlp train fgner_transformer.jsonnet -s output_dir
```
See the AllenNLP Training and prediction guide for more details.
| BENCHMARK | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK |
|---|---|---|---|---|
| Ontonotes v5 (English) | Fine Grained Named Entity Recognition with Transformer | F1 | 88 | #1 |