Token Classification

Tags: Transformers, PyTorch, TensorBoard, distilbert, Generated from Trainer, Eval Results (legacy)
Instructions for using autoevaluate/entity-extraction with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use autoevaluate/entity-extraction with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="autoevaluate/entity-extraction")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("autoevaluate/entity-extraction")
model = AutoModelForTokenClassification.from_pretrained("autoevaluate/entity-extraction")
```

- Notebooks
- Google Colab
- Kaggle
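As context for the pipeline snippet above: without aggregation, a token-classification pipeline returns one dict per token, with BIO-style labels that usually need to be merged into entity spans. Below is a minimal, plain-Python sketch of that grouping step; the `preds` list mimics the pipeline's output shape with illustrative values, not actual predictions from this model.

```python
# Illustrative per-token predictions in the shape a token-classification
# pipeline returns (word, BIO entity tag, character offsets, score).
# These values are made up for demonstration, not real model output.
preds = [
    {"word": "Hugging", "entity": "B-ORG", "start": 0, "end": 7, "score": 0.99},
    {"word": "Face", "entity": "I-ORG", "start": 8, "end": 12, "score": 0.98},
    {"word": "is", "entity": "O", "start": 13, "end": 15, "score": 0.99},
    {"word": "in", "entity": "O", "start": 16, "end": 18, "score": 0.99},
    {"word": "Paris", "entity": "B-LOC", "start": 19, "end": 24, "score": 0.97},
]

def group_entities(preds):
    """Merge B-/I- tagged tokens into entity spans of (label, start, end)."""
    spans = []
    for p in preds:
        tag = p["entity"]
        if tag.startswith("B-"):
            # A B- tag always opens a new entity span.
            spans.append({"label": tag[2:], "start": p["start"], "end": p["end"]})
        elif tag.startswith("I-") and spans and spans[-1]["label"] == tag[2:]:
            # An I- tag with a matching label extends the current span.
            spans[-1]["end"] = p["end"]
        # "O" tokens are outside any entity and are skipped.
    return spans

print(group_entities(preds))
# → [{'label': 'ORG', 'start': 0, 'end': 12}, {'label': 'LOC', 'start': 19, 'end': 24}]
```

In practice the Transformers pipeline can do this merging for you via its `aggregation_strategy` argument; the sketch just makes the underlying logic visible.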
Discussions

- #10 "Librarian Bot: Add base_model information to model" (opened over 2 years ago by librarian-bot)
- #9 "Adding `safetensors` variant of this model" (opened about 3 years ago by SFconvertbot)
- #8 "Add evaluation results on the autoevaluate--conll2003-sample config and test split of autoevaluate/conll2003-sample" (opened over 3 years ago by lewtun)
- #7 "Add evaluation results on the autoevaluate--conll2003-sample config and test split of autoevaluate/conll2003-sample" (opened over 3 years ago by lewtun)
- #6 "Add evaluation results on the conll2003 config of conll2003" (opened over 3 years ago by autoevaluator)
- #5 "Add evaluation results on the conll2003 config of conll2003" (opened almost 4 years ago by lewtun)
- #4 "Add evaluation results on conll2003" (opened almost 4 years ago by autoevaluator)
- #3 "Add evaluation results on conll2003" (opened almost 4 years ago by lewtun)