How to use Someman/bart-hindi with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: the "summarization" pipeline type is no longer supported in transformers v5.
# Either load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="Someman/bart-hindi")
```
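If you stay on transformers 4.x, the pipeline can be called directly on Hindi text. This is a sketch: the sample sentence and the length limits are illustrative values, not ones documented for this model.

```python
# Assumes transformers 4.x, where the "summarization" pipeline still exists.
from transformers import pipeline

pipe = pipeline("summarization", model="Someman/bart-hindi")

# Illustrative Hindi input; any article or paragraph works.
text = "नई दिल्ली में आज मौसम विभाग ने अगले सप्ताह भारी बारिश की चेतावनी जारी की है।"

# max_length/min_length are example generation limits, not tuned values.
result = pipe(text, max_length=64, min_length=8)
print(result[0]["summary_text"])
```

The pipeline returns a list with one dict per input, each holding the generated summary under the `summary_text` key.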
```python
# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Someman/bart-hindi")
model = AutoModelForSeq2SeqLM.from_pretrained("Someman/bart-hindi")
```

This model is a fine-tuned version of facebook/bart-base on the Someman/hindi-summarization dataset. It achieves the following results on the evaluation set:
More information needed
The following results were logged during training:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.6568 | 0.14 | 500 | 0.6501 |
| 0.682 | 0.29 | 1000 | 0.5757 |
| 0.5331 | 0.43 | 1500 | 0.5530 |
| 0.5612 | 0.58 | 2000 | 0.5311 |
| 0.5685 | 0.72 | 2500 | 0.5043 |
| 0.4993 | 0.87 | 3000 | 0.4985 |