Instructions for using bigscience/T0 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use bigscience/T0 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/T0")
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0")
```

- Notebooks
  - Google Colab
  - Kaggle
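The loading snippet above stops before inference. A minimal sketch of running a zero-shot prompt through the model follows; the example prompt is illustrative (not taken from the model card), and note that the T0 checkpoint is roughly 11B parameters, so downloading and loading it requires substantial disk space and memory.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the T0 tokenizer and model (large checkpoint: expect a long
# download and high memory use).
tokenizer = AutoTokenizer.from_pretrained("bigscience/T0")
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0")

# T0 is prompted in natural language; any task phrased as text can be
# tried zero-shot. This prompt is just an illustrative example.
prompt = (
    "Is this review positive or negative? "
    "Review: this is the best cast iron skillet you will ever buy"
)
inputs = tokenizer(prompt, return_tensors="pt")

# generate() returns token IDs; decode them back to a string.
outputs = model.generate(**inputs)
answer = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(answer)
```

Since T0 is a seq2seq (encoder-decoder) model, the prompt goes through the encoder and `generate()` produces the answer from the decoder, which is why `AutoModelForSeq2SeqLM` is used rather than a causal-LM class.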
Hosted inference API: 500 Internal Server Error returned
#4
by MarkDeSouza - opened
I'm writing to report an issue encountered while using the hosted inference API with both the T0 and T0pp models (on the provided example prompts).
Please let me know if I have to pay to use these models :)
Hi! The team is aware of this and is working hard to solve the hardware issue causing it. Sorry for the inconvenience 🤗