Instructions for using sulcan/CHATQCD with libraries and notebooks. Follow the links below to get started.
- Libraries
- PEFT
How to use sulcan/CHATQCD with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the 4-bit quantized Llama-3 base model, then apply the CHATQCD adapter.
base_model = AutoModelForCausalLM.from_pretrained("unsloth/llama-3-8b-Instruct-bnb-4bit")
model = PeftModel.from_pretrained(base_model, "sulcan/CHATQCD")
```

- Notebooks
- Google Colab
- Kaggle
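Since the adapter is fine-tuned from a Llama-3 Instruct base, prompts at inference time should follow the Llama-3 chat template (normally produced by `tokenizer.apply_chat_template`). As a sketch of what that template looks like, the helper below builds the prompt string by hand; `format_llama3_prompt` is a hypothetical illustration, not part of this repository:

```python
def format_llama3_prompt(system: str, user: str) -> str:
    """Build a Llama-3 Instruct chat prompt by hand (illustrative only).

    In practice, prefer tokenizer.apply_chat_template, which emits the
    same special tokens from a list of role/content messages.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

# Example: a QCD-flavoured question for the fine-tuned model.
prompt = format_llama3_prompt(
    "You are an assistant specialized in quantum chromodynamics.",
    "What is asymptotic freedom?",
)
```

The resulting string would then be tokenized and passed to `model.generate` on the PEFT-wrapped model loaded above.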
Model Card for sulcan/CHATQCD
https://github.com/sulcantonin/CHATQCD_ICHEP24
Model Details
Model Description
- Developed by: Antonin Sulc, Patrick L.S. Connor
- Model type: [More Information Needed]
- Language(s) (NLP): English
- Finetuned from model: unsloth/llama-3-8b-Instruct-bnb-4bit
Model Sources
Repository: https://github.com/sulcantonin/CHATQCD_ICHEP24
Paper: [TBD]
PEFT version: 0.12.0