Instructions for using Colby/apertus-8b-coding with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Colby/apertus-8b-coding with Transformers:
```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Colby/apertus-8b-coding", dtype="auto")
```
A fuller generation sketch follows the notebook links below.
- Notebooks
- Google Colab
- Kaggle
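The minimal loader above only instantiates the backbone. Below is a hedged sketch of chat-style generation, assuming the checkpoint exposes a causal-LM head (`AutoModelForCausalLM`) and a chat template consistent with the tokenizer configuration further down (left padding, `<|assistant_end|>` as the end-of-sequence token); exact prompt formatting may differ, so treat this as a starting point rather than the model's documented API.

```python
# Sketch: chat-style generation with Colby/apertus-8b-coding.
# Assumes the checkpoint has a causal-LM head and a chat template;
# verify both against the model card before relying on this.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Colby/apertus-8b-coding"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, dtype="auto", device_map="auto")

messages = [
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
# apply_chat_template formats the conversation with the tokenizer's template
# and returns input IDs ready for generate().
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```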
Tokenizer configuration for reference:

```json
{
  "add_prefix_space": false,
  "backend": "tokenizers",
  "bos_token": "<s>",
  "clean_up_tokenization_spaces": false,
  "eos_token": "<|assistant_end|>",
  "is_local": false,
  "local_files_only": false,
  "model_input_names": [
    "input_ids",
    "attention_mask"
  ],
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "<pad>",
  "padding_side": "left",
  "tokenizer_class": "TokenizersBackend",
  "unk_token": "<unk>"
}
```
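Because `padding_side` is `left` and a `<pad>` token is defined, batched prompts of different lengths can be tokenized directly for decoder-only generation. A minimal sketch, assuming only the configuration shown above:

```python
# Sketch: inspect the tokenizer settings listed above and pad a batch of prompts.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Colby/apertus-8b-coding")

# These should match the configuration shown above.
print(tokenizer.eos_token)          # expected: <|assistant_end|>
print(tokenizer.padding_side)       # expected: left
print(tokenizer.model_input_names)  # expected: ['input_ids', 'attention_mask']

# Left padding keeps prompts right-aligned, which is what decoder-only
# generation expects when batching prompts of different lengths.
batch = tokenizer(
    ["def fizzbuzz(n):", "Explain list comprehensions in one sentence."],
    padding=True,
    return_tensors="pt",
)
print(batch["input_ids"].shape, batch["attention_mask"].shape)
```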