Original model: Boreas-24B-v1.25-QHE by Naphula
Available ExLlamaV3 quantizations (release v0.0.18)
| Type | Size | Download |
|---|---|---|
| H8-4.0BPW | 13.16 GB | CLI one-liner / batch file |
| H8-6.0BPW | 18.72 GB | CLI one-liner / batch file |
| H8-8.0BPW | 24.27 GB | CLI one-liner / batch file |
Requirements: a Python installation with the huggingface-hub module to use the CLI; a sketch of a Python-based download is shown below.
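As a minimal sketch of that route, the snippet below uses huggingface_hub's `snapshot_download` to fetch one quantization. The repository name is taken from this page; the idea that each quantization lives on a branch named after its type (e.g. `H8-6.0BPW`) is an assumption and should be checked against the repository's actual branches.

```python
# Minimal sketch: download one quantization with huggingface_hub.
# Assumption: each quantization sits on a branch named after its type
# (e.g. "H8-6.0BPW"); verify the real branch names on the repository page.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="DeathGodlike/Boreas-24B-v1.25-QHE_EXL3",
    revision="H8-6.0BPW",  # assumed branch for the 6.0 bpw quant
    local_dir="Boreas-24B-v1.25-QHE_EXL3-6.0bpw",  # hypothetical target folder
)
print(f"Model files downloaded to: {local_path}")
```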
Licensing
License detected: unknown
The license for the provided quantized models is derived from the original model. For additional information, see the original model's page above or, if it becomes unavailable, the file and page backups below.
Backups
Original page
A side test, not part of the 1.1 -> 1.2 -> 1.3 pipeline.
Merged using the chiral_qhe method.
This one may be unstable at higher context but should yield unprecedented creativity.
Model tree for DeathGodlike/Boreas-24B-v1.25-QHE_EXL3
Base model: Naphula/Boreas-24B-v1.25-QHE