Original model: Mystic-Matron-12B by Vortex5


Available ExLlamaV3 quantizations (release v0.0.18)

Requirements: a Python installation with the huggingface-hub module to use the CLI.
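
As an example, a single quantization can be fetched with the huggingface_hub Python API. This is a minimal sketch only: the repository ID matches this page, but the branch name ("4.0bpw") and the local directory are assumptions, so replace them with the quantization branch you actually want.

# Minimal sketch: download one quantization branch of this repository.
# Assumptions: the revision "4.0bpw" and the local directory name are
# placeholders; use the branch of the quantization you want.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="DeathGodlike/Mystic-Matron-12B_EXL3",
    revision="4.0bpw",
    local_dir="Mystic-Matron-12B_EXL3-4.0bpw",
)
print(f"Model files downloaded to {local_path}")

The same download can also be done from the command line with the huggingface-cli tool that ships with the module.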

Licensing

License detected: unknown

The license for the provided quantized models is inherited from the original model. For additional information, see the original model's page above or, if it is unavailable, the file and page backups below.


Backups

Original files

Original page

Mystic-Matron-12B

Overview

Mystic-Matron-12B was created by merging Scarlet-Seraph-12B, Mahou-1.5-Mistral-Nemo-12B, and Chaos-Unknown-12B using a custom merge method. The merge configuration is shown below, followed by a sketch of how such a merge is typically run.

Merge configuration
models:
  - model: Vortex5/Scarlet-Seraph-12B
  - model: flammenai/Mahou-1.5-mistral-nemo-12B
  - model: Vortex5/Chaos-Unknown-12b
merge_method: cdrf
parameters:
  strength: 0.92
  route: 0.42
  tau: 3.5
  agree: 0.52
dtype: bfloat16
tokenizer:
  source: Vortex5/Scarlet-Seraph-12B
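
The block above follows the mergekit YAML configuration format, although cdrf does not appear to be a stock mergekit merge method, so reproducing the merge may require the author's own tooling or fork. Assuming a mergekit installation that recognizes the method and the configuration saved as merge-config.yaml (a hypothetical filename), a merge of this form is typically run as follows:

# Minimal sketch: run the merge via the mergekit-yaml command-line tool.
# Assumptions: mergekit is installed, it supports the "cdrf" method, and
# the YAML above has been saved as merge-config.yaml; the output directory
# name is arbitrary.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",
        "merge-config.yaml",       # the configuration shown above
        "./Mystic-Matron-12B",     # output directory for the merged model
    ],
    check=True,
)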
      

Intended Use

🌒 Storytelling
🎭 Roleplay
✨ Creative Writing