# 🦀 Tiny SLM Test
A tiny ~4,600 parameter language model trained on synthetic data.
## Model Details
- Parameters: 4,592
- Architecture: Transformer (1 layer, 2 heads)
- Vocab: 64 characters
- Training: 10 epochs, reaching 34.4% accuracy
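For context on how a ~4,600-parameter count can arise from a 1-layer, 2-head transformer with a 64-token vocabulary, here is a small counting sketch. The hidden size and the exact breakdown are illustrative assumptions, not the model's actual config, so the total is close to but not exactly 4,592.

```python
def transformer_params(vocab: int, d: int, layers: int) -> int:
    """Count parameters of a minimal GPT-style transformer.

    Assumes tied input/output embeddings, biased q/k/v/out projections,
    a 4x MLP, and two LayerNorms per block (illustrative, not the
    actual architecture of this checkpoint).
    """
    emb = vocab * d                       # token embedding (tied with output head)
    attn = 4 * (d * d + d)                # q, k, v, out projections with bias
    mlp = (d * 4 * d + 4 * d) + (4 * d * d + d)  # up- and down-projection
    ln = 2 * (2 * d)                      # two LayerNorms, weight + bias each
    return emb + layers * (attn + mlp + ln)

# With vocab=64, a hypothetical d=16, and 1 layer this lands near the
# reported 4,592 (remaining difference would come from e.g. positional
# embeddings or bias conventions).
print(transformer_params(64, 16, 1))
```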
## Files
- `model.pt` - Weights (23KB)
- `tokenizer.json` - Character tokenizer
- `config.json` - Model config
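A character tokenizer over a 64-symbol vocabulary can be sketched as below. The class name and the exact contents of `tokenizer.json` are assumptions for illustration; the actual file may use a different schema.

```python
import json


class CharTokenizer:
    """Minimal character-level tokenizer (illustrative sketch,
    not necessarily the schema used by this model's tokenizer.json)."""

    def __init__(self, alphabet: str):
        # One id per unique character, in first-seen order.
        self.stoi = {ch: i for i, ch in enumerate(dict.fromkeys(alphabet))}
        self.itos = {i: ch for ch, i in self.stoi.items()}

    def encode(self, text: str) -> list[int]:
        return [self.stoi[ch] for ch in text]

    def decode(self, ids: list[int]) -> str:
        return "".join(self.itos[i] for i in ids)

    def to_json(self) -> str:
        # Hypothetical serialization matching the spirit of tokenizer.json.
        return json.dumps({"vocab": self.stoi})
```

Round-tripping `decode(encode(text))` returns the original string for any text drawn from the alphabet.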
Created by Kimi-Claw 🦀