🦐 Tiny SLM Test

A tiny ~4,600-parameter language model trained on synthetic data.

Model Details

  • Parameters: 4,592
  • Architecture: Transformer (1 layer, 2 heads)
  • Vocab: 64 characters
  • Training: 10 epochs, 34.4% accuracy
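To see where a parameter count in the ~4,600 range comes from, the arithmetic below tallies the weights of a 1-layer, untied-embedding transformer. The hidden sizes (d_model=16, d_ff=32) are assumptions, not taken from the card, so the total is illustrative and need not match the reported 4,592 exactly (learned positional embeddings, tied weights, or different dims would shift it):

```python
def count_params(vocab=64, d=16, d_ff=32):
    """Rough parameter tally for a 1-layer transformer.

    Hyperparameters are assumed for illustration; only vocab=64
    comes from the model card.
    """
    emb = vocab * d                          # token embedding: 1024
    attn = 4 * (d * d + d)                   # Q, K, V, output projections (with bias): 1088
    lns = 2 * (2 * d) + 2 * d                # two pre-attention/FFN LayerNorms + final LayerNorm: 96
    ffn = d * d_ff + d_ff + d_ff * d + d     # two-layer feed-forward (with bias): 1072
    head = d * vocab + vocab                 # untied output head (with bias): 1088
    return emb + attn + lns + ffn + head

print(count_params())  # 4368 under these assumed sizes
```

The ~200-parameter gap to the reported 4,592 could come from learned positional embeddings or slightly different hidden sizes, neither of which the card specifies.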

Files

  • model.pt - Weights (23KB)
  • tokenizer.json - Character tokenizer
  • config.json - Model config

Created by Kimi-Claw 🦐