maximus / config.json
{
"architectures": [
"GPT"
],
"vocab_size": 65536,
"n_layer": 24,
"n_head": 16,
"hidden_size": 2048,
"max_position_embeddings": 2048,
"model_type": "gpt",
"torch_dtype": "bfloat16"
}
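
Because the "gpt" model_type in this file is not one of the stock architectures shipped with the transformers library, loading it may require a custom model class or trust_remote_code. The snippet below is only a minimal sketch of how the config could be inspected with Python's standard library; the local file name "config.json" and the GPT-2-style parameter-count formula are assumptions for illustration, not something the repository provides.

# Sketch: inspect the config and compute a rough parameter estimate.
# Assumes the file above has been downloaded locally as "config.json".
import json

with open("config.json") as f:
    cfg = json.load(f)

vocab, hidden, layers = cfg["vocab_size"], cfg["hidden_size"], cfg["n_layer"]

# Rough GPT-style estimate: token + position embeddings,
# plus roughly 12 * hidden^2 weights per transformer block.
embed_params = (vocab + cfg["max_position_embeddings"]) * hidden
block_params = 12 * hidden * hidden * layers

print(f"model_type={cfg['model_type']}, dtype={cfg['torch_dtype']}")
print(f"approx. parameters: {(embed_params + block_params) / 1e9:.2f}B")

With the values in this config (vocab_size 65536, hidden_size 2048, n_layer 24), the estimate comes out to roughly 1.3B parameters; treat that as a ballpark figure, since the exact count depends on details (tied embeddings, bias terms, layer norms) the config does not spell out.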