Djuunaa
djuna
AI & ML interests: None yet
Recent Activity
liked a model 2 days ago: OpenResearcher/OpenResearcher-30B-A3B
liked a model 2 days ago: Klingspor/StarPO-4B
reacted to danielhanchen's post with 🔥 10 days ago:
We collaborated with Hugging Face to enable you to train MoE models 12× faster with 35% less VRAM via our new Triton kernels (no accuracy loss).
Train gpt-oss locally on 12.8GB VRAM with our free notebooks: https://unsloth.ai/docs/new/faster-moe