How to use SkillForge45/CyberFuture-3 with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="SkillForge45/CyberFuture-3")
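Once the pipeline is created, generation is a single call. A minimal sketch (the prompt and generation parameters below are illustrative, not from the model card):

```python
from transformers import pipeline

# Create the pipeline as above, then generate a short continuation.
pipe = pipeline("text-generation", model="SkillForge45/CyberFuture-3")

# max_new_tokens and temperature are illustrative values.
outputs = pipe("Once upon a time,", max_new_tokens=64, do_sample=True, temperature=0.5)
print(outputs[0]["generated_text"])
```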
# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("SkillForge45/CyberFuture-3", dtype="auto")
How to use SkillForge45/CyberFuture-3 with vLLM:
# Install vLLM from pip:
pip install vllm
# Start the vLLM server:
vllm serve "SkillForge45/CyberFuture-3"
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "SkillForge45/CyberFuture-3",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
How to use SkillForge45/CyberFuture-3 with SGLang:
# Install SGLang from pip:
pip install sglang
# Start the SGLang server:
python3 -m sglang.launch_server \
--model-path "SkillForge45/CyberFuture-3" \
--host 0.0.0.0 \
--port 30000
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "SkillForge45/CyberFuture-3",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
# Or launch the SGLang server with Docker:
docker run --gpus all \
--shm-size 32g \
-p 30000:30000 \
-v ~/.cache/huggingface:/root/.cache/huggingface \
--env "HF_TOKEN=<secret>" \
--ipc=host \
lmsysorg/sglang:latest \
python3 -m sglang.launch_server \
--model-path "SkillForge45/CyberFuture-3" \
--host 0.0.0.0 \
--port 30000
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "SkillForge45/CyberFuture-3",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
How to use SkillForge45/CyberFuture-3 with Docker Model Runner:
docker model run hf.co/SkillForge45/CyberFuture-3
Updates:
- Web Search
git clone https://huggingface.co/SkillForge45/CyberFuture-3
pip install torch transformers datasets googlesearch-python pyttsx3 speechrecognition fastapi uvicorn
METHOD #1 (HTML web interface):
python app.py
METHOD #2 (Python):
from model import ChatBot
# Initialize the chatbot
bot = ChatBot()
# Optionally train the model (requires GPU for good performance)
bot.train(epochs=3)
# Chat with web search
response = bot.generate_response("What's the latest news about AI?", use_web=True)
print(response)
# Voice interaction (requires microphone)
bot.voice_interface.speak(response)
user_input = bot.voice_interface.listen()
METHOD #3 (Console):
curl -X POST "http://localhost:8000/chat/" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "prompt=What's the weather in London today?&use_web=true" # web search; if you don't need web search, use use_web=false
curl -X POST "http://localhost:8000/chat/" \
-F "audio_file=@your_recording.wav" \
-F "use_web=true" \
-F "use_voice=true"
METHOD #4 (Server): Start app.py and go to http://localhost:8000/chat
This model is licensed; see the LICENSE file for more information.
Base model
SkillForge45/CyberFuture-1