✏️ PENCILCLAW – Autonomous C++ Coding Agent (v1.0 Testing)
PENCILCLAW is a C++-based autonomous coding agent harness that uses your local Ollama instance to generate, manage, and execute C++ code. It features a persistent task system, Git integration, and a secure execution environment – all running offline with complete privacy.
Features
- Code Generation (`/CODE`) – Generate C++ code for any idea, automatically saved as a `.txt` file.
- Autonomous Tasks (`/TASK`) – Start a long-running coding goal; the agent continues working on it in the background via heartbeat.
- Task Management – View status (`/TASK_STATUS`) and stop tasks (`/STOP_TASK`).
- Code Execution (`/EXECUTE`) – Compile and run the last generated code block (with safety confirmation).
- Git Integration – Every saved file is automatically committed to a local Git repository inside `pencil_data/`.
- Heartbeat & Keep-Alive – Keeps the Ollama model loaded and continues active tasks periodically.
- Secure by Design – Command injection prevented, path sanitisation, explicit confirmation before running AI-generated code.
- Natural Language Interface – Commands like "write code for a fibonacci function" are understood.
Project Structure
```
/home/kali/pencilclaw/
├── pencilclaw.cpp        # Main program source
├── pencil_utils.hpp      # Workspace utilities
├── pencilclaw            # Compiled executable
└── pencil_data/          # Created automatically on first run
    ├── session.log       # Full interaction log
    ├── .git/             # Local Git repository (if initialised)
    ├── tasks/            # Autonomous task folders
    │   └── 20260309_123456_build_calculator/
    │       ├── description.txt
    │       ├── log.txt
    │       ├── iteration_1.txt
    │       └── ...
    └── [code files].txt  # Files saved via /CODE or natural language
```
Requirements
- Compiler with C++17 support (g++ 7+ or clang 5+)
- libcurl development libraries
- nlohmann/json (headerβonly JSON library)
- Ollama installed and running
- A model pulled in Ollama (default: `qwen2.5:0.5b` – configurable via environment variable `OLLAMA_MODEL`)
Note: PENCILCLAW uses POSIX system calls (fork, pipe, execvp). It runs on Linux, macOS, and Windows Subsystem for Linux (WSL).
Installation
1. Install System Dependencies
sudo apt update
sudo apt install -y build-essential libcurl4-openssl-dev
2. Install nlohmann/json
The library is header-only; simply download json.hpp and place it in your include path, or install via package manager:
sudo apt install -y nlohmann-json3-dev
3. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
ollama serve & # start the service
ollama pull qwen2.5:0.5b # or your preferred model
Set Model (Optional)
Override the default model by setting the environment variable:
export OLLAMA_MODEL="llama3.2:latest"
4. Change to the Project Directory
cd ~/pencilclaw/   # the directory containing the source files
5. Compile PENCILCLAW
g++ -std=c++17 -o pencilclaw pencilclaw.cpp -lcurl
If json.hpp is in a non-standard location, add the appropriate -I flag.
Usage
Start the program:
./pencilclaw
You will see the > prompt. Commands are case-sensitive and start with /. Any line not starting with / is treated as natural language and passed to Ollama.
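The prompt loop described above can be sketched roughly as follows. This is a hypothetical simplification, not the actual implementation in `pencilclaw.cpp`; the `classify` and `repl` names are invented for illustration:

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Classify one input line the way the prompt loop does: /EXIT quits,
// other lines starting with '/' are commands, everything else is
// natural language forwarded to Ollama.
std::string classify(const std::string& line) {
    if (line == "/EXIT") return "exit";
    if (!line.empty() && line[0] == '/') return "command";
    return "chat";
}

// Minimal prompt loop over arbitrary streams (the real program reads stdin
// and dispatches to the command handlers and the Ollama client).
void repl(std::istream& in, std::ostream& out) {
    std::string line;
    out << "> ";
    while (std::getline(in, line)) {
        std::string kind = classify(line);
        if (kind == "exit") break;
        out << (kind == "command" ? "[command] " : "[to Ollama] ")
            << line << "\n> ";
    }
}
```

Taking the streams as parameters rather than hard-coding `std::cin`/`std::cout` keeps the loop trivially testable.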
Available Commands
| Command | Description |
|---|---|
| `/HELP` | Show this help message. |
| `/CODE <idea>` | Generate C++ code for the given idea; saved as `<sanitized_idea>.txt`. |
| `/TASK <description>` | Start a new autonomous coding task (creates a timestamped folder). |
| `/TASK_STATUS` | Show the current active task, its folder, and iteration count. |
| `/STOP_TASK` | Clear the active task (does not delete existing task files). |
| `/EXECUTE` | Compile and run the first C++ code block from the last AI output. |
| `/FILES` | List all saved `.txt` files and task folders. |
| `/DEBUG` | Toggle verbose debug output (shows JSON requests/responses). |
| `/EXIT` | Quit the program. |
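The `<sanitized_idea>.txt` naming used by `/CODE` could be produced by a sanitiser along these lines. This is a hypothetical sketch; the actual rules in `pencilclaw.cpp` may differ:

```cpp
#include <string>

// Keep only filename-safe characters; replace everything else with
// underscores, so "build a calculator!" becomes "build_a_calculator_".
std::string sanitize_filename(const std::string& idea) {
    std::string out;
    for (char c : idea) {
        bool ok = (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z') ||
                  (c >= '0' && c <= '9') || c == '-' || c == '.';
        out += ok ? c : '_';
    }
    // Strip leading dots so the result cannot be ".." or a hidden file.
    while (!out.empty() && out[0] == '.') out.erase(0, 1);
    return out;
}
```

Because `/` is replaced and leading dots are stripped, an input like `../etc/passwd` cannot escape the workspace directory.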
Natural Language Examples
- write code for a fibonacci function
- start a task to build a calculator
- save it as mycode.txt (after code generation)
Git Integration
PENCILCLAW automatically initialises a Git repository inside pencil_data/ on first run. Every file saved via /CODE or task iteration is committed with a descriptive message. The repository is configured with a local identity (pencilclaw@local / PencilClaw) so commits work even without global Git configuration.
If you prefer not to use Git, simply remove the .git folder from pencil_data/ – PENCILCLAW will detect its absence and skip all Git operations.
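The opt-out described above (deleting `.git`) only requires a check like this before each Git operation. A sketch using `std::filesystem`; the real detection logic may differ:

```cpp
#include <filesystem>
#include <string>

// Skip all Git operations when the workspace has no .git directory.
bool git_enabled(const std::string& workspace) {
    return std::filesystem::is_directory(
        std::filesystem::path(workspace) / ".git");
}
```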
Security Notes
- Code execution is potentially dangerous. PENCILCLAW always shows the code and requires you to type `yes` before running it.
- Path traversal is prevented – filenames are sanitised, and all writes are confined to `pencil_data/`.
- No shell commands are used – all external commands (`git`, `g++`) are invoked via `fork` + `execvp` with argument vectors, eliminating command injection risks.
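The `fork` + `execvp` pattern mentioned above looks roughly like this. A simplified sketch: the real harness also captures the child's output via `pipe`, which is omitted here:

```cpp
#include <sys/wait.h>
#include <unistd.h>
#include <string>
#include <vector>

// Run a command with an argument vector: no shell is involved, so shell
// metacharacters in arguments are passed to the program verbatim.
int run_command(const std::vector<std::string>& args) {
    std::vector<char*> argv;
    for (const std::string& a : args)
        argv.push_back(const_cast<char*>(a.c_str()));
    argv.push_back(nullptr);        // execvp expects a null-terminated array

    pid_t pid = fork();
    if (pid < 0) return -1;         // fork failed
    if (pid == 0) {                 // child: replace image with the command
        execvp(argv[0], argv.data());
        _exit(127);                 // conventional "command not found" status
    }
    int status = 0;                 // parent: wait for the child to finish
    waitpid(pid, &status, 0);
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}
```

Even an argument like `"; rm -rf /"` is harmless here, because no shell ever parses the argument vector.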
Configuration
| Setting | Method |
|---|---|
| Ollama model | Environment variable OLLAMA_MODEL (default: qwen2.5:0.5b) |
| Workspace directory | Environment variable PENCIL_DATA (default: ./pencil_data/) |
| Heartbeat interval | Edit HEARTBEAT_INTERVAL in source (default 120 seconds) |
| Keep-alive interval | Edit KEEP_ALIVE_INTERVAL in source (default 120 seconds) |
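The two environment-variable settings can be read with a small helper like this. A sketch with the defaults as documented; the helper name `env_or` is invented for illustration:

```cpp
#include <cstdlib>
#include <string>

// Return the environment variable's value, or a fallback when unset.
std::string env_or(const char* name, const std::string& fallback) {
    const char* v = std::getenv(name);
    return v ? std::string(v) : fallback;
}

// Defaults as listed in the configuration table.
std::string ollama_model() { return env_or("OLLAMA_MODEL", "qwen2.5:0.5b"); }
std::string pencil_data()  { return env_or("PENCIL_DATA", "./pencil_data/"); }
```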
Troubleshooting
| Problem | Solution |
|---|---|
| `json.hpp: No such file or directory` | Install nlohmann/json or add the correct `-I` flag. |
| `curl failed: Couldn't connect to server` | Ensure Ollama is running (`ollama serve`) and the URL `http://localhost:11434` is accessible. |
| Model not found | Run ollama pull <model_name> (e.g., qwen2.5:0.5b). |
| Git commit fails | PENCILCLAW configures a local identity on first run, so this should not normally happen. If it does, run `git config` manually inside `pencil_data/`. |
| Compilation errors (C++17) | Use a compiler that supports -std=c++17 (g++ 7+ or clang 5+). |
License
This project is released under the MIT License. Built with C++ and Ollama.