Add Ollama MCP server so the container agent can call local models and optionally manage the Ollama model library.
This skill adds a stdio-based MCP server that exposes local Ollama models as tools for the container agent. Claude remains the orchestrator but can offload work to local models, and can optionally manage the model library directly.
Core tools (always available):
- ollama_list_models — list installed Ollama models with name, size, and family
- ollama_generate — send a prompt to a specified model and return the response

Management tools (opt-in via OLLAMA_ADMIN_TOOLS=true):
- ollama_pull_model — pull (download) a model from the Ollama registry
- ollama_delete_model — delete a locally installed model to free disk space
- ollama_show_model — show model details: modelfile, parameters, and architecture info
- ollama_list_running — list models currently loaded in memory with memory usage and processor type

Check if container/agent-runner/src/ollama-mcp-stdio.ts exists. If it does, skip to Phase 3 (Configure).
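A tool handler like ollama_generate is essentially a thin wrapper over Ollama's REST API (a non-streaming POST to /api/generate on the default port 11434). The sketch below illustrates that shape; the function and helper names are illustrative, not the skill's actual implementation:

```typescript
// Sketch of an ollama_generate-style handler, assuming Ollama's default
// REST endpoint at http://localhost:11434. Names here are illustrative.

const OLLAMA_URL = "http://localhost:11434";

// Build the non-streaming /api/generate request.
function buildGenerateRequest(model: string, prompt: string) {
  return {
    url: `${OLLAMA_URL}/api/generate`,
    body: { model, prompt, stream: false },
  };
}

// Forward the prompt to the local model and return its text response.
async function ollamaGenerate(model: string, prompt: string): Promise<string> {
  const { url, body } = buildGenerateRequest(model, prompt);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

The MCP server registers a handler like this for each tool, so Claude can delegate a prompt to a local model and receive plain text back.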
Verify Ollama is installed and running on the host:
ollama list
If Ollama is not installed, direct the user to https://ollama.com/download.
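The same reachability check can be done programmatically: Ollama's /api/tags endpoint returns the installed models, so a failed request means the daemon is not running. A minimal sketch, assuming the default port (the helper names are hypothetical):

```typescript
// Sketch: verify the Ollama daemon is reachable and list installed models
// via GET /api/tags. Assumes Ollama's default port 11434.

interface TagsResponse {
  models: { name: string; size: number }[];
}

// Render each model as "name (size in GB)".
function formatModelList(tags: TagsResponse): string[] {
  return tags.models.map((m) => `${m.name} (${(m.size / 1e9).toFixed(1)} GB)`);
}

async function checkOllama(base = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${base}/api/tags`); // throws if the daemon is down
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  return formatModelList((await res.json()) as TagsResponse);
}
```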
npx skills add qwibitai/nanoclaw --skill add-ollama-tool