Ollama Local
by Timverhoogt
v1.1.0
Manage and use local Ollama models. Use for model management (list/pull/remove), chat/completions, embeddings, and tool-use with local LLMs. Covers OpenClaw sub-agent integration and model selection guidance.
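For context, chat and embedding requests go to Ollama's local HTTP API (default port 11434). A minimal sketch, assuming the stock Ollama endpoints and a locally pulled model — the skill's own integration may wrap this differently:

```python
import json
from urllib import request

# Ollama's default local endpoint (assumption: default install, no auth)
OLLAMA_URL = "http://localhost:11434"

def chat_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for POST /api/chat."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def send(path: str, payload: dict) -> dict:
    """POST a JSON payload to the local Ollama server and parse the reply."""
    req = request.Request(
        OLLAMA_URL + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a running Ollama server with the model already pulled):
# reply = send("/api/chat", chat_payload("llama3.2", "Hello"))
# print(reply["message"]["content"])
```

Model names like `llama3.2` are illustrative; use whatever `ollama list` reports on your machine.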
3,677 downloads · 5 stars · 28 installs · 2 versions
Install with one click: get a managed OpenClaw server and install this skill from the ClawHost dashboard. No SSH, Docker, or manual configuration needed.