Offline Llama

by and-ray-m v1.0.0

Manage local Ollama models autonomously: health monitoring, automatic fallback, self-healing, and fully offline operation with no internet dependency.

342 downloads · 5 installs · 1 version
