Local-First LLM

by joelnishanth v1.0.0

Routes LLM requests to a local model (Ollama, LM Studio, llamafile) before falling back to cloud APIs. Tracks token savings and cost avoidance in a persisten...
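The local-first routing described above can be sketched as an ordered fallback chain: try each backend in priority order (local first, cloud last) and record tokens kept off the cloud whenever a local backend answers. This is a minimal illustration of the idea, not the skill's actual implementation; the `Router` class, backend callables, and whitespace token count are all assumptions for the example.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Router:
    """Try backends in order; count tokens served without the cloud.

    Hypothetical sketch -- the real skill's internals may differ.
    """
    backends: List[Tuple[str, Callable[[str], str]]]  # (name, call) pairs, local first
    tokens_saved: int = 0  # rough count of prompt tokens kept off cloud APIs

    def complete(self, prompt: str) -> Tuple[str, str]:
        for name, call in self.backends:
            try:
                result = call(prompt)
            except Exception:
                continue  # backend unreachable (e.g. Ollama not running); try the next one
            if name != "cloud":
                # crude token estimate: whitespace-split words
                self.tokens_saved += len(prompt.split())
            return name, result
        raise RuntimeError("no backend available")

# Stub backends standing in for Ollama / a cloud API:
def local_down(prompt: str) -> str:
    raise ConnectionError("local model not running")

def cloud_api(prompt: str) -> str:
    return "cloud: " + prompt

router = Router(backends=[("local", local_down), ("cloud", cloud_api)])
name, _ = router.complete("hello world")
# local backend failed, so the request fell through to the cloud
# and no savings were recorded: name == "cloud", tokens_saved == 0
```

When the local backend is up, the same chain answers locally and the savings counter grows, which is the data a persistent savings ledger would accumulate.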

153 Downloads · 1 Star · 1 Version
