ROCm vLLM Deployment
by alexhegit
v1.0.0
Production-ready vLLM deployment on AMD ROCm GPUs. Combines environment auto-check, model parameter detection, Docker Compose deployment, health verification...
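Since the skill centers on a Docker Compose deployment of vLLM on ROCm, here is a minimal sketch of what such a Compose file might look like. The image tag, model name, and port are illustrative assumptions, not taken from this skill; the `/dev/kfd` and `/dev/dri` device mounts and `video` group membership are the standard requirements for GPU access in ROCm containers.

```yaml
# Hypothetical docker-compose.yml sketch for serving a model with vLLM on ROCm.
# Image tag, model, and port are assumptions for illustration.
services:
  vllm:
    image: rocm/vllm:latest              # assumed AMD ROCm vLLM image tag
    command: >
      vllm serve meta-llama/Llama-3.1-8B-Instruct
      --host 0.0.0.0 --port 8000
    devices:
      - /dev/kfd                         # ROCm compute kernel driver interface
      - /dev/dri                         # GPU render nodes
    group_add:
      - video                            # GPU device access typically requires this group
    ipc: host                            # shared memory for multi-process inference
    volumes:
      - ~/.cache/huggingface:/root/.cache/huggingface  # reuse downloaded model weights
    ports:
      - "8000:8000"
```

Once the container is up, a basic health check can be done against vLLM's OpenAI-compatible server, e.g. `curl http://localhost:8000/health`.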
68 downloads · 1 version