A self-hosted, OpenAI-compatible proxy that routes across every credible free-tier LLM provider. Bring your own keys; we just point requests at whichever provider still has budget left.
Only providers that offer recurring free quotas, require no credit card, and expose a self-serve API make the list.
Models tagged :free (DeepSeek, Kimi, Qwen, Llama, Gemma, Nemotron, Tencent HY3, …) stay free perpetually, no card required.

Node 20+ is required. better-sqlite3 ships prebuilt binaries, so there is no compile step.
Sign up for free at each provider and paste your keys into the dashboard. No credit card is needed for any of them.
Set base_url to your local proxy. Use any OpenAI-compatible client — Cursor, the SDK, curl, anything.
```sh
# 1. clone, install, run
$ git clone https://github.com/tashfeenahmed/freellmapi.git
$ cd freellmapi
$ npm install && npm run dev

# 2. open http://localhost:3001 — paste keys in the UI

# 3. call it like OpenAI
$ curl http://localhost:3001/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"messages":[{"role":"user","content":"hi"}]}'
```
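The same call works from any HTTP client. A minimal sketch using Node 20's built-in fetch API, mirroring the curl request above (the URL and payload come from the quickstart; constructing the `Request` sends nothing over the network):

```javascript
// The same request as the curl call, built with Node 20's WHATWG fetch API.
// Constructing the Request object sends nothing; pass it to fetch(req)
// once the proxy is running on localhost:3001.
const req = new Request("http://localhost:3001/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ messages: [{ role: "user", content: "hi" }] }),
});

console.log(req.url, req.method); // http://localhost:3001/v1/chat/completions POST
```

With the proxy up, `const res = await fetch(req); const data = await res.json();` returns a standard OpenAI-shaped chat completion.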