LM Studio REST API
Build with LM Studio's local APIs and SDKs — TypeScript, Python, REST, and OpenAI‑compatible endpoints.
Super quick start

TypeScript (lmstudio-js)

npm install @lmstudio/sdk
import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();
const model = await client.llm.model("openai/gpt-oss-20b");
const result = await model.respond("Who are you, and what can you do?");

console.info(result.content);
Full docs: lmstudio-js, Source: GitHub
Python (lmstudio-python)

pip install lmstudio
import lmstudio as lms

with lms.Client() as client:
    model = client.llm.model("openai/gpt-oss-20b")
    result = model.respond("Who are you, and what can you do?")
    print(result)
Full docs: lmstudio-python, Source: GitHub
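lmstudio-python also exposes a top-level convenience API that manages a default client for you. Below is a minimal sketch of the same request written that way; the lms.llm() helper and the model identifier are assumed from the lmstudio-python docs, so check them against the SDK version you have installed.

import lmstudio as lms

# Convenience API: a default client is created behind the scenes
# (assumed per the lmstudio-python docs; verify against your SDK version).
model = lms.llm("openai/gpt-oss-20b")
result = model.respond("Who are you, and what can you do?")
print(result)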
HTTP (LM Studio REST API)

lms server start --port 1234
curl http://localhost:1234/api/v1/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LM_API_TOKEN" \
  -d '{
    "model": "openai/gpt-oss-20b",
    "input": "Who are you, and what can you do?"
  }'
Full docs: LM Studio REST API
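The same local server also exposes OpenAI-compatible endpoints under /v1 (for example /v1/chat/completions), so existing OpenAI client code can talk to it by swapping the base URL. Here is a minimal sketch using the official openai Python package; the model identifier matches the examples above, and the API key is a placeholder unless you have enabled token authentication on your server.

from openai import OpenAI

# Point the OpenAI client at the local LM Studio server.
# The key is a placeholder; supply your LM Studio API token instead if the
# server is configured to require one.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

completion = client.chat.completions.create(
    model="openai/gpt-oss-20b",  # same model as the examples above
    messages=[{"role": "user", "content": "Who are you, and what can you do?"}],
)
print(completion.choices[0].message.content)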
Helpful links

This page's source is available on GitHub.