
LM Studio Developer Docs

Build with LM Studio's local APIs and SDKs — TypeScript, Python, REST, and OpenAI‑compatible endpoints.

Get to know the stack

What you can build

  • Chat and text generation with streaming
  • Tool calling and local agents with MCP
  • Structured output (JSON schema)
  • Embeddings and tokenization
  • Model management (load, download, list)
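Several of these capabilities compose: for example, once you have vectors from an embedding model (via the SDKs' embedding APIs), similarity search reduces to cosine similarity. A minimal standard-library sketch:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Rank candidate texts by `cosine_similarity(query_vector, candidate_vector)` to build a simple local semantic search on top of the embeddings endpoint.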

Super quick start

TypeScript (lmstudio-js)

npm install @lmstudio/sdk

import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();
const model = await client.llm.model("openai/gpt-oss-20b");
const result = await model.respond("Who are you, and what can you do?");

console.info(result.content);

Full docs: lmstudio-js, Source: GitHub

Python (lmstudio-python)

pip install lmstudio

import lmstudio as lms

with lms.Client() as client:
    model = client.llm.model("openai/gpt-oss-20b")
    result = model.respond("Who are you, and what can you do?")
    print(result)

Full docs: lmstudio-python, Source: GitHub
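Structured output (listed above) works by sending a JSON schema alongside the prompt, so the model's reply is constrained to a fixed shape. A minimal sketch of such a schema and request payload, assuming an OpenAI-style `response_format` field; the exact field name and placement may differ per SDK and API version, so check the docs:

```python
import json

# A JSON schema constraining the model's reply to a fixed shape.
book_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "author": {"type": "string"},
        "year": {"type": "integer"},
    },
    "required": ["title", "author", "year"],
}

# The schema travels as plain JSON in the request body.
payload = json.dumps({
    "model": "openai/gpt-oss-20b",
    "input": "Tell me about a famous book.",
    # "response_format" here is illustrative; see the structured output docs
    "response_format": {"type": "json_schema", "json_schema": book_schema},
})
```

On success the model returns a JSON object you can `json.loads` and trust to contain the `required` keys.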

HTTP (LM Studio REST API)

lms server start --port 1234

curl http://localhost:1234/api/v1/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LM_API_TOKEN" \
  -d '{
    "model": "openai/gpt-oss-20b",
    "input": "Who are you, and what can you do?"
  }'

Full docs: LM Studio REST API
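The same REST call can be made from Python's standard library with no SDK installed. This sketch only constructs the request; `urlopen` sends it once the server started above is running:

```python
import json
import os
import urllib.request

def build_chat_request(prompt: str, model: str = "openai/gpt-oss-20b") -> urllib.request.Request:
    """Build (but do not send) a chat request for the LM Studio REST API."""
    body = json.dumps({"model": model, "input": prompt}).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:1234/api/v1/chat",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('LM_API_TOKEN', '')}",
        },
        method="POST",
    )

# To actually send it:
# response = urllib.request.urlopen(build_chat_request("Who are you?"))
# print(json.loads(response.read()))
```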
