LM Studio REST API
LM Studio offers a powerful REST API with first-class support for local inference and model management. In addition to our native API, we provide a full OpenAI compatibility mode.
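For example, in OpenAI compatibility mode a standard chat completions request can be sent to the local server. The sketch below builds such a request with only the standard library; the server address assumes LM Studio's default local port, and the model name is a placeholder.

```python
import json
import urllib.request

# Assumed default local server address; adjust host/port to match
# your LM Studio server settings.
req = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps({
        "model": "some-model",  # placeholder model identifier
        "messages": [{"role": "user", "content": "Hello!"}],
    }).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# With the server running, urllib.request.urlopen(req) would send it.
```

Any OpenAI client library can be pointed at the same base URL instead of hand-building requests.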
Previously, there was a v0 REST API. That API has since been deprecated in favor of the v1 REST API.
The v1 REST API includes enhanced features such as stateful chats, MCP support, streaming events for model loading and prompt processing, and per-request context length configuration (see the feature comparison below).

The following endpoints are available in LM Studio's v1 REST API.
| Endpoint | Method | Docs |
|---|---|---|
| `/api/v1/chat` | POST | Chat |
| `/api/v1/models` | GET | List Models |
| `/api/v1/models/load` | POST | Load |
| `/api/v1/models/download` | POST | Download |
| `/api/v1/models/download/status` | GET | Download Status |
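As a quick orientation to the endpoints above, the sketch below assembles the URL, HTTP method, and JSON body for a couple of calls. The base URL assumes LM Studio's default local port, and the chat body's field names are illustrative assumptions rather than the documented schema; see the linked endpoint docs for exact shapes.

```python
import json

# Assumed default local server address; adjust to your LM Studio settings.
BASE_URL = "http://localhost:1234"

def build_request(endpoint, method, body=None):
    """Assemble the URL, HTTP method, and encoded JSON body for a v1 call."""
    data = json.dumps(body).encode() if body is not None else None
    return BASE_URL + endpoint, method, data

# List models (GET, no body):
models_url, models_method, _ = build_request("/api/v1/models", "GET")

# Chat with a model (POST; field names here are assumptions):
chat_url, chat_method, chat_body = build_request(
    "/api/v1/chat",
    "POST",
    {"model": "some-model", "messages": [{"role": "user", "content": "Hi"}]},
)
```

The same pattern extends to the load and download endpoints, which also take POST bodies.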
The table below compares the features of LM Studio's `/api/v1/chat` endpoint with the OpenAI-compatible `/v1/responses` and `/v1/chat/completions` endpoints.

| Feature | `/api/v1/chat` | `/v1/responses` | `/v1/chat/completions` |
|---|---|---|---|
| Stateful chat | ✅ | ✅ | ❌ |
| Remote MCPs | ✅ | ✅ | ❌ |
| MCPs you have in LM Studio | ✅ | ✅ | ❌ |
| Custom tools | ❌ | ✅ | ✅ |
| Model load streaming events | ✅ | ❌ | ❌ |
| Prompt processing streaming events | ✅ | ❌ | ❌ |
| Specify context length in the request | ✅ | ❌ | ❌ |
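One feature unique to `/api/v1/chat` in the table above is specifying the context length in the request itself. A minimal sketch of such a request body follows; the `context_length` field name is an assumption for illustration, so check the Chat endpoint docs for the exact schema.

```python
import json

# Hypothetical /api/v1/chat body with a per-request context override.
body = {
    "model": "some-model",  # placeholder model identifier
    "messages": [{"role": "user", "content": "Summarize this file."}],
    "context_length": 8192,  # assumed field name for the context override
}
payload = json.dumps(body)
```

Setting the context length per request avoids reloading the model with different load-time settings just to change its context window.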
Please report bugs by opening an issue on GitHub.