LM Studio REST API

LM Studio's REST API for local inference and model management

LM Studio offers a powerful REST API with first-class support for local inference and model management. In addition to our native API, we provide full OpenAI compatibility mode (learn more).
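As a sketch of the OpenAI compatibility mode, the snippet below sends a chat completion request to a local LM Studio server using only Python's standard library. The port (1234) is LM Studio's default local server port; the model name is a placeholder you would replace with a model loaded on your machine.

```python
import json
import urllib.request

# LM Studio's default local server address for the OpenAI-compatible API.
BASE_URL = "http://localhost:1234/v1"

def chat_completion(model: str, messages: list) -> dict:
    """POST an OpenAI-style chat completion request to the local server."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    request = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Example usage (requires a running LM Studio server with a loaded model):
# reply = chat_completion("my-local-model", [{"role": "user", "content": "Hello!"}])
# print(reply["choices"][0]["message"]["content"])
```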

What's new

The previous v0 REST API has been deprecated in favor of the v1 REST API, which adds enhanced features such as stateful chat, MCP support, and streaming events (see the comparison below).

Supported endpoints

The following endpoints are available in LM Studio's v1 REST API.

| Endpoint | Method | Docs |
| --- | --- | --- |
| `/api/v1/chat` | POST | Chat |
| `/api/v1/models` | GET | List Models |
| `/api/v1/models/load` | POST | Load |
| `/api/v1/models/download` | POST | Download |
| `/api/v1/models/download/status` | GET | Download Status |
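As a minimal sketch of the model management endpoints, the helpers below list models and request a model load. The server address is LM Studio's assumed default, and the request-body field name for the load call (`model`) is an assumption; consult each endpoint's reference page for the exact schema.

```python
import json
import urllib.request

# Assumed default LM Studio local server address; adjust if yours differs.
BASE_URL = "http://localhost:1234"

def list_models() -> dict:
    """GET /api/v1/models -- list models known to the local LM Studio instance."""
    with urllib.request.urlopen(f"{BASE_URL}/api/v1/models") as response:
        return json.load(response)

def load_model(model_key: str) -> dict:
    """POST /api/v1/models/load -- ask the server to load a model into memory.

    The body field name ("model") is an assumed convention, not a confirmed
    schema; see the Load endpoint's reference page.
    """
    payload = json.dumps({"model": model_key}).encode("utf-8")
    request = urllib.request.Request(
        f"{BASE_URL}/api/v1/models/load",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Example usage (requires a running LM Studio server):
# print(list_models())
# load_model("my-local-model")
```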

Inference endpoint comparison

The native `/api/v1/chat` endpoint is compared against the OpenAI-compatible `/v1/responses` and `/v1/chat/completions` endpoints across the following features:

- Stateful chat
- Remote MCPs
- MCPs you have in LM Studio
- Custom tools
- Model load streaming events
- Prompt processing streaming events
- Specify context length in the request
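As an illustrative sketch, a minimal request to the native `/api/v1/chat` endpoint might be constructed as follows. The body field names shown (`model`, `messages`) follow common chat-API conventions and are assumptions, not a confirmed schema; see the Chat reference page for the actual request format.

```python
import json
import urllib.request

# Assumed default local server address; adjust the port if yours differs.
URL = "http://localhost:1234/api/v1/chat"

# Field names here are assumed conventions, not a confirmed schema --
# check the Chat endpoint's reference page for the exact body format.
body = {
    "model": "my-local-model",  # placeholder model identifier
    "messages": [{"role": "user", "content": "Hello!"}],
}

request = urllib.request.Request(
    URL,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Requires a running LM Studio server with the model available:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```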

Please report bugs by opening an issue on GitHub.
