List Models
List available models via the OpenAI-compatible endpoint.
GET /v1/models

Example request:

curl http://localhost:1234/v1/models
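The same request can be made from any HTTP client. Below is a minimal Python sketch, using only the standard library, that fetches the model list and prints each model's id. It assumes the server is reachable at the address shown in the curl example above and that the response follows the standard OpenAI list shape (an object with a data array); the exact fields LM Studio returns are not shown on this page.

# Minimal sketch: query the OpenAI-compatible models endpoint and print model IDs.
# Assumes the local server is running on the default port 1234 (as in the curl
# example above) and that the response uses the standard OpenAI list shape,
# e.g. {"object": "list", "data": [{"id": ...}, ...]}.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:1234/v1/models") as response:
    payload = json.load(response)

for model in payload.get("data", []):
    print(model["id"])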