🖥️ LM Studio – Local AI
Run agents on your own hardware using LM Studio's OpenAI-compatible API – the same
OpenAIClient you already know, just pointed at localhost.
LM Studio is a desktop application that lets you download, manage, and serve GGUF models
(Llama 3, Mistral, Phi-3, Qwen, Gemma, and more) through a local REST server on
http://localhost:1234. Because LM Studio exposes a fully OpenAI-compatible
/v1 endpoint, Agent Framework connects to it using the standard
OpenAIClient – no extra SDK required.
All Agent Framework features work identically against LM Studio: streaming, function tools, multi-turn conversations, structured output, and multi-agent orchestration. No data leaves your machine and no API key is needed.
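Because the server speaks the plain OpenAI wire format, you can sanity-check it with nothing but curl before writing any C#. A minimal sketch, assuming the server is running on the default port with a model loaded (the model name below is a placeholder; use whatever identifier your server panel shows):

```shell
# Send a standard OpenAI chat-completions request to the local server.
# No Authorization header is required.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama-3.2-3b-instruct",
        "messages": [{"role": "user", "content": "Say hello in one word."}]
      }'
```

The response is an ordinary chat.completion JSON object, which is exactly why the unmodified OpenAIClient works against it.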
Key Concepts
- OpenAI-compatible endpoint – LM Studio exposes http://localhost:1234/v1
- OpenAIClient – the standard OpenAI SDK client; redirect it via OpenAIClientOptions.Endpoint
- ApiKeyCredential – LM Studio doesn't require a real key; pass any non-empty string
- AsAIAgent() – wraps the chat client into a standard AIAgent
- LMS_ENDPOINT / LMS_MODEL – recommended environment variables for configuration
Prerequisites
Download and configure LM Studio before running any of the examples below.
# 1. Download LM Studio from https://lmstudio.ai and install it.
# 2. Open LM Studio → Search tab → download a model
# Recommended models: llama-3.2-3b-instruct, phi-3-mini, qwen2.5-7b-instruct
# 3. Go to Developer tab → enable "Start Server"
# The server starts at http://localhost:1234 by default.
# 4. Load a model in the "Local Server" tab → note the exact model identifier shown
# (e.g. "meta-llama-3.2-3b-instruct" or "lmstudio-community/...").
# 5. The server is now ready. No API key is required.
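Before moving on, you can confirm the server is reachable and see the exact identifiers of the models it serves; a quick check, assuming the default port:

```shell
# Lists the models currently available on the local server.
# The "id" fields in the response are the exact strings to use as LMS_MODEL.
curl http://localhost:1234/v1/models
```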
NuGet Packages
dotnet add package Microsoft.Agents.AI.OpenAI --prerelease
dotnet add package OpenAI
Microsoft.Agents.AI.OpenAI is the Agent Framework OpenAI integration (RC-1, pre-release).
The stable OpenAI package provides OpenAIClient and ApiKeyCredential.
No Azure SDK is needed; LM Studio is not Azure.
Environment Variables
# PowerShell
$env:LMS_ENDPOINT="http://localhost:1234/v1"
$env:LMS_MODEL="meta-llama-3.2-3b-instruct"
# Bash / macOS / Linux
export LMS_ENDPOINT="http://localhost:1234/v1"
export LMS_MODEL="meta-llama-3.2-3b-instruct"
The model identifier must exactly match what is shown in the LM Studio server panel
(e.g. meta-llama-3.2-3b-instruct). Copy it from the Developer tab.
Code Sample – Hello LM Studio
using Microsoft.Agents.AI;
using OpenAI;
using System.ClientModel;
// 1. Read LM Studio server settings from environment variables.
var endpoint = Environment.GetEnvironmentVariable("LMS_ENDPOINT")
?? throw new InvalidOperationException("LMS_ENDPOINT is not set.");
var modelName = Environment.GetEnvironmentVariable("LMS_MODEL")
?? throw new InvalidOperationException("LMS_MODEL is not set.");
// 2. Create a standard OpenAIClient pointed at the local LM Studio server.
// A real API key is not required; pass any non-empty placeholder string.
AIAgent agent = new OpenAIClient(
new ApiKeyCredential("lmstudio"),
new OpenAIClientOptions { Endpoint = new Uri(endpoint) })
.GetChatClient(modelName)
.AsAIAgent(
instructions: "You are a helpful assistant running locally via LM Studio.",
name: "LocalAgent");
// 3. Same RunAsync() API as any other AIAgent; the provider is transparent.
Console.WriteLine(await agent.RunAsync("What is the capital of Japan?"));
Code Sample – Streaming with LM Studio
using Microsoft.Agents.AI;
using OpenAI;
using System.ClientModel;
var endpoint = Environment.GetEnvironmentVariable("LMS_ENDPOINT")!;
var modelName = Environment.GetEnvironmentVariable("LMS_MODEL")!;
AIAgent agent = new OpenAIClient(
new ApiKeyCredential("lmstudio"),
new OpenAIClientOptions { Endpoint = new Uri(endpoint) })
.GetChatClient(modelName)
.AsAIAgent(instructions: "You are a helpful assistant.", name: "StreamAgent");
// RunStreamingAsync works identically to cloud providers; tokens arrive in real time.
Console.Write("Agent: ");
await foreach (var update in agent.RunStreamingAsync("Explain how a transformer neural network works."))
{
if (update.Text is not null)
{
Console.Write(update.Text);
}
}
Console.WriteLine();
Code Sample – Function Tools with LM Studio
Note: Function calling support depends on the loaded model. Models with
-instruct or -tool in the name are typically tool-capable.
Check the model card on Hugging Face for confirmation.
using System.ComponentModel;
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using OpenAI;
using System.ClientModel;
var endpoint = Environment.GetEnvironmentVariable("LMS_ENDPOINT")!;
var modelName = Environment.GetEnvironmentVariable("LMS_MODEL")!;
// Define tool functions; identical pattern to Azure OpenAI.
string GetWeather(
[Description("The city to look up weather for")] string city)
=> $"The weather in {city} is 18°C and mostly sunny.";
string GetStockPrice(
[Description("The ticker symbol, e.g. MSFT")] string ticker)
=> $"{ticker} is trading at $142.50 (+1.2% today).";
AIAgent agent = new OpenAIClient(
new ApiKeyCredential("lmstudio"),
new OpenAIClientOptions { Endpoint = new Uri(endpoint) })
.GetChatClient(modelName)
.AsAIAgent(
instructions: "You are a helpful assistant. Use tools to answer accurately.",
name: "ToolAgent",
tools: [
AIFunctionFactory.Create(GetWeather),
AIFunctionFactory.Create(GetStockPrice)
]);
Console.WriteLine(await agent.RunAsync(
"What's the weather in Berlin and what's the Microsoft stock price?"));
Code Sample – Multi-Turn with LM Studio
using Microsoft.Agents.AI;
using OpenAI;
using System.ClientModel;
var endpoint = Environment.GetEnvironmentVariable("LMS_ENDPOINT")!;
var modelName = Environment.GetEnvironmentVariable("LMS_MODEL")!;
AIAgent agent = new OpenAIClient(
new ApiKeyCredential("lmstudio"),
new OpenAIClientOptions { Endpoint = new Uri(endpoint) })
.GetChatClient(modelName)
.AsAIAgent(instructions: "You are a friendly cooking assistant.", name: "ChefAgent");
// AgentSession tracks conversation history across turns automatically.
await using AgentSession session = agent.CreateSession();
Console.WriteLine(await session.RunAsync("I want to make pasta tonight. What should I know?"));
Console.WriteLine(await session.RunAsync("What sauce would you suggest for a beginner?"));
Console.WriteLine(await session.RunAsync("How long should I cook it for?"));
Code Sample – Structured Output with LM Studio
using System.ComponentModel;
using Microsoft.Agents.AI;
using OpenAI;
using System.ClientModel;
var endpoint = Environment.GetEnvironmentVariable("LMS_ENDPOINT")!;
var modelName = Environment.GetEnvironmentVariable("LMS_MODEL")!;
AIAgent agent = new OpenAIClient(
new ApiKeyCredential("lmstudio"),
new OpenAIClientOptions { Endpoint = new Uri(endpoint) })
.GetChatClient(modelName)
.AsAIAgent(
instructions: "You are a literary expert. Respond only with valid JSON matching the schema.",
name: "BookAgent");
BookSummary book = await agent.RunAsync<BookSummary>(
"Summarise '1984' by George Orwell in structured form.");
Console.WriteLine($"Title: {book.Title} by {book.Author} ({book.Year})");
Console.WriteLine($"Genre: {book.Genre}");
Console.WriteLine($"Theme: {book.Theme}");
// Agent Framework auto-generates a JSON schema from this C# record.
// The declaration sits last because C# requires type declarations to
// come after all top-level statements.
record BookSummary(
[property: Description("The book title")] string Title,
[property: Description("The author's full name")] string Author,
[property: Description("Publication year as a 4-digit integer")] int Year,
[property: Description("A one-sentence summary of the book's main theme")] string Theme,
[property: Description("A genre label such as 'fiction', 'non-fiction', or 'sci-fi'")] string Genre);
Step-by-Step Explanation
1. Install and start LM Studio – Download from lmstudio.ai, load a model in the app, and click Start Server in the Developer tab. The server listens on port 1234 by default.
2. Note the model identifier – LM Studio shows the exact model string in the server panel (e.g. meta-llama-3.2-3b-instruct). Use this as your LMS_MODEL environment variable; the name must match exactly.
3. Create an OpenAIClient with a custom endpoint – Pass any non-empty placeholder string as the API key (LM Studio doesn't validate it) and set OpenAIClientOptions.Endpoint to http://localhost:1234/v1.
4. Call .GetChatClient(modelName).AsAIAgent() – From this point on, every Agent Framework feature works identically to cloud providers. The provider is completely transparent to your application code.
5. Switch providers without code changes – To move from LM Studio to Azure OpenAI, replace only the client construction block. All business logic using RunAsync(), RunStreamingAsync(), and AgentSession remains unchanged.
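To make the provider swap concrete, here is what the replacement construction block might look like for Azure OpenAI. This is a sketch, assuming the Azure.AI.OpenAI package and hypothetical AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_KEY / AZURE_OPENAI_DEPLOYMENT environment variables of your own choosing; everything after AsAIAgent() stays exactly as in the samples above.

```csharp
using Azure;                    // AzureKeyCredential
using Azure.AI.OpenAI;          // AzureOpenAIClient (dotnet add package Azure.AI.OpenAI)
using Microsoft.Agents.AI;

// Hypothetical environment variables for the Azure endpoint, key, and deployment.
var azEndpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!;
var azKey      = Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!;
var deployment = Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT")!;

// Only this construction block changes relative to the LM Studio samples;
// RunAsync(), RunStreamingAsync(), and AgentSession calls stay untouched.
AIAgent agent = new AzureOpenAIClient(
        new Uri(azEndpoint),
        new AzureKeyCredential(azKey))
    .GetChatClient(deployment)
    .AsAIAgent(
        instructions: "You are a helpful assistant.",
        name: "CloudAgent");
```

Note that Azure addresses models by deployment name rather than by model identifier, which is the only conceptual difference from the LM Studio setup.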
Next Steps
All Examples
- 🤖 Hello Agent
- 🔧 Function Tools
- 💬 Multi-Turn Conversations
- ⚡ Streaming Responses
- 📦 Structured Output
- 🔁 Sequential Workflows
- 🕸️ Multi-Agent Orchestration
- 🦙 Ollama – Local AI
- 🖥️ LM Studio – Local AI
- 🧠 Agent Memory
- 📚 RAG
- 🔌 MCP Tools
- 📊 OpenTelemetry
- 🎧 Customer Support Triage
- 🔬 Research Pipeline
- 🤝 Tools vs Sub-Agents
Concepts Used
📖 LM Studio Documentation