SpongeEngine.LMStudioSharp 0.3.8.5
LMStudioSharp
C# client for LM Studio.
Features
- Complete support for LM Studio's native API
- Text completion and chat completion
- Streaming support for both completion types
- Text embeddings generation
- Model information retrieval
- Comprehensive configuration options
- Built-in error handling and logging
- Cross-platform compatibility
- Full async/await support
Installation
Install via NuGet:
dotnet add package SpongeEngine.LMStudioSharp
Quick Start
using SpongeEngine.LMStudioSharp;
using SpongeEngine.LMStudioSharp.Models.Completion;
using SpongeEngine.LMStudioSharp.Models.Chat;
// Configure the client
var options = new LmStudioClientOptions
{
HttpClient = new HttpClient
{
BaseAddress = new Uri("http://localhost:1234")
}
};
// Create client instance
using var client = new LmStudioSharpClient(options);
// List available models
var models = await client.ListModelsAsync();
var modelId = models.Data[0].Id;
// Text completion
var completionRequest = new CompletionRequest
{
Model = modelId,
Prompt = "Write a short story about a robot:",
MaxTokens = 200,
Temperature = 0.7f,
TopP = 0.9f
};
var completionResponse = await client.CompleteAsync(completionRequest);
Console.WriteLine(completionResponse.Choices[0].GetText());
// Chat completion
var chatRequest = new ChatRequest
{
Model = modelId,
Messages = new List<ChatMessage>
{
new() { Role = "system", Content = "You are a helpful assistant." },
new() { Role = "user", Content = "Tell me a joke about programming." }
},
Temperature = 0.7f
};
var chatResponse = await client.ChatCompleteAsync(chatRequest);
Console.WriteLine(chatResponse.Choices[0].GetText());
// Stream completion
await foreach (var token in client.StreamCompletionAsync(completionRequest))
{
Console.Write(token);
}
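Chat responses can be streamed the same way as completions. A minimal sketch; the method name StreamChatAsync is an assumption mirroring StreamCompletionAsync, so verify it against the package's actual API surface:

```csharp
// Stream chat completion token by token.
// NOTE: StreamChatAsync is assumed here by analogy with StreamCompletionAsync;
// check the library's public API for the exact method name.
await foreach (var token in client.StreamChatAsync(chatRequest))
{
    Console.Write(token);
}
```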
Configuration Options
Client Options
var options = new LmStudioClientOptions
{
HttpClient = new HttpClient
{
BaseAddress = new Uri("http://localhost:1234")
}, // Configure HttpClient with base address
JsonSerializerOptions = new JsonSerializerOptions(), // Optional JSON options
Logger = loggerInstance // Optional ILogger instance
};
Completion Request Parameters
var request = new CompletionRequest
{
Model = "model-id",
Prompt = "Your prompt here",
MaxTokens = 200, // Maximum tokens to generate
Temperature = 0.7f, // Randomness (0.0-1.0)
TopP = 0.9f, // Nucleus sampling threshold
Stop = new[] { "\n" }, // Stop sequences
Stream = false // Set to true to enable streaming
};
Error Handling
try
{
var response = await client.CompleteAsync(request);
}
catch (LlmSharpException ex)
{
Console.WriteLine($"LM Studio error: {ex.Message}");
if (ex.StatusCode.HasValue)
{
Console.WriteLine($"Status code: {ex.StatusCode}");
}
Console.WriteLine($"Response content: {ex.ResponseContent}");
}
catch (Exception ex)
{
Console.WriteLine($"General error: {ex.Message}");
}
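Since the package already depends on Polly, transient failures (for example, the server briefly refusing connections) can be wrapped in a retry pipeline. A minimal sketch using Polly v8's resilience API; treating every LlmSharpException as retryable is a simplification for illustration:

```csharp
using Polly;
using Polly.Retry;

// Retry up to three times with exponential backoff on client errors.
var pipeline = new ResiliencePipelineBuilder()
    .AddRetry(new RetryStrategyOptions
    {
        ShouldHandle = new PredicateBuilder().Handle<LlmSharpException>(),
        MaxRetryAttempts = 3,
        Delay = TimeSpan.FromSeconds(1),
        BackoffType = DelayBackoffType.Exponential
    })
    .Build();

var response = await pipeline.ExecuteAsync(
    async ct => await client.CompleteAsync(request),
    CancellationToken.None);
```

In practice you would narrow ShouldHandle to status codes that are actually transient (e.g. 429 or 503) rather than retrying every error.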
Logging
The client supports Microsoft.Extensions.Logging:
var logger = LoggerFactory
.Create(builder => builder
.AddConsole()
.SetMinimumLevel(LogLevel.Debug))
.CreateLogger<LmStudioSharpClient>();
var options = new LmStudioClientOptions
{
HttpClient = new HttpClient
{
BaseAddress = new Uri("http://localhost:1234")
},
Logger = logger
};
var client = new LmStudioSharpClient(options);
JSON Serialization
Custom JSON options can be provided:
var jsonOptions = new JsonSerializerOptions
{
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
};
var options = new LmStudioClientOptions
{
HttpClient = new HttpClient
{
BaseAddress = new Uri("http://localhost:1234")
},
JsonSerializerOptions = jsonOptions
};
var client = new LmStudioSharpClient(options);
Testing
The library includes both unit and integration tests. Integration tests require a running LM Studio server.
To run the tests:
dotnet test
To configure the test environment:
# Set environment variables for testing
export LMSTUDIO_BASE_URL="http://localhost:1234"
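Test code can then read the base URL from the environment instead of hard-coding it. A minimal sketch; the fallback is LM Studio's default local port:

```csharp
// Read the server URL from the environment, falling back to LM Studio's default.
var baseUrl = Environment.GetEnvironmentVariable("LMSTUDIO_BASE_URL")
              ?? "http://localhost:1234";

var options = new LmStudioClientOptions
{
    HttpClient = new HttpClient { BaseAddress = new Uri(baseUrl) }
};
using var client = new LmStudioSharpClient(options);
```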
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Support
For issues and feature requests, please use the GitHub issues page.
Compatible target frameworks
Product | Versions |
---|---|
.NET | net6.0, net7.0, and net8.0 are compatible. net9.0 and the platform-specific target frameworks (android, browser, ios, maccatalyst, macos, tvos, windows) were computed. |
Dependencies
net6.0
- Microsoft.Extensions.Caching.Memory (>= 9.0.1)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.1)
- Polly (>= 8.5.1)
- SpongeEngine.SpongeLLM.Core (>= 0.0.3)
- System.Linq.Async (>= 6.0.1)
net7.0
- Microsoft.Extensions.Caching.Memory (>= 9.0.1)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.1)
- Polly (>= 8.5.1)
- SpongeEngine.SpongeLLM.Core (>= 0.0.3)
- System.Linq.Async (>= 6.0.1)
net8.0
- Microsoft.Extensions.Caching.Memory (>= 9.0.1)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.1)
- Polly (>= 8.5.1)
- SpongeEngine.SpongeLLM.Core (>= 0.0.3)
- System.Linq.Async (>= 6.0.1)
NuGet packages (1)
The following NuGet package depends on SpongeEngine.LMStudioSharp:
- SpongeEngine.SpongeLLM: Unified C# client for LLM providers.