LLMKit 2.1.0
.NET CLI
dotnet add package LLMKit --version 2.1.0
Package Manager
NuGet\Install-Package LLMKit -Version 2.1.0
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
PackageReference
<PackageReference Include="LLMKit" Version="2.1.0" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
Central Package Management (CPM)
<PackageVersion Include="LLMKit" Version="2.1.0" />
<PackageReference Include="LLMKit" />
For projects that use Central Package Management (CPM), add the PackageVersion node to the solution's Directory.Packages.props file and the version-less PackageReference node to each project file that needs the package.
Paket CLI
paket add LLMKit --version 2.1.0
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
Script & Interactive
#r "nuget: LLMKit, 2.1.0"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy it into the interactive tool or into the script's source code to reference the package.
Cake
#addin nuget:?package=LLMKit&version=2.1.0
#tool nuget:?package=LLMKit&version=2.1.0
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
LLMKit
LLMKit is a thread-safe .NET library that provides a unified interface for interacting with multiple Large Language Model (LLM) providers, including OpenAI, Gemini, and DeepSeek.
Features
- Unified interface for multiple LLM providers
- Thread-safe implementation
- Conversation management
- Fluent API for message building
- Configurable parameters (tokens, temperature, etc.)
- Comprehensive error handling
- Dependency injection support
- Cancellation token support
- Custom endpoint support for all providers
Installation
Install-Package LLMKit
Requirements
- .NET 8.0 or later
- Valid API keys for the LLM providers
Quick Start
using LLMKit;
using LLMKit.Providers;
// Using statement ensures proper disposal
using var client = new LLMClient(
new OpenAIProvider(apiKey: "your-api-key", model: "gpt-3.5-turbo")
);
string response = await client.GenerateTextAsync(
"You are a helpful assistant.",
"What is the capital of France?"
);
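Rather than hardcoding the key, you can read it from an environment variable. This is plain .NET, not an LLMKit API; the variable name OPENAI_API_KEY below is only an example:
// Hypothetical variation of the Quick Start above: the API key comes from the environment.
string apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
    ?? throw new InvalidOperationException("OPENAI_API_KEY is not set.");
using var client = new LLMClient(new OpenAIProvider(apiKey, "gpt-3.5-turbo"));
string answer = await client.GenerateTextAsync(
    "You are a helpful assistant.",
    "What is the capital of France?"
);
Console.WriteLine(answer);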
Usage Examples
Basic Conversation with Proper Disposal
using var client = new LLMClient(new OpenAIProvider("your-api-key", "gpt-3.5-turbo"));
var conversation = client.StartConversation();
await client.SendMessageAsync("Hello, how are you?");
await client.SendMessageAsync("What's the weather like?");
await client.SendMessageAsync("Tell me a joke");
string history = client.GetFormattedConversation();
// Client is automatically disposed here
Custom Parameters
var client = new LLMClient(
new OpenAIProvider("your-api-key", "gpt-3.5-turbo"),
defaultMaxTokens: 1000,
defaultTemperature: 0.7
);
Dependency Injection
services.AddSingleton<ILLMProvider>(sp =>
new OpenAIProvider(
apiKey: Configuration["OpenAI:ApiKey"],
model: Configuration["OpenAI:Model"]
)
);
services.AddSingleton<LLMClient>();
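With these registrations in place, LLMClient can be injected like any other service. A minimal sketch of a consuming class follows; the ChatService name and AskAsync method are illustrative, not part of LLMKit:
using System.Threading.Tasks;
using LLMKit;

public class ChatService
{
    private readonly LLMClient _client;

    // LLMClient is resolved from the container configured above.
    public ChatService(LLMClient client) => _client = client;

    public Task<string> AskAsync(string question) =>
        _client.GenerateTextAsync("You are a helpful assistant.", question);
}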
Error Handling
try
{
var response = await client.GenerateTextAsync(
"You are a helpful assistant.",
"What is the capital of France?"
);
}
catch (LLMException ex)
{
Console.WriteLine($"Error: {ex.Message}");
}
Cancellation
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
var response = await client.GenerateTextAsync(
"You are a helpful assistant.",
"What is the capital of France?",
cts.Token
);
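If the token fires before the request completes, the awaited call should surface cancellation in the usual .NET way; the sketch below assumes it throws OperationCanceledException (or its TaskCanceledException subclass):
try
{
    var response = await client.GenerateTextAsync(
        "You are a helpful assistant.",
        "What is the capital of France?",
        cts.Token
    );
}
catch (OperationCanceledException)
{
    // Assumption: cancellation propagates as the standard .NET exception.
    Console.WriteLine("The request timed out and was cancelled.");
}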
Supported Providers
OpenAI
var provider = new OpenAIProvider(
apiKey: "your-api-key",
model: "gpt-3.5-turbo"
);
Gemini
var provider = new GeminiProvider(
apiKey: "your-api-key",
model: "gemini-2.0-flash"
);
DeepSeek
var provider = new DeepSeekProvider(
apiKey: "your-api-key",
model: "deepseek-chat"
);
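Because every provider is passed to the same LLMClient, the calling code does not change when you switch backends. For example, the Quick Start call works unchanged with the Gemini provider shown above:
using var client = new LLMClient(new GeminiProvider(
    apiKey: "your-api-key",
    model: "gemini-2.0-flash"
));
string response = await client.GenerateTextAsync(
    "You are a helpful assistant.",
    "What is the capital of France?"
);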
Custom Endpoints
Each provider supports custom endpoints. If not provided, the library will use the default endpoint for that provider.
// OpenAI with custom endpoint
var client = new LLMClient(new OpenAIProvider(
apiKey: "your-api-key",
model: "gpt-3.5-turbo",
endpoint: new Uri("https://api.openai.com/v1/chat/completions")
));
// Gemini with custom endpoint
var client = new LLMClient(new GeminiProvider(
apiKey: "your-api-key",
model: "gemini-pro",
endpoint: new Uri("https://generativelanguage.googleapis.com/v1beta/models")
));
// DeepSeek with custom endpoint
var client = new LLMClient(new DeepSeekProvider(
apiKey: "your-api-key",
model: "deepseek-chat",
endpoint: new Uri("https://api.deepseek.com/v1/chat/completions")
));
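Since the endpoint is an ordinary Uri, the same mechanism can in principle point a provider at any server that speaks its wire format, such as a self-hosted OpenAI-compatible gateway. The URL below is purely illustrative and assumes such a server is running locally:
// Assumption: the target server implements the OpenAI chat completions API.
var client = new LLMClient(new OpenAIProvider(
    apiKey: "your-api-key",
    model: "gpt-3.5-turbo",
    endpoint: new Uri("http://localhost:8000/v1/chat/completions")
));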
License
MIT License. See LICENSE for details.
Support
For issues or questions, please open an issue in the GitHub repository.
Compatible and computed target frameworks
.NET: net8.0 is compatible. Additional computed target frameworks: net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows.
Included target framework (in package): net8.0
Dependencies (net8.0)
- System.Text.Json (>= 9.0.3)