OllamaSharp 4.0.0-preview.10
See the version list below for details.
.NET CLI: dotnet add package OllamaSharp --version 4.0.0-preview.10
Package Manager: NuGet\Install-Package OllamaSharp -Version 4.0.0-preview.10
PackageReference: <PackageReference Include="OllamaSharp" Version="4.0.0-preview.10" />
Central package management: <PackageVersion Include="OllamaSharp" Version="4.0.0-preview.10" /> in Directory.Packages.props, plus <PackageReference Include="OllamaSharp" /> in the project file
Paket CLI: paket add OllamaSharp --version 4.0.0-preview.10
Script & Interactive: #r "nuget: OllamaSharp, 4.0.0-preview.10"
Cake Addin: #addin nuget:?package=OllamaSharp&version=4.0.0-preview.10&prerelease
Cake Tool: #tool nuget:?package=OllamaSharp&version=4.0.0-preview.10&prerelease
OllamaSharp 🦙
OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.
✅ Supporting Microsoft.Extensions.AI and Microsoft Semantic Kernel
Features
- Ease of use: Interact with Ollama in just a few lines of code.
- API endpoint coverage: Support for all the Ollama API endpoints, including chats, embeddings, listing models, pulling and creating new models, and more.
- Real-time streaming: Stream responses directly to your application.
- Progress reporting: Get real-time progress feedback on tasks like model pulling.
- Support for vision models and tools (function calling).
Usage
OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.
The following list shows a few simple code examples.
ℹ Try the full-featured demo application that is included in this repository.
Initializing
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";
Listing all models that are available locally
var models = await ollama.ListLocalModelsAsync();
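The returned list contains one entry per locally installed model. A minimal sketch for printing them (property names such as Name and Size are assumed from the model listing response):

// print the name and size of every locally installed model
foreach (var model in models)
    Console.WriteLine($"{model.Name} ({model.Size} bytes)");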
Pulling a model and reporting progress
await foreach (var status in ollama.PullModelAsync("llama3.1:405b"))
Console.WriteLine($"{status.Percent}% {status.Status}");
Generating a completion directly into the console
await foreach (var stream in ollama.GenerateAsync("How are you today?"))
Console.Write(stream.Response);
Building interactive chats
var chat = new Chat(ollama);
while (true)
{
var message = Console.ReadLine();
await foreach (var answerToken in chat.SendAsync(message))
Console.Write(answerToken);
}
// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property
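Chatting about images works the same way. A minimal sketch, assuming a vision-capable model such as llava is installed and that this version exposes the SendAsync overload accepting image bytes:

// switch to a vision-capable model (assumption: llava is pulled locally)
ollama.SelectedModel = "llava";

// read the image and pass it alongside the prompt
var imageBytes = await File.ReadAllBytesAsync("image.jpg");

var visionChat = new Chat(ollama);
await foreach (var answerToken in visionChat.SendAsync("What do you see in this image?", new[] { imageBytes }))
    Console.Write(answerToken);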
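As noted above, OllamaSharp also plugs into Microsoft.Extensions.AI: the OllamaApiClient implements the IChatClient abstraction, so it can be handed to any component built on those interfaces. A minimal sketch, assuming the preview Microsoft.Extensions.AI.Abstractions API referenced by this package (method and type names changed in later previews, e.g. CompleteAsync later became GetResponseAsync):

using Microsoft.Extensions.AI;

// the OllamaApiClient can be used wherever an IChatClient is expected
IChatClient chatClient = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1:8b");

var completion = await chatClient.CompleteAsync("Why is the sky blue?");
Console.WriteLine(completion.Message.Text);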
Credits
The icon and name were reused from the amazing Ollama project.
I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost mili-tan, who always keeps OllamaSharp in sync with the Ollama API. ❤
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies (.NETStandard 2.0)
- Microsoft.Bcl.AsyncInterfaces (>= 9.0.0-rc.2.24473.5)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.0-preview.9.24507.7)
NuGet packages (19)
Showing the top 5 NuGet packages that depend on OllamaSharp:
- Microsoft.KernelMemory.AI.Ollama: Provide access to Ollama LLM models in Kernel Memory to generate embeddings and text
- Microsoft.SemanticKernel.Connectors.Ollama: Semantic Kernel connector for Ollama. Contains services for text generation, chat completion and text embeddings.
- CommunityToolkit.Aspire.Hosting.Ollama: An Aspire integration leveraging the Ollama container with support for downloading a model on startup.
- CommunityToolkit.Aspire.OllamaSharp: A .NET Aspire client integration for the OllamaSharp library.
- EnergyAssembly: Package Description
GitHub repositories (7)
Showing the top 7 popular GitHub repositories that depend on OllamaSharp:
- microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
- microsoft/kernel-memory: RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.
- dotnet/ai-samples
- CommunityToolkit/Aspire: A community project with additional components and extensions for .NET Aspire
- PowerShell/AIShell: An interactive shell to work with AI-powered assistance providers
- davidfowl/aspire-ai-chat-demo: Aspire AI Chat is a full-stack chat sample that combines modern technologies to deliver a ChatGPT-like experience.
- lindexi/lindexi_gd: Code used in blog posts
Version | Downloads | Last Updated |
---|---|---|
5.2.6 | 449 | 7/4/2025 |
5.2.5 | 100 | 7/4/2025 |
5.2.4 | 96 | 7/4/2025 |
5.2.3 | 4,187 | 6/23/2025 |
5.2.2 | 11,032 | 5/30/2025 |
5.2.1 | 339 | 5/30/2025 |
5.1.20 | 361 | 5/30/2025 |
5.1.19 | 2,728 | 5/23/2025 |
5.1.18 | 7,643 | 5/19/2025 |
5.1.17 | 872 | 5/16/2025 |
5.1.16 | 1,162 | 5/13/2025 |
5.1.15 | 378 | 5/13/2025 |
5.1.14 | 11,935 | 5/6/2025 |
5.1.13 | 8,909 | 4/10/2025 |
5.1.12 | 97,575 | 4/10/2025 |
5.1.11 | 2,387 | 4/9/2025 |
5.1.10 | 1,703 | 4/7/2025 |
5.1.9 | 3,902 | 3/27/2025 |
5.1.8 | 214 | 3/27/2025 |
5.1.7 | 60,288 | 3/14/2025 |
5.1.6 | 200 | 3/14/2025 |
5.1.5 | 1,012 | 3/13/2025 |
5.1.4 | 2,783 | 3/5/2025 |
5.1.3 | 491 | 3/4/2025 |
5.1.2 | 21,954 | 2/24/2025 |
5.1.1 | 1,155 | 2/21/2025 |
5.1.0 | 202 | 2/21/2025 |
5.0.7 | 33,633 | 2/17/2025 |
5.0.6 | 27,709 | 2/3/2025 |
5.0.5 | 1,277 | 1/31/2025 |
5.0.4 | 3,963 | 1/27/2025 |
5.0.3 | 3,147 | 1/22/2025 |
5.0.2 | 8,525 | 1/15/2025 |
5.0.1 | 384 | 1/15/2025 |
4.0.22 | 9,513 | 1/10/2025 |
4.0.21 | 159 | 1/9/2025 |
4.0.20 | 398 | 1/8/2025 |
4.0.19 | 91 | 1/8/2025 |
4.0.18 | 455 | 1/8/2025 |
4.0.17 | 35,537 | 1/3/2025 |
4.0.16 | 131 | 1/3/2025 |
4.0.15 | 132 | 1/3/2025 |
4.0.14 | 138 | 1/3/2025 |
4.0.13 | 125 | 1/3/2025 |
4.0.12 | 125 | 1/3/2025 |
4.0.11 | 30,329 | 12/9/2024 |
4.0.10 | 122 | 12/9/2024 |
4.0.9 | 1,857 | 12/2/2024 |
4.0.8 | 34,632 | 11/22/2024 |
4.0.7 | 4,114 | 11/13/2024 |
4.0.6 | 13,376 | 11/7/2024 |
4.0.5 | 1,348 | 11/5/2024 |
4.0.4 | 600 | 11/4/2024 |
4.0.3 | 27,102 | 10/30/2024 |
4.0.2 | 301 | 10/29/2024 |
4.0.1 | 2,188 | 10/26/2024 |
4.0.0-preview.10 | 120 | 10/23/2024 |
4.0.0-preview.9 | 75 | 10/21/2024 |
4.0.0-preview.8 | 105 | 10/17/2024 |
3.0.15 | 5,012 | 10/21/2024 |
3.0.14 | 10,011 | 10/16/2024 |
3.0.13 | 136 | 10/16/2024 |
3.0.12 | 21,047 | 10/14/2024 |
3.0.11 | 1,491 | 10/9/2024 |
3.0.10 | 21,534 | 10/4/2024 |
3.0.9 | 150 | 10/4/2024 |
3.0.8 | 16,976 | 9/26/2024 |
3.0.7 | 34,056 | 9/12/2024 |
3.0.6 | 1,241 | 9/11/2024 |
3.0.5 | 828 | 9/11/2024 |
3.0.4 | 26,620 | 9/6/2024 |
3.0.3 | 151 | 9/6/2024 |
3.0.2 | 538 | 9/5/2024 |
3.0.1 | 22,488 | 9/2/2024 |
3.0.0 | 2,173 | 8/26/2024 |
2.1.3 | 2,733 | 8/23/2024 |
2.1.2 | 2,621 | 8/19/2024 |
2.1.1 | 4,390 | 8/5/2024 |
2.0.15 | 258 | 8/5/2024 |
2.0.14 | 142 | 8/3/2024 |
2.0.13 | 1,951 | 7/29/2024 |
2.0.12 | 154 | 7/28/2024 |
2.0.11 | 202 | 7/28/2024 |
2.0.10 | 4,954 | 7/12/2024 |
2.0.9 | 132 | 7/12/2024 |
2.0.8 | 171 | 7/12/2024 |
2.0.7 | 1,785 | 7/10/2024 |
2.0.6 | 3,163 | 6/25/2024 |
2.0.5 | 143 | 6/25/2024 |
2.0.4 | 608 | 6/24/2024 |
2.0.3 | 141 | 6/24/2024 |
2.0.2 | 200 | 6/24/2024 |
2.0.1 | 3,982 | 6/5/2024 |
1.1.13 | 531 | 6/5/2024 |
1.1.12 | 580 | 6/4/2024 |
1.1.11 | 336 | 6/2/2024 |
1.1.10 | 1,337 | 5/31/2024 |
1.1.9 | 7,291 | 5/15/2024 |
1.1.8 | 1,410 | 5/10/2024 |
1.1.7 | 157 | 5/10/2024 |
1.1.5 | 156 | 5/10/2024 |
1.1.4 | 288 | 5/10/2024 |
1.1.3 | 158 | 5/10/2024 |
1.1.2 | 156 | 5/10/2024 |
1.1.1 | 3,534 | 3/27/2024 |
1.1.0 | 2,208 | 1/8/2024 |
1.0.4 | 339 | 12/27/2023 |
1.0.3 | 416 | 11/30/2023 |
1.0.2 | 423 | 11/5/2023 |
1.0.1 | 252 | 10/16/2023 |
1.0.0 | 2,020 | 10/16/2023 |