# FieldCure.AssistStudio.Core

Platform-agnostic AI provider client library for .NET — Claude, OpenAI, Gemini, Ollama, Groq, and any OpenAI-compatible endpoint.
## Features
- **Multi-Provider** — Claude, OpenAI, Gemini, Ollama, and Groq built in. Implement `IAiProvider` to add your own.
- **Streaming** — Real-time structured event streaming via `IAsyncEnumerable<StreamEvent>`. The `StreamEvent` discriminated union covers text, thinking, tool calls, usage, and completion.
- **Extended Thinking** — Per-provider thinking/reasoning support. `ThinkingSupport` detection, `ThinkingOverride` (Auto / ForceOn / ForceOff), and `ThinkingBudget` on `AiRequest`.
- **Vision & Documents** — Attach images (PNG, JPG, WebP, GIF), PDFs, and DOCX files. `PdfCapability` (Auto / TextExtraction / NativePdf / PageAsImage) per provider.
- **Tool / Function Calling** — Define tools with `IAssistTool`. `ToolCallExecutor` orchestrates execution with an optional confirmation callback. `ToolResolver` merges built-in and MCP tools with name-conflict resolution.
- **MCP Tool Adapter** — `McpToolAdapter` bridges external MCP tools to the `IAssistTool` pipeline with zero MCP SDK dependency in Core.
- **Workspace Context** — `IWorkspaceContext` for dynamic system prompt injection based on host app state.
- **RAG Support** — `IContextProvider` retrieves relevant `ContextChunk`s for a query. Chunks are injected into the system prompt alongside workspace context.
- **Token Tracking** — `TokenUsage` (input/output counts) exposed after every request.
- **Structured Logging** — `DiagnosticLogger` with pluggable `OnException`, `OnWarning`, `OnInfo` callbacks.
- **Hardware Probing** — `HardwareProbe` for GPU/CPU detection to evaluate Ollama model compatibility.
- **Platform-Agnostic** — Targets `net8.0` with no Windows dependency. Use from console apps, servers, or any .NET project.
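As a sketch of the RAG hook, an `IContextProvider` implementation might look like the following. Only the interface name and the `ContextChunk` record (`Text`, `Source`, `Score`) are named in this README; the method name `RetrieveAsync` and its signature are assumptions for illustration.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch: the real IContextProvider member name and
// signature may differ from what is shown here.
public class DocsContextProvider : IContextProvider
{
    private readonly Dictionary<string, string> _docs = new()
    {
        ["deploy"]  = "Deployments run via GitHub Actions on push to main.",
        ["logging"] = "Use DiagnosticLogger callbacks for structured logs.",
    };

    // Assumed shape: return scored chunks relevant to the query.
    public Task<IReadOnlyList<ContextChunk>> RetrieveAsync(
        string query, CancellationToken ct = default)
    {
        var chunks = _docs
            .Where(kv => query.Contains(kv.Key, StringComparison.OrdinalIgnoreCase))
            .Select(kv => new ContextChunk(kv.Value, $"docs/{kv.Key}.md", 1.0))
            .ToList();
        return Task.FromResult<IReadOnlyList<ContextChunk>>(chunks);
    }
}
```

Whatever the exact signature, the retrieved chunks end up in the system prompt next to workspace context, so providers never need RAG awareness of their own.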
## Install

```shell
dotnet add package FieldCure.AssistStudio.Core
```
## Quick Start

```csharp
using FieldCure.AssistStudio.Models;
using FieldCure.AssistStudio.Providers;

// Create a provider
var provider = new ClaudeProvider(apiKey: "sk-ant-...", modelId: "claude-sonnet-4-20250514");

// Simple completion
var request = new AiRequest("What is the capital of France?");
var response = await provider.CompleteAsync(request);
Console.WriteLine(response.Content);

// Streaming with StreamEvent pattern matching
await foreach (var evt in provider.StreamAsync(request))
{
    switch (evt)
    {
        case StreamEvent.TextDelta delta:
            Console.Write(delta.Text);
            break;
        case StreamEvent.Usage usage:
            Console.WriteLine($"\nTokens: {usage.TokenUsage.TotalTokens}");
            break;
    }
}
```
## Extended Thinking

Models that support extended thinking (Claude, OpenAI o-series, Ollama `<think>` tags) can reason step-by-step before responding:

```csharp
var request = new AiRequest("Prove that √2 is irrational.")
{
    ThinkingEnabled = true,
    ThinkingBudget = 8192
};

await foreach (var evt in provider.StreamAsync(request))
{
    switch (evt)
    {
        case StreamEvent.ThinkingDelta t:
            Console.Write($"[think] {t.Text}");
            break;
        case StreamEvent.TextDelta d:
            Console.Write(d.Text);
            break;
    }
}
```
Use `IAiProvider.GetThinkingSupport(modelId)` to check per-model capability at runtime.
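For instance, a host app can enable thinking only where the model supports it. This sketch uses only members named in this README (`GetThinkingSupport`, `ThinkingSupport`, `ThinkingEnabled`, `ThinkingBudget`); the prompt and budget value are arbitrary.

```csharp
// Enable extended thinking only when the selected model supports it.
var support = provider.GetThinkingSupport(provider.ModelId);

var request = new AiRequest("Plan a migration from .NET 6 to .NET 8.")
{
    ThinkingEnabled = support != ThinkingSupport.NotSupported,
    ThinkingBudget = 4096
};
```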
## Tool Calling

Define tools by implementing `IAssistTool`, then pass them in `AiRequest.Tools`:
```csharp
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;

public class WeatherTool : IAssistTool
{
    public string Name => "get_weather";
    public string Description => "Get current weather for a city";
    public string ParameterSchema => """{"type":"object","properties":{"city":{"type":"string"}}}""";
    public bool RequiresConfirmation => false;

    public Task<string> ExecuteAsync(JsonElement parameters, CancellationToken ct = default)
    {
        var city = parameters.GetProperty("city").GetString();
        return Task.FromResult($"22°C, sunny in {city}");
    }
}
```
`ToolCallExecutor` handles the execution loop with optional user confirmation:
```csharp
var executor = new ToolCallExecutor([new WeatherTool()]);
executor.ConfirmationHandler = async (name, args) =>
{
    // Show UI confirmation — return true to allow
    return true;
};

var result = await executor.ExecuteAsync(toolCall);
```
`ToolResolver.Resolve()` merges built-in tools with MCP tools, prefixing MCP tool names with the server name on conflict.
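A merged tool list might be assembled like this. `Resolve()` is named in this README, but its exact parameter list, and the `mcpAdapter.GetTools()` call, are assumptions made for this sketch.

```csharp
// Illustrative only: the Resolve() parameter list and the adapter's
// GetTools() method are assumptions, not confirmed API.
var builtIn = new IAssistTool[] { new WeatherTool() };
var mcpTools = mcpAdapter.GetTools(); // hypothetical McpToolAdapter call

// On a name conflict, MCP tool names are prefixed with their server name.
IReadOnlyList<IAssistTool> tools = ToolResolver.Resolve(builtIn, mcpTools);

var request = new AiRequest("What's the weather in Paris?") { Tools = tools };
```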
## Workspace Context

Inject dynamic context into the system prompt based on app state:

```csharp
public class MyWorkspace : IWorkspaceContext
{
    private readonly string _currentFile = "MainWindow.xaml.cs"; // example host-app state

    public string? ActiveLabel => "Project: MyApp";

    public Task<string?> GetContextAsync(CancellationToken ct = default)
        => Task.FromResult<string?>($"Current file: {_currentFile}");
}
```
## Supported Providers

| Provider | Streaming | Vision | Documents | Tool Calling | Thinking |
|---|---|---|---|---|---|
| Claude (Anthropic) | Yes | Yes | Yes | Yes | Yes |
| OpenAI (+ compatible) | Yes | Yes | Yes | Yes | o-series |
| Gemini (Google) | Yes | Yes | Yes | Yes | No |
| Ollama (local) | Yes | Model-dependent | Model-dependent | Model-dependent | `<think>` tags |
| Groq | Yes | Yes | Yes | Yes | Model-dependent |
The OpenAI provider works with any OpenAI-compatible API (Groq, Azure OpenAI, etc.) by setting a custom `baseUrl`.
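As a sketch, pointing the OpenAI provider at Groq's OpenAI-compatible endpoint might look like this. The README only states that a custom `baseUrl` switches the endpoint; the other constructor parameter names mirror the `ClaudeProvider` example above but are assumptions, as is the model ID.

```csharp
// Sketch: parameter names other than baseUrl are assumed, not confirmed API.
var groq = new OpenAiProvider(
    apiKey: Environment.GetEnvironmentVariable("GROQ_API_KEY")!,
    modelId: "llama-3.3-70b-versatile",
    baseUrl: "https://api.groq.com/openai/v1");

var reply = await groq.CompleteAsync(new AiRequest("Say hello."));
```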
## Custom Provider

Implement `IAiProvider` to integrate any AI service:

```csharp
using System.Runtime.CompilerServices;

public class MyProvider : IAiProvider
{
    public string ProviderName => "MyService";
    public string ModelId => "my-model-v1";
    public TokenUsage? LastUsage { get; private set; }
    public bool IsTruncated { get; private set; }
    public string? LastRequestBody { get; private set; }
    public string? LastRawResponse { get; private set; }
    public PdfCapability PdfCapability => PdfCapability.TextExtraction;

    public Task<AiResponse> CompleteAsync(AiRequest request, CancellationToken ct = default)
    {
        // Call your API, return an AiResponse
        throw new NotImplementedException();
    }

    public async IAsyncEnumerable<StreamEvent> StreamAsync(
        AiRequest request,
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        yield return new StreamEvent.TextDelta("Hello from MyService!");
        yield return new StreamEvent.StreamCompleted(false);
    }

    public Task<IReadOnlyList<AiModel>> ListModelsAsync(CancellationToken ct = default)
        => Task.FromResult<IReadOnlyList<AiModel>>(
            [new AiModel("my-model-v1", "My Model", "myservice")]);

    public Task<ConnectionInfo> ValidateConnectionAsync(CancellationToken ct = default)
        => Task.FromResult(new ConnectionInfo(true, null, null, null));

    public ThinkingSupport GetThinkingSupport(string modelId)
        => ThinkingSupport.NotSupported;
}
```
## Key Types

| Type | Description |
|---|---|
| `IAiProvider` | Provider interface — completion, streaming, model listing, thinking support |
| `StreamEvent` | Discriminated union — `TextDelta`, `ThinkingDelta`, `ToolCallStart`, `ToolCallDelta`, `Usage`, `StreamCompleted` |
| `IAssistTool` | Tool/function calling interface with optional confirmation |
| `AiRequest` / `AiResponse` | Request (messages, system prompt, tools, thinking) and response models |
| `ChatMessage` | Conversation message with role, content, attachments, and tree branching (`ParentId`) |
| `ProviderPreset` | Saved provider configuration — model, temperature, thinking, PDF capability |
| `Profile` | System prompt + tool selection preset |
| `ToolCallExecutor` | Executes tool calls with confirmation handler |
| `ToolResolver` | Merges built-in and MCP tools with conflict resolution |
| `McpToolAdapter` | Bridges MCP tools to `IAssistTool` (zero MCP SDK dependency) |
| `IWorkspaceContext` | Dynamic system prompt injection interface |
| `IContextProvider` | RAG retrieval interface — returns `ContextChunk`s for a query |
| `ContextChunk` | Retrieved context record — `Text`, `Source`, `Score` |
| `ProviderFactory` | Creates an `IAiProvider` from a `ProviderPreset` |
| `DiagnosticLogger` | Structured logging with `OnException`/`OnWarning`/`OnInfo` callbacks |
| `ThinkingSupport` | Enum — `NotSupported`, `Optional`, `Required` |
| `ThinkingOverride` | Enum — `Auto`, `ForceOn`, `ForceOff` |
| `PdfCapability` | Enum — `Auto`, `TextExtraction`, `NativePdf`, `PageAsImage` |
| `HardwareProbe` | GPU/CPU detection for Ollama model compatibility |
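Tying a few of these together, constructing a provider from a saved preset might look like the following. The `ProviderPreset` property names and the `ProviderFactory.Create` method name are assumptions inferred from the descriptions above, not confirmed API.

```csharp
// Sketch: ProviderPreset members and the factory method name are assumed
// from the Key Types table (model, temperature, thinking, PDF capability).
var preset = new ProviderPreset
{
    Provider = "Claude",
    ModelId = "claude-sonnet-4-20250514",
    Temperature = 0.7,
    Thinking = ThinkingOverride.Auto,     // let the provider decide
    PdfCapability = PdfCapability.Auto    // auto-resolved per provider type
};

IAiProvider provider = ProviderFactory.Create(preset); // assumed factory method name
```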
## Diagnostic Logging

Wire up `DiagnosticLogger` to capture internal events from providers and helpers:

```csharp
DiagnosticLogger.OnException = ex  => logger.LogError(ex, "AssistStudio error");
DiagnosticLogger.OnWarning   = msg => logger.LogWarning(msg);
DiagnosticLogger.OnInfo      = msg => logger.LogInformation(msg);
```
## Related Packages
- FieldCure.AssistStudio.Controls.WinUI — WinUI 3 chat UI controls built on this library.
## License

MIT — Copyright (c) 2026 FieldCure Co., Ltd.
## Compatible Target Frameworks

| Product | Compatible and computed target frameworks |
|---|---|
| .NET | `net8.0` is compatible. `net9.0` and `net10.0`, plus all platform-specific TFMs (android, browser, ios, maccatalyst, macos, tvos, windows) for net8.0 through net10.0, were computed. |
### Dependencies (net8.0)

- Microsoft.Win32.Registry (>= 5.0.0)
- PdfPig (>= 0.1.13)
- PDFtoImage (>= 5.2.0)
- System.Management (>= 9.0.14)
### Dependent NuGet Packages

One NuGet package depends on FieldCure.AssistStudio.Core: **FieldCure.AssistStudio.Controls.WinUI** — AI chat UI controls for WinUI 3, supporting Claude, OpenAI, Gemini, Ollama, and more.
### Version History

| Version | Downloads | Last Updated |
|---|---|---|
| 0.19.2 | 80 | 5/5/2026 |
| 0.19.1 | 94 | 5/4/2026 |
| 0.19.0 | 79 | 5/4/2026 |
| 0.18.0 | 99 | 4/27/2026 |
| 0.17.0 | 97 | 4/21/2026 |
| 0.16.0 | 93 | 4/14/2026 |
| 0.15.0 | 104 | 4/10/2026 |
| 0.14.0 | 95 | 4/7/2026 |
| 0.13.0 | 102 | 3/31/2026 |
| 0.12.0 | 106 | 3/30/2026 |
| 0.11.0 | 101 | 3/29/2026 |
| 0.10.0 | 98 | 3/24/2026 |
| 0.9.0 | 101 | 3/24/2026 |
| 0.8.0 | 90 | 3/22/2026 |
| 0.7.0 | 96 | 3/21/2026 |
| 0.6.0 | 104 | 3/17/2026 |
| 0.5.0 | 103 | 3/17/2026 |
| 0.4.0 | 101 | 3/17/2026 |
# Release Notes — FieldCure.AssistStudio.Core
## [0.7.0] - 2026-03-21
### Added
- Extended thinking support with `ThinkingOverride` and per-provider `ThinkingCapability` architecture
- `StreamEvent` replaces raw `string` in `IAiProvider.StreamAsync` (`IAsyncEnumerable<StreamEvent>`)
- Streaming tool call accumulation via `StreamToolCallAccumulator`
- Ollama `<think>` tag parsing into `ThinkingDelta` stream events
- `UrlFetchTool` built-in tool for web page content extraction
- `McpServerConfig` and `McpToolAdapter` for MCP (Model Context Protocol) integration
- `search_tools` meta-tool support with dynamic MCP metadata injection
- `DiagnosticLogger` structured logging hooks (`OnInfo`, `OnWarning`, `OnException`)
- `ModelCompatibility` helper for provider-specific feature detection
- `ContextProvider` delegate and `WorkspaceContext` model for tool context injection
- `ConversationToolState` model for tracking tool state across conversation turns
### Changed
- **Breaking:** `IAiProvider.StreamAsync` returns `IAsyncEnumerable<StreamEvent>` instead of `IAsyncEnumerable<string>`
- Providers implement proper `IDisposable` pattern (CA1816)
- Ollama remote host UX improved with timeout, error messages, and custom URL support
- Conversation file extension renamed from `.astx` to `.astd` (AssistStudio Document)
### Removed
- `ConversationManager` and `AppJsonContext` moved to App layer (not part of NuGet package)
---
## [0.6.0] - 2026-03-17
### Added
- `AppJsonContext` and `IndentedJsonContext` source-generated JSON serializer contexts for trim-safe serialization
### Changed
- `ConversationManager` uses `IndentedJsonContext.Default.ConversationData` instead of reflection-based `JsonSerializerOptions`
---
## [0.5.0] - 2026-03-17
### Added
- Dedicated NuGet package README with Core-specific usage examples and API reference
### Fixed
- GitHub repository URL corrected (`fieldlab` → `fieldcure`)
---
## [0.4.0] - 2026-03-17
### Added
- NuGet package metadata (Company, Copyright, Icon, README, Repository URL, Tags)
- Release notes auto-inclusion in NuGet package
- `publish-nuget.ps1` script for pack → sign → push workflow
---
## [0.3.0] - 2026-03-17
### Added
- Generic file and command tools for agentic workflows (`ReadFileTool`, `WriteFileTool`, `RunCommandTool`, etc.)
- `broadFileSystemAccess` support for tool file operations
---
## [0.2.0] - 2026-03-16
### Added
- Tool calling support for `ClaudeProvider` and `GeminiProvider`
- `PdfCapability` enum with `Auto`, `TextExtraction`, `NativePdf`, `PageAsImage` options
- `PdfCapability` property on `ProviderPreset` for per-preset PDF handling
- `AttachmentProcessor.RenderPdfPages()` for PDF-to-image conversion (via PDFtoImage)
- `PageAsImage` PDF handling in `OpenAiProvider` and `OllamaProvider` for vision models
- `DisplayName` default interface member on `IAssistTool` for human-readable UI labels
- `ProviderFactory` auto-resolves `PdfCapability.Auto` based on provider type
### Fixed
- Gemini tool call ID uniqueness for parallel calls
---
## [0.1.0] - 2026-03-15
### Added
- `IAiProvider` abstraction with `StreamAsync` returning `IAsyncEnumerable<string>`
- Provider implementations: `ClaudeProvider`, `OpenAiProvider`, `GeminiProvider`, `OllamaProvider`
- SSE and NDJSON streaming support
- Model listing (`ListModelsAsync`) for all providers
- `OllamaManager` for local model management (pull, delete, search)
- `OllamaFitPolicy` for automatic model selection based on hardware
- Token tracking (`TokenUsage` model)
- Conversation persistence (`ConversationManager`)
- Hardware detection helpers (`HardwareInfo`)
- Image and document attachment models (`ChatAttachment`)
- Profile and ProviderPreset models
- Tool use abstractions (`IAssistTool`, `ToolCall`, `ToolResult`)