PromptStream.AI.Integration.TokenFlow 0.8.5

```shell
dotnet add package PromptStream.AI.Integration.TokenFlow --version 0.8.5
```
<p align="center"> <img src="https://github.com/AndrewClements84/PromptStream.AI/blob/master/assets/logo.png?raw=true" alt="PromptStream.AI" width="500"/> </p>
# 🧠 PromptStream.AI

## 🧩 Description

PromptStream.AI — Token-aware prompt composition, validation, and conversational context toolkit for .NET.

Built atop Flow.AI.Core and TokenFlow.AI, PromptStream.AI enables developers to compose, validate, generate, and manage multi-turn AI prompts with token budgeting, interpolation, and contextual memory.
## ⚙️ Key Features

- 🧩 Token-aware prompt builder with variable interpolation (`{{variable}}` syntax)
- ✅ Validation engine for token limits, structure, and completeness
- 💬 Shared Core Models from Flow.AI.Core (`PromptTemplate`, `PromptInstance`, `PromptMessage`, `PromptResponse`)
- 🧠 Context manager with replay, merge, summarization, and JSON persistence
- 💾 Persistent context storage (`ToJson`/`LoadFromJson`)
- 🧮 Token budgeting tools (`EstimateTokenUsage`, `TrimToTokenBudget`)
- ⚡ CLI utility (`PromptStream.AI.CLI`) for building, validating, analyzing, and generating prompts
- 🔌 Seamless integration with TokenFlow.AI for model-aware tokenization
## 🚀 Installation

```shell
dotnet add package PromptStream.AI
```

Requires:

- .NET 8.0 or higher
- Flow.AI.Core v0.2.0+
- (optional) TokenFlow.AI for advanced token metrics
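Equivalently, the package can be referenced directly in a project file. A sketch (versions taken from this page; the optional TokenFlow integration package is shown alongside and can be omitted):

```xml
<ItemGroup>
  <!-- Core PromptStream.AI package -->
  <PackageReference Include="PromptStream.AI" Version="0.8.5" />
  <!-- Optional: TokenFlow.AI integration for model-aware tokenization -->
  <PackageReference Include="PromptStream.AI.Integration.TokenFlow" Version="0.8.5" />
</ItemGroup>
```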
## 🧠 Quickstart Example

```csharp
using System;
using System.Collections.Generic;
using Flow.AI.Core.Models;
using Flow.AI.Core.Interfaces;
using TokenFlow.AI.Integration;
using PromptStream.AI.Services;

// Initialize the service with token tracking
var tokenProvider = new BasicTokenFlowProvider();
var modelClient = new TokenFlowModelClient("gpt-4o-mini");
var context = new PromptContextManager();
var service = new PromptStreamService(tokenProvider, context, modelClient);

// Define a shared Core template
var template = new PromptTemplate
{
    Id = "summarize-v1",
    Template = "Summarize the following:\n\n{{input}}\n\nBe concise.",
    RequiredVariables = new() { "input" }
};

// Variables to inject
var variables = new Dictionary<string, string>
{
    ["input"] = "Flow.AI enables composable AI workflows for .NET developers."
};

// Build and validate
var (instance, validation) = service.BuildAndValidate(template, variables);
if (validation.IsValid)
{
    Console.WriteLine($"✅ Valid prompt ({validation.TokenCount} tokens)");
    Console.WriteLine(instance.RenderedText);
}
else
{
    Console.WriteLine($"❌ Invalid: {string.Join(", ", validation.Errors)}");
}

// Add a user message to the context
context.AddMessage(new PromptMessage { Role = "user", Content = instance.RenderedText });
```
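The feature list also mentions JSON persistence via `ToJson`/`LoadFromJson`. A minimal sketch of how that might look, continuing from the quickstart above; the method names come from the feature list, but their exact signatures (string-returning / string-accepting) are assumptions, not confirmed API:

```csharp
using System.IO;

// Hedged sketch: persist the conversation context to disk...
var json = context.ToJson();
File.WriteAllText("context.json", json);

// ...and restore it later, e.g. in a new session.
var restored = new PromptContextManager();
restored.LoadFromJson(File.ReadAllText("context.json"));
```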
## 💻 CLI Usage (PromptStream.AI.CLI)

PromptStream.AI includes a full command-line interface for developers to build, validate, analyze, and generate prompts directly from the terminal.

### 🧩 Build a prompt

```shell
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- build --template "Hello {{name}}" --var name=Andrew
```

### ✅ Validate a prompt

```shell
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- validate --template "Summarize {{topic}}" --var topic="AI in .NET"
```

### 🤖 Generate a model response

```shell
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- generate --template "Explain {{concept}}" --var concept="tokenization" --save context.json
```

### 🧠 Manage conversation context

```shell
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- context --load context.json --summarize
```

### 📊 Analyze prompt usage

```shell
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- analyze --template "Summarize {{topic}}" --var topic="AI" --model gpt-4o-mini
```
Available commands:

| Command | Description |
|----------|--------------|
| `build` | Render a prompt with variable substitution |
| `validate` | Validate prompt completeness and token limits |
| `generate` | Build, validate, and produce a model-like response |
| `context` | Load, save, summarize, or clear conversation context |
| `analyze` | Estimate token usage and cost for prompts |
## 🌟 Supporting the Project

If you find PromptStream.AI helpful, please consider ⭐ starring the repository and ☕ supporting my work.
Your support helps keep the Flow.AI ecosystem growing.

*Part of the Flow.AI Ecosystem*

© 2025 Andrew Clements
## 📦 Compatibility and Dependencies

Target framework: net8.0 (net9.0 and net10.0 platform targets are computed as compatible).

Dependencies (net8.0):

- Flow.AI.Core (>= 0.2.0)
- PromptStream.AI (>= 0.8.5)
- TokenFlow.AI (>= 0.8.1)
- TokenFlow.Tokenizers (>= 0.8.1)