TokenFlow.Core 0.4.0

There is a newer version of this package available.
See the version list below for details.
dotnet add package TokenFlow.Core --version 0.4.0

NuGet\Install-Package TokenFlow.Core -Version 0.4.0
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

<PackageReference Include="TokenFlow.Core" Version="0.4.0" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Directory.Packages.props:
<PackageVersion Include="TokenFlow.Core" Version="0.4.0" />
Project file:
<PackageReference Include="TokenFlow.Core" />
For projects that support Central Package Management (CPM), copy this XML node into the solution Directory.Packages.props file to version the package.

paket add TokenFlow.Core --version 0.4.0

#r "nuget: TokenFlow.Core, 0.4.0"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

#:package TokenFlow.Core@0.4.0
The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

#addin nuget:?package=TokenFlow.Core&version=0.4.0
Install as a Cake Addin.

#tool nuget:?package=TokenFlow.Core&version=0.4.0
Install as a Cake Tool.

<p align="center"> <img src="https://github.com/AndrewClements84/TokenFlow.AI/blob/master/assets/logo.png?raw=true" alt="TokenFlow.AI" width="500"/> </p>

TokenFlow.AI

Badges: Build · Docs · codecov · NuGet Version · NuGet Downloads · License: MIT


💡 Overview

TokenFlow.AI is a lightweight .NET library for tokenization, chunking, and cost estimation across modern large language models (LLMs) such as OpenAI GPT‑4o, Anthropic Claude, and Azure OpenAI.

It provides accurate token counting, intelligent text splitting, cumulative usage tracking, and real‑time cost estimation for any AI‑driven application.
It also ships CLI utilities, a developer documentation site, and a performance benchmarking suite.


🧩 Key Features

  • 🔢 GPT‑style token counting for .NET
  • 🧱 Smart text chunking with configurable token limits and overlap
  • 💰 Real‑time cost estimation for prompt and completion usage
  • 🧮 TokenUsageTracker — track cumulative token and cost usage across analyses
  • 🧩 Unified TokenFlowClient — analyze, chunk, and cost in one API
  • ⚙️ CLI utilities (TokenFlow.Tools) — structured automation with --format, --input, and --output options
  • 📘 Developer documentation site — API reference + usage guides via GitHub Pages
  • 🧾 Benchmark suite powered by BenchmarkDotNet
  • 🔌 Pluggable tokenizer providers (OpenAI, Anthropic, Azure AI)
  • 📦 Zero external dependencies — small, fast, portable
  • 🧠 Dual targeting for .NET Standard 2.0 and .NET 8.0

🚀 Installation

dotnet add package TokenFlow.AI

Or install the shared core contracts:

dotnet add package TokenFlow.Core
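
TokenFlow.Core contains only the shared contracts and models (the roadmap below lists ITokenizer, ICostEstimator, ModelSpec, and TokenCountResult), which makes it the package to reference when plugging in your own tokenizer provider. As a purely illustrative sketch, assuming the tokenizer contract boils down to counting tokens for a piece of text (the interface and member names below are stand-ins, not the package's exact API):

using System;

// Stand-in for the shape of a TokenFlow.Core tokenizer contract; check the
// package's ITokenizer for the real members before implementing one.
public interface IWordCountingTokenizer
{
    int CountTokens(string text);
}

// Naive whitespace-based implementation: counts words as a rough token proxy.
public sealed class WhitespaceTokenizer : IWordCountingTokenizer
{
    public int CountTokens(string text) =>
        string.IsNullOrWhiteSpace(text)
            ? 0
            : text.Split((char[])null, StringSplitOptions.RemoveEmptyEntries).Length;
}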

🧠 Quick Examples

Token analysis and cost estimation:

using TokenFlow.AI.Client;

var client = new TokenFlowClient("gpt-4o");
var result = client.AnalyzeText("TokenFlow.AI brings structure to prompt engineering.");

Console.WriteLine($"Model: {result.ModelId}");
Console.WriteLine($"Tokens: {result.TokenCount}");
Console.WriteLine($"Estimated cost: £{result.EstimatedCost:F4}");

Chunking long text:

var chunks = client.ChunkText("This is a long body of text that exceeds a given token limit...", maxTokens: 50, overlapTokens: 5);

foreach (var chunk in chunks)
    Console.WriteLine($"Chunk: {chunk.Substring(0, Math.Min(40, chunk.Length))}...");

Tracking cumulative usage:

using TokenFlow.AI.Tracking;

var tracker = new TokenUsageTracker(client.GetModel());

tracker.Record(client.AnalyzeText("Hello TokenFlow.AI!"));
tracker.Record(client.AnalyzeText("Let's track token usage across sessions."));

var summary = tracker.GetSummary();

Console.WriteLine($"Analyses: {summary.AnalysisCount}");
Console.WriteLine($"Total Tokens: {summary.TotalTokens}");
Console.WriteLine($"Total Cost: £{summary.TotalCost:F4}");

Model Registry JSON Loading:

using TokenFlow.AI.Registry;

var registry = new ModelRegistry();
registry.LoadFromJsonString("[{ \"Id\": \"custom-model\", \"Family\": \"openai\", \"TokenizerName\": \"tiktoken\", \"MaxInputTokens\": 10000, \"MaxOutputTokens\": 2000, \"InputPricePer1K\": 0.01, \"OutputPricePer1K\": 0.02 }]");

var model = registry.TryGet("custom-model");
Console.WriteLine($"{model.Id}: {model.Family} — {model.MaxInputTokens} tokens");

Running via CLI (v2.1):

# Human-readable table output
dotnet run --project src/TokenFlow.Tools -- analyze "Hello TokenFlow!"

# JSON output
dotnet run --project src/TokenFlow.Tools -- analyze "Hello" --format json

# CSV output to file
dotnet run --project src/TokenFlow.Tools -- analyze "Hello" --format csv --output result.csv

# Quiet mode for CI/CD
dotnet run --project src/TokenFlow.Tools -- analyze "Quiet test" --format quiet

🧪 Running Tests

dotnet test --no-build --verbosity normal

All unit tests are written in xUnit and run automatically through GitHub Actions.
Code coverage is tracked with Codecov, and the project maintains 100% line and branch coverage.


🛠️ Roadmap

✅ Completed
  • Core interfaces and models (ITokenizer, ICostEstimator, ModelSpec, TokenCountResult)
  • Implemented ApproxTokenizer, CostEstimator, and ModelRegistry
  • Added TokenChunker with overlap support
  • Added TokenFlowClient — unified entry point for developers
  • Added TokenUsageTracker — cumulative cost and token tracking
  • Implemented ITokenizerFactory for dynamic tokenizer resolution
  • Added TokenFlow.Tools — developer CLI utilities
  • Added TokenFlow.Tools.Benchmarks — BenchmarkDotNet performance suite
  • Full xUnit test suite with 100% code coverage
  • CI/CD pipeline with Codecov and automated NuGet publishing
  • Dual targeting for .NET Standard 2.0 and .NET 8.0
  • Extended ModelRegistry to support JSON configuration loading
  • Developer documentation site (API + usage guides)
  • Expanded CLI commands and options (--format, --input, --output, --quiet, CSV + table formatting)
🌟 Future Goals
  • Advanced tokenizers (OpenAI tiktoken, Claude tokenizer)
  • Integration with other Flow.AI components once public
  • Performance regression tracking in CI
  • Public release under Flow.AI ecosystem

💬 Contributing

Pull requests are welcome!
If you’d like to contribute to TokenFlow.AI, please read CONTRIBUTING.md once it is published.


🪪 License

Distributed under the MIT License.
See LICENSE for details.


If you find TokenFlow.AI useful, please give the repository a star on GitHub!
It helps others discover the project and supports ongoing development.

Compatible and additional computed target framework versions:

  • .NET: net8.0 is compatible. net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed.
  • .NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, and netcoreapp3.1 were computed.
  • .NET Standard: netstandard2.0 is compatible. netstandard2.1 was computed.
  • .NET Framework: net461, net462, net463, net47, net471, net472, net48, and net481 were computed.
  • MonoAndroid: monoandroid was computed.
  • MonoMac: monomac was computed.
  • MonoTouch: monotouch was computed.
  • Tizen: tizen40 and tizen60 were computed.
  • Xamarin.iOS: xamarinios was computed.
  • Xamarin.Mac: xamarinmac was computed.
  • Xamarin.TVOS: xamarintvos was computed.
  • Xamarin.WatchOS: xamarinwatchos was computed.

Learn more about Target Frameworks and .NET Standard.

Included target frameworks (in package):

  • .NETStandard 2.0 (no dependencies)
  • net8.0 (no dependencies)

NuGet packages (3)

Showing the top 3 NuGet packages that depend on TokenFlow.Core:

  • TokenFlow.AI: Lightweight .NET library for tokenization, chunking, and cost estimation across modern LLMs (OpenAI GPT-4o, Claude, Azure OpenAI).
  • TokenFlow.Tokenizers: Model-specific tokenizer implementations for TokenFlow.AI (OpenAI, Claude, etc.).
  • PromptStream.AI: Token-aware prompt composition, validation, and conversational context toolkit for .NET.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
0.8.1 277 10/13/2025
0.8.0 278 10/12/2025
0.7.8 425 10/9/2025
0.7.5 189 10/9/2025
0.7.4 189 10/9/2025
0.7.2 191 10/9/2025
0.7.1 193 10/9/2025
0.7.0 198 10/9/2025
0.4.0 195 10/7/2025
0.3.0 202 10/6/2025