PicoLLM 1.2.0

.NET CLI:

dotnet add package PicoLLM --version 1.2.0

Package Manager (run within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package):

NuGet\Install-Package PicoLLM -Version 1.2.0

PackageReference (for projects that support PackageReference, copy this XML node into the project file to reference the package):

<PackageReference Include="PicoLLM" Version="1.2.0" />

Paket CLI:

paket add PicoLLM --version 1.2.0

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks; copy this into the interactive tool or the source code of the script to reference the package):

#r "nuget: PicoLLM, 1.2.0"

Cake:

// Install PicoLLM as a Cake Addin
#addin nuget:?package=PicoLLM&version=1.2.0

// Install PicoLLM as a Cake Tool
#tool nuget:?package=PicoLLM&version=1.2.0

picoLLM Inference Engine .NET Binding

Made in Vancouver, Canada by Picovoice

picoLLM Inference Engine

picoLLM Inference Engine is a highly accurate and cross-platform SDK optimized for running compressed large language models. picoLLM Inference Engine is:

  • Accurate; picoLLM Compression improves GPTQ by significant margins
  • Private; LLM inference runs 100% locally.
  • Cross-Platform
  • Runs on CPU and GPU
  • Free for open-weight models

Requirements

Compatibility

Platform compatible with .NET Framework 4.6.1+:

  • Windows (x86_64)

Platforms compatible with .NET Core 2.0+:

  • macOS (x86_64)
  • Windows (x86_64)

Platforms compatible with .NET 6.0+:

  • Raspberry Pi (4 and 5)
  • Linux (x86_64)
  • macOS (arm64)
  • Windows (arm64)

Installation

You can install the latest version of picoLLM by adding the latest PicoLLM NuGet package in Visual Studio or by using the .NET CLI:

dotnet add package PicoLLM

Models

picoLLM Inference Engine supports a selection of open-weight models. The models are available from Picovoice Console.

  • Gemma
    • gemma-2b
    • gemma-2b-it
    • gemma-7b
    • gemma-7b-it
  • Llama-2
    • llama-2-7b
    • llama-2-7b-chat
    • llama-2-13b
    • llama-2-13b-chat
    • llama-2-70b
    • llama-2-70b-chat
  • Llama-3
    • llama-3-8b
    • llama-3-8b-instruct
    • llama-3-70b
    • llama-3-70b-instruct
  • Llama-3.2
    • llama3.2-1b-instruct
    • llama3.2-3b-instruct
  • Mistral
    • mistral-7b-v0.1
    • mistral-7b-instruct-v0.1
    • mistral-7b-instruct-v0.2
  • Mixtral
    • mixtral-8x7b-v0.1
    • mixtral-8x7b-instruct-v0.1
  • Phi-2
    • phi2
  • Phi-3
    • phi3
  • Phi-3.5
    • phi3.5

AccessKey

AccessKey is your authentication and authorization token for deploying Picovoice SDKs, including picoLLM. Anyone who is using Picovoice needs to have a valid AccessKey, and you must keep your AccessKey secret. Internet connectivity is required to validate your AccessKey with Picovoice license servers, even though LLM inference runs 100% offline and is completely free for open-weight models. Everyone who signs up for Picovoice Console receives a unique AccessKey.

Usage

Create an instance of the engine and generate a prompt completion:

using Pv;

PicoLLM pllm = PicoLLM.Create("${ACCESS_KEY}", "${MODEL_PATH}");

PicoLLMCompletion res = pllm.Generate("${PROMPT}");
Console.WriteLine(res.Completion);

Replace ${ACCESS_KEY} with your AccessKey obtained from Picovoice Console, ${MODEL_PATH} with the path to a model file downloaded from Picovoice Console, and ${PROMPT} with a prompt string.
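
Generate also exposes optional generation controls and token streaming. The sketch below is illustrative only; the parameter names shown (completionTokenLimit, stopPhrases, temperature, topP, streamCallback) are assumptions and should be verified against the PicoLLM .NET API reference:

// A sketch of parameterized, streaming generation. Parameter names are
// assumptions; check the PicoLLM .NET API docs for the exact signature.
PicoLLMCompletion res = pllm.Generate(
    "${PROMPT}",
    completionTokenLimit: 256,                      // cap the number of generated tokens
    stopPhrases: new string[] { "${STOP_PHRASE}" }, // stop early on any of these phrases
    temperature: 0.7f,                              // > 0 enables sampling
    topP: 0.9f,                                     // nucleus-sampling cutoff
    streamCallback: token => Console.Write(token)); // print tokens as they are produced
Console.WriteLine();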

Instruction-tuned models (e.g., llama-3-8b-instruct, llama-2-7b-chat, and gemma-2b-it) have a specific chat template. You can either directly format the prompt or use a dialog helper:

var dialog = pllm.GetDialog();
dialog.AddHumanRequest(prompt);

PicoLLMCompletion res = pllm.Generate(dialog.Prompt());
dialog.AddLLMResponse(res.Completion);
Console.WriteLine(res.Completion);
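
For multi-turn chat, reuse the same dialog object across turns so the full conversation history is included in each prompt. A minimal sketch built from the calls shown above (reading input from the console is just illustrative):

while (true)
{
    Console.Write("> ");
    string prompt = Console.ReadLine();
    if (string.IsNullOrEmpty(prompt))
    {
        break;  // empty line ends the chat
    }

    dialog.AddHumanRequest(prompt);
    PicoLLMCompletion turn = pllm.Generate(dialog.Prompt());
    dialog.AddLLMResponse(turn.Completion);
    Console.WriteLine(turn.Completion);
}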

To interrupt completion generation before it has finished:

pllm.Interrupt();
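
Interrupt() is typically called from a different thread than the one running Generate, for example in response to a key press or a timeout. A minimal sketch, assuming an interrupted Generate call returns with whatever completion has been produced so far:

// Requires: using System.Threading; using System.Threading.Tasks;
// Run generation on a background task so the caller stays free to interrupt it.
Task<PicoLLMCompletion> generation = Task.Run(() => pllm.Generate("${PROMPT}"));

Thread.Sleep(1000);  // e.g., a timeout elapsed or the user pressed a key
pllm.Interrupt();    // ask the engine to stop generating

PicoLLMCompletion res = generation.Result;  // assumed to hold the partial completion
Console.WriteLine(res.Completion);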

PicoLLM's resources will eventually be freed by the garbage collector; to free them immediately after use, wrap the instance in a using statement or call .Dispose() directly:

using(PicoLLM pllm = PicoLLM.Create(accessKey, modelPath))
{
    // .. picoLLM usage here
}
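
Equivalently, when the instance's lifetime cannot be expressed with a using block, call Dispose() explicitly, for example in a finally block:

PicoLLM pllm = PicoLLM.Create(accessKey, modelPath);
try
{
    // .. picoLLM usage here
}
finally
{
    pllm.Dispose();  // releases resources immediately
}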

Demos

The picoLLM dotnet demo project is a .NET application that provides command-line utilities for LLM completion and chat using picoLLM.

Target frameworks

Compatible target frameworks declared in the package: netstandard2.0, netcoreapp3.0, net6.0, and net8.0.

Additional computed compatible frameworks: net5.0 through net9.0 (and their platform-specific variants such as -windows, -android, -ios, -maccatalyst, -macos, -tvos, and -browser), netcoreapp2.0 through netcoreapp3.1, netstandard2.1, .NET Framework net461 through net481, MonoAndroid, MonoMac, MonoTouch, Tizen (tizen40, tizen60), Xamarin.iOS, Xamarin.Mac, Xamarin.TVOS, and Xamarin.WatchOS.

Dependencies (per target framework)

  • .NETCoreApp 3.0: No dependencies.
  • .NETStandard 2.0: No dependencies.
  • net6.0: No dependencies.
  • net8.0: No dependencies.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories (1)

Showing the top 1 popular GitHub repository that depends on PicoLLM:

  • Picovoice/picollm: On-device LLM Inference Powered by X-Bit Quantization

Version history

Version   Downloads   Last updated
1.2.0     50          1/30/2025