TensorStack.Providers.DML 0.1.18 (Prefix Reserved)

dotnet add package TensorStack.Providers.DML --version 0.1.18

NuGet\Install-Package TensorStack.Providers.DML -Version 0.1.18

The Install-Package command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

<PackageReference Include="TensorStack.Providers.DML" Version="0.1.18" />

For projects that support PackageReference, copy this XML node into the project file to reference the package.

<PackageVersion Include="TensorStack.Providers.DML" Version="0.1.18" />
<PackageReference Include="TensorStack.Providers.DML" />

For projects that support Central Package Management (CPM), copy the PackageVersion node into the solution's Directory.Packages.props file and the versionless PackageReference node into the project file.

paket add TensorStack.Providers.DML --version 0.1.18

The NuGet Team does not provide support for this client. Please contact its maintainers for support.

    #r "nuget: TensorStack.Providers.DML, 0.1.18"
        
        
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
                    
    
#:package TensorStack.Providers.DML@0.1.18

The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.
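For example, a minimal sketch of a file-based app (run with dotnet run app.cs) that uses the Provider calls described in the README below; this is illustrative only, not an official example:

#:package TensorStack.Providers.DML@0.1.18

using System;
using TensorStack.Providers;

// Initialize the provider and list the devices it reports
Provider.Initialize();
foreach (var device in Provider.GetDevices())
{
    Console.WriteLine($"{device.Type} - ID: {device.DeviceId}");
}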
                    
    
#addin nuget:?package=TensorStack.Providers.DML&version=0.1.18
#tool nuget:?package=TensorStack.Providers.DML&version=0.1.18

The NuGet Team does not provide support for this client. Please contact its maintainers for support.

TensorStack.Providers.DML
The Provider class in the TensorStack.Providers.DML namespace gives you a simple, unified way to access available compute devices and create ONNX Runtime execution providers for them.
Basic Initialization
using TensorStack.Providers;
// Initialize the provider once (recommended at startup)
Provider.Initialize();
// Or initialize with custom environment options
var options = new EnvironmentCreationOptions
{
    // Example: configure threading, logging, or memory behavior
};
Provider.Initialize(options);
Listing Available Devices
// Get all available devices (CPU, GPU, NPU, etc.)
var devices = Provider.GetDevices();
foreach (var device in devices)
{
    Console.WriteLine($"{device.Type} - ID: {device.DeviceId}");
}
Getting a Specific Device
// Get the default CPU device
var cpu = Provider.GetDevice(DeviceType.CPU);
// Get a specific GPU device (e.g., device ID 0)
var gpu = Provider.GetDevice(DeviceType.GPU, 0);
Creating an Execution Provider
using Microsoft.ML.OnnxRuntime;
// Create an ExecutionProvider for the CPU
var cpuProvider = Provider.GetProvider(GraphOptimizationLevel.ORT_ENABLE_ALL);
// Create an ExecutionProvider for a specific device type
var gpuProvider = Provider.GetProvider(DeviceType.GPU, GraphOptimizationLevel.ORT_ENABLE_ALL);
// Create an ExecutionProvider for a specific device and ID
var customProvider = Provider.GetProvider(DeviceType.CPU, 0, GraphOptimizationLevel.ORT_ENABLE_EXTENDED);
Full Example
using System;
using TensorStack.Providers;
using Microsoft.ML.OnnxRuntime;
class Program
{
    static void Main()
    {
        // Initialize TensorStack provider system
        Provider.Initialize();
        // Choose the best CPU device
        var device = Provider.GetDevice();
        // Create an execution provider optimized for inference
        var provider = Provider.GetProvider(device, GraphOptimizationLevel.ORT_ENABLE_ALL);
        Console.WriteLine($"Using device: {device.Type} (ID: {device.DeviceId})");
    }
}
Notes
- Provider.Initialize() only runs once per process; additional calls are ignored.
- GetDevices() automatically initializes the provider if Initialize() hasn't been called yet.
- By default, this provider wraps the CPU execution provider.
- GPU and NPU logic are placeholders and can be extended in your own implementations; see the sketch after these notes.
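As an illustration of combining these calls, the sketch below uses only the API shown above; the GPU-first preference is an assumption for the example, not package behavior. It picks a GPU device when one is reported and otherwise falls back to the default CPU device:

using System;
using TensorStack.Providers;
using Microsoft.ML.OnnxRuntime;

Provider.Initialize();

// Check whether any reported device is a GPU (illustrative preference only)
var hasGpu = false;
foreach (var d in Provider.GetDevices())
{
    if (d.Type == DeviceType.GPU)
        hasGpu = true;
}

// Fall back to the default CPU device when no GPU is reported
var device = hasGpu ? Provider.GetDevice(DeviceType.GPU, 0) : Provider.GetDevice(DeviceType.CPU);
var provider = Provider.GetProvider(device, GraphOptimizationLevel.ORT_ENABLE_ALL);
Console.WriteLine($"Selected: {device.Type} (ID: {device.DeviceId})");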
| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net9.0-windows7.0 is compatible. net10.0-windows was computed. |
        
Dependencies (net9.0-windows7.0):
- Microsoft.ML.OnnxRuntime.DirectML (>= 1.23.0)
- TensorStack.Common (>= 0.1.18)

NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.