Radiate 1.0.7

dotnet add package Radiate --version 1.0.7

NuGet\Install-Package Radiate -Version 1.0.7
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

<PackageReference Include="Radiate" Version="1.0.7" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

paket add Radiate --version 1.0.7

#r "nuget: Radiate, 1.0.7"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or into the source code of a script to reference the package.
// Install Radiate as a Cake Addin
#addin nuget:?package=Radiate&version=1.0.7

// Install Radiate as a Cake Tool
#tool nuget:?package=Radiate&version=1.0.7

A .NET port of the Rust crate radiate.

Algorithms

  1. Random Forest
  2. Support Vector Machine
  3. MultiLayer Perceptron
    • Dense Layer
    • Dropout Layer
    • Flatten Layer
    • LSTM Layer
    • Convolutional Layer
    • MaxPooling Layer
  4. KMeans Clustering
  5. Evolution Engine
    • Evolve any object which implements the Genome base class. Implementation of NEAT for evolving NeuralNetworks is included.

Callbacks

Similar to Keras Callbacks. Hook into the training loop with custom code. Create an object that implements any or all of the following:

  1. IEpochStartedCallback
  2. IBatchCompletedCallback
  3. IEpochCompletedCallback
  4. ITrainingCompletedCallback

See the examples for sample callbacks, including VerboseTrainingCallback, which prints training progress to the console.
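
As a rough illustration, a custom callback can be a small class implementing one of these interfaces. The member signature below is an assumption (loosely based on the epoch.Index value used with Train later on this page); the actual interface members may differ.

using System;

// Hypothetical sketch: the exact members of IEpochCompletedCallback are not
// shown on this page, so the method name and Epoch parameter are assumptions.
public class ConsoleEpochCallback : IEpochCompletedCallback
{
    public void EpochCompleted(Epoch epoch)
        => Console.WriteLine($"Epoch {epoch.Index} finished.");
}

A callback like this would then be passed to the Optimizer<T> in the same List<ITrainingCallback> as VerboseTrainingCallback, as the examples below show.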

Feature Engineering

Functional feature engineering with TensorTrainSet. Transform the input data (features, targets) with the options below; a short pipeline sketch follows the list.

  1. Batch - Set a batch size to train on.
  2. Layer - Layer data by n rows.
  3. Split - Split the data into a training set and testing set. Default is 75% split training, 25% testing.
  4. Reshape - Reshape the row vector to a shape of (height, width, depth), useful for images.
  5. Pad - Pad an image Tensor with n zeros.
  6. Shuffle - Shuffle the rows of the dataset randomly.
  7. Kernel - Add kernel transform for the features, possible options are RBF, Polynomial, and Linear (None).
  8. TransformFeatures - Transform the feature data. Options are Normalize, Standardize, OHE (One Hot Encode), and Image (divide data point by 255).
  9. TransformTargets - Transform the target data. Options same as above.
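
For instance, a minimal pipeline chaining a few of these options could look like the sketch below. rawFeatures and rawTargets stand in for your own input data, and Norm.Standardize is inferred from the transform options above together with the Norm.Image/Norm.OHE usage in the examples further down.

var pair = new TensorTrainSet(rawFeatures, rawTargets)
    .Shuffle()                              // randomize row order
    .TransformFeatures(Norm.Standardize)    // standardize the feature columns
    .TransformTargets(Norm.OHE)             // one hot encode the targets
    .Batch(32)                              // train on batches of 32 rows
    .Split();                               // default 75% train / 25% test split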

Model saving and loading

Save an Optimizer<T> model like so:

var optimizer = new Optimizer<RandomForest>(forest, tensorTrainSet);
var wrapped = optimizer.Save();

The Optimizer<T> is not JSON-serializable, but the OptimizerWrap is, so the Optimizer<T> must be converted to that concrete object before serializing.

OptimizerWrap contains three items:

  1. TensorTrainSet options, the options used to transform the input features/targets. During prediction, the Optimizer<T> uses these options to transform the input vector so it matches the trained features and produces accurate predictions.
  2. LossFunction, the loss function used during training. If you save a model mid training, the loss function is needed when loading back in the model to continue training.
  3. ModelWrap, the machine learning model being trained/used for prediction.
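
Because OptimizerWrap is JSON-serializable, it can also be persisted with any standard serializer. The sketch below uses System.Text.Json and a hypothetical file name purely for illustration; ModelWriter, shown next, is the package's own helper for the same job.

using System.IO;
using System.Text.Json;

var optimizer = new Optimizer<RandomForest>(forest, tensorTrainSet);
var wrapped = optimizer.Save();

// OptimizerWrap is a plain, JSON-serializable object, so a general-purpose
// serializer works here. The file name is only an example.
File.WriteAllText("random-forest.json", JsonSerializer.Serialize(wrapped));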

The Optimizer<T> can also be converted to a JSON string or a memory stream like so:

var optimizer = new Optimizer<RandomForest>(forest, tensorTrainSet);
var jsonString = ModelWriter.ToJson<RandomForest>(optimizer);
var stream = ModelWriter.ToStream<RandomForest>(optimizer);

Loading an Optimizer<T> back in from either of the above:

var optimizer = ModelReader.FromJson<RandomForest>(jsonString);
var optimizer = ModelReader.FromStream<RandomForest>(stream);
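
Putting saving and loading together, a round trip through a file on disk (the file name is only an example) might look like:

using System.IO;

// Serialize the trained optimizer to JSON and write it to disk.
var jsonString = ModelWriter.ToJson<RandomForest>(optimizer);
File.WriteAllText("random-forest.json", jsonString);

// Later: read the JSON back and rebuild the optimizer to predict with it
// or to continue training.
var restored = ModelReader.FromJson<RandomForest>(File.ReadAllText("random-forest.json"));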

Make predictions

Because the Optimizer<T> carries the TensorTrainSet, it can transform a given float[] into acceptable model input even after a model has been saved and loaded. This makes prediction as simple as:

var vectorToPredict = new float[] { 1f, 2f, 3f, 4f, 5f };
var prediction = trainedOptimizer.Predict(vectorToPredict);
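
The same holds for a model loaded back in from JSON or a stream; for example (the feature values here are arbitrary):

var restored = ModelReader.FromJson<RandomForest>(jsonString);
var prediction = restored.Predict(new float[] { 5.1f, 3.5f, 1.4f, 0.2f });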

Random

All random numbers are generated by calling RandomGenerator.Next. This allows for random seeds to be set for model building.

RandomGenerator.Seed = 3;

This sets a fixed random seed for the Optimizer<T>.
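
For example, setting the seed before any shuffling, splitting, or training happens makes runs repeatable. A usage sketch, where rawFeatures and rawLabels stand in for your own data:

// Seed first so that Shuffle()/Split() and any random weight initialization
// draw the same sequence of numbers on every run.
RandomGenerator.Seed = 3;

var pair = new TensorTrainSet(rawFeatures, rawLabels)
    .Shuffle()
    .Split();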

Loss functions

  1. CrossEntropy
  2. Difference
  3. Hinge
  4. MeanSquaredError

Activation Functions

  1. Sigmoid
  2. ReLu
  3. Tanh
  4. Linear
  5. SoftMax
  6. ExpSigmoid
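
Activations are chosen per layer, for example through DenseInfo as in the examples below. A minimal sketch:

var network = new MultiLayerPerceptron()
    .AddLayer(new DenseInfo(32, Activation.ReLu))       // hidden layer, ReLu
    .AddLayer(new DenseInfo(1, Activation.Sigmoid));    // output layer, Sigmoid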

Gradients

These only apply to the MultiLayerPerceptron and SupportVectorMachine.

  1. Adam
  2. SGD

Examples

The datasets below come from Radiate.Data, which provides easy access to common machine learning datasets.

Convolutional Neural Network on the MNIST handwritten digits dataset

<img src="https://camo.githubusercontent.com/01c057a753e92a9bc70b8c45d62b295431851c09cffadf53106fc0aea7e2843f/687474703a2f2f692e7974696d672e636f6d2f76692f3051493378675875422d512f687164656661756c742e6a7067" width="300px">

const int featureLimit = 5000;
const int batchSize = 128;
const int maxEpochs = 10;

var (rawInputs, rawLabels) = await new Mnist(featureLimit).GetDataSet();

var pair = new TensorTrainSet(rawInputs, rawLabels)
    .Reshape(new Shape(28, 28, 1))
    .TransformFeatures(Norm.Image)
    .TransformTargets(Norm.OHE)
    .Batch(batchSize)
    .Split();

var neuralNetwork = new MultiLayerPerceptron()
    .AddLayer(new ConvInfo(64, 3))
    .AddLayer(new MaxPoolInfo(2))
    .AddLayer(new FlattenInfo())
    .AddLayer(new DenseInfo(64, Activation.Sigmoid))
    .AddLayer(new DenseInfo(pair.OutputCategories, Activation.SoftMax));

var optimizer = new Optimizer<MultiLayerPerceptron>(neuralNetwork, pair, new List<ITrainingCallback>
{
    new VerboseTrainingCallback(pair, maxEpochs),
    new ConfusionMatrixCallback()
});

await optimizer.Train(epoch => maxEpochs == epoch.Index);

Random Forest on the Iris Flowers dataset

<img src="https://upload.wikimedia.org/wikipedia/commons/5/56/Iris_dataset_scatterplot.svg" width="300px">

const int numTrees = 10;
const int maxDepth = 10;
const int minSampleSplit = 2;

var (rawFeatures, rawLabels) = await new IrisFlowers().GetDataSet();
var pair = new TensorTrainSet(rawFeatures, rawLabels)
    .Shuffle()
    .Split()
    .Batch(rawFeatures.Count);

var forest = new RandomForest(numTrees, new ForestInfo(minSampleSplit, maxDepth));
var optimizer = new Optimizer<RandomForest>(forest, pair, new List<ITrainingCallback>
{
    new VerboseTrainingCallback(pair),
    new ModelWriterCallback(),
    new ConfusionMatrixCallback()
});

await optimizer.Train();

More

See the examples for how to use the API.

Compatible and additional computed target framework versions

.NET: net6.0 is compatible. The following target frameworks were computed as compatible: net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows.

NuGet packages (5)

Showing the top 5 NuGet packages that depend on Radiate:

  1. Radiate.Extensions
  2. Souk
  3. Radiate.Data
  4. Souk.Radiate
  5. Radiate.Genetics

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
1.6.2 61 5/8/2024
1.6.1 86 4/27/2024
1.6.0 87 4/27/2024
1.5.9 84 4/27/2024
1.5.8 86 4/27/2024
1.5.7 80 4/22/2024
1.5.6 97 4/21/2024
1.5.5 89 4/11/2024
1.5.4 95 3/24/2024
1.5.3 114 2/27/2024
1.5.2 89 2/26/2024
1.5.1 98 2/26/2024
1.5.0 104 1/31/2024
1.4.9 88 1/25/2024
1.4.8 83 1/24/2024
1.4.7 81 1/24/2024
1.4.6 105 1/23/2024
1.4.5 167 12/17/2023
1.4.4 172 8/10/2023
1.4.3 156 8/6/2023
1.4.2 157 6/5/2023
1.4.1 216 3/14/2023
1.4.0 234 3/8/2023
1.3.9 232 3/2/2023
1.3.8 248 3/1/2023
1.3.7 241 2/27/2023
1.3.6 252 2/15/2023
1.3.5 309 11/29/2022
1.3.4 293 11/29/2022
1.3.3 372 11/28/2022
1.3.2 311 11/28/2022
1.3.1 312 11/28/2022
1.3.0 306 11/27/2022
1.2.9 315 11/27/2022
1.2.8 444 11/26/2022
1.2.7 301 11/26/2022
1.2.6 416 7/25/2022
1.2.5 392 7/25/2022
1.2.4 398 7/25/2022
1.2.3 404 7/25/2022
1.2.2 392 7/25/2022
1.2.1 418 6/22/2022
1.2.0 401 6/21/2022
1.1.9 423 6/21/2022
1.1.8 434 6/21/2022
1.1.7 403 6/20/2022
1.1.6 454 4/14/2022
1.1.5 418 4/14/2022
1.1.4 414 4/14/2022
1.1.3 420 4/14/2022
1.1.2 403 4/14/2022
1.1.1 420 4/14/2022
1.1.0 424 4/14/2022
1.0.9 444 2/14/2022
1.0.8 424 2/14/2022
1.0.7 410 2/14/2022
1.0.6 438 1/20/2022
1.0.5 265 1/15/2022
1.0.4 267 1/15/2022
1.0.3 471 1/12/2022
1.0.2 445 1/12/2022
1.0.1 315 1/12/2022
1.0.0 422 1/12/2022