GradientDescentSharp 0.0.5

There is a newer version of this package available.
See the version list below for details.
dotnet add package GradientDescentSharp --version 0.0.5                
NuGet\Install-Package GradientDescentSharp -Version 0.0.5                
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="GradientDescentSharp" Version="0.0.5" />                
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add GradientDescentSharp --version 0.0.5                
#r "nuget: GradientDescentSharp, 0.0.5"                
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install GradientDescentSharp as a Cake Addin
#addin nuget:?package=GradientDescentSharp&version=0.0.5

// Install GradientDescentSharp as a Cake Tool
#tool nuget:?package=GradientDescentSharp&version=0.0.5                

Warning, API change: after calling any neural network learning method, you now need to call the Learn() method to apply the update:

var result = nn.Backwards(input, expected);
result.Learn();   // apply the last backpropagation
result.Unlearn(); // undo the last backpropagation

GradientDescentSharp

This little library computes an approximate solution to a problem defined by an error function, using gradient descent.

Simple example:

// first, define a problem
var problem = (IDataAccess<double> x) =>
{
    var n = x[0];
    // we seek a value n such that n^n = 5
    var needToMinimize = Math.Pow(n, n) - 5.0;
    return Math.Abs(needToMinimize);
};
// then define the variables to optimize
var variables = new ArrayDataAccess<double>(1);
// initialize the variables close to the global minimum
variables[0] = 1;

// define the descent
var descent = new MineDescent(variables, problem)
{
    DescentRate = 0.1,             // how fast to descend; this value is adjusted on the fly
    Theta = 1e-4,                  // required precision of the found minimum
    DescentRateDecreaseRate = 0.1, // factor applied to DescentRate when the error function grows
    Logger = new ConsoleLogger()   // logger for descent progress
};

// run at most 30 iterations
descent.Descent(30);

System.Console.WriteLine("For problem n^n=5");
System.Console.WriteLine($"Error is {problem(variables)}");
System.Console.WriteLine($"n={variables[0]}");
System.Console.WriteLine($"n^n={Math.Pow(variables[0], variables[0])}");

Output

--------------Mine descent began
Error is 3.8894657589454242
Changed by 0.11053424105457577
-------------
...
-------------
Error is 0.2503619082577506
Changed by 0.7496380917422432
-------------
Error is 0.66669577875009
Changed by 0.4163338704923394
Undo step. Decreasing descentRate.
-------------
...
-------------
Error is 0.00015378896740614323
Changed by 8.779536614600403E-05
--------------Mine done in 24 iterations
For problem n^n=5
Error is 0.00015378896740614323
n=2.1293900000000012
n^n=5.000153788967406

As you can see, when the error function grows, the descent undoes the step and decreases DescentRate, so we are guaranteed to converge to a local minimum!
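The undo-and-shrink behavior can be sketched outside the library as well. Below is a minimal Python re-implementation of the same idea for the n^n = 5 problem (a hypothetical sketch with a finite-difference gradient, not the library's actual algorithm; function and parameter names are mine):

```python
import math

def descend(problem, x, rate=0.1, theta=1e-4, decrease=0.1, max_iters=100):
    """Gradient descent that undoes a step and shrinks the rate when the error grows."""
    eps = 1e-8
    error = problem(x)
    for _ in range(max_iters):
        if error < theta:
            break
        # finite-difference estimate of the gradient
        grad = []
        for i in range(len(x)):
            bumped = list(x)
            bumped[i] += eps
            grad.append((problem(bumped) - error) / eps)
        candidate = [xi - rate * g for xi, g in zip(x, grad)]
        new_error = problem(candidate)
        if new_error > error:
            rate *= decrease  # undo the step and decrease the descent rate
        else:
            x, error = candidate, new_error
    return x, error

# we seek n such that n^n = 5
problem = lambda v: abs(v[0] ** v[0] - 5.0)
x, err = descend(problem, [1.0])
print(f"n={x[0]}, n^n={x[0] ** x[0]}, error={err}")
```

With these settings the sketch converges to n ≈ 2.129, the same minimum the library finds; the exact iteration count differs, since the rate-adjustment details here are guesses.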

It is still very important to define a good error function and good initial variable values.

I also have a well-working feed-forward neural network implementation here.

It can learn from an error function alone, like a simplified form of reinforcement learning, and there is also a way to do continuous learning, all with one simple class, without much hassle or extra abstraction layers.
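The "learn from an error function only" idea boils down to treating the network's weights as the variables of a descent problem. Here is a hypothetical Python sketch of that principle (not the library's API; the network shape, target function, and all names are mine), training a tiny network using nothing but a scalar error and finite-difference gradients:

```python
import math
import random

random.seed(0)

def net(weights, x):
    """A tiny 1-2-1 feed-forward network; weights is a flat list of 7 values."""
    w1, b1, w2, b2, v1, v2, c = weights
    h1 = math.tanh(w1 * x + b1)
    h2 = math.tanh(w2 * x + b2)
    return v1 * h1 + v2 * h2 + c

def error(weights):
    """Mean squared error against the target y = x^2 on a few sample points."""
    pts = [-1.0, -0.5, 0.0, 0.5, 1.0]
    return sum((net(weights, x) - x * x) ** 2 for x in pts) / len(pts)

def learn(weights, rate=0.05, eps=1e-6, iters=2000):
    """Descend on the scalar error alone, with no explicit backpropagation."""
    for _ in range(iters):
        e = error(weights)
        grad = []
        for i in range(len(weights)):
            bumped = list(weights)
            bumped[i] += eps
            grad.append((error(bumped) - e) / eps)
        weights = [w - rate * g for w, g in zip(weights, grad)]
    return weights

w0 = [random.uniform(-1, 1) for _ in range(7)]
w = learn(w0)
print(f"error before: {error(w0):.4f}, after: {error(w):.4f}")
```

This brute-force approach costs one extra error evaluation per weight per step, so a real implementation would presumably use backpropagation where possible; the point is only that a scalar error function is enough to drive learning.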

Check out the neural network playground examples.

Compatible and additional computed target framework versions:
.NET net6.0 is compatible.  net6.0-android was computed.  net6.0-ios was computed.  net6.0-maccatalyst was computed.  net6.0-macos was computed.  net6.0-tvos was computed.  net6.0-windows was computed.  net7.0 was computed.  net7.0-android was computed.  net7.0-ios was computed.  net7.0-maccatalyst was computed.  net7.0-macos was computed.  net7.0-tvos was computed.  net7.0-windows was computed.  net8.0 was computed.  net8.0-android was computed.  net8.0-browser was computed.  net8.0-ios was computed.  net8.0-maccatalyst was computed.  net8.0-macos was computed.  net8.0-tvos was computed.  net8.0-windows was computed. 

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
0.0.9 180 9/27/2023
0.0.8 154 9/27/2023
0.0.7 150 9/25/2023
0.0.6 165 5/29/2023
0.0.5 159 5/24/2023
0.0.4 158 5/21/2023
0.0.3 155 5/21/2023
0.0.2 168 5/20/2023
0.0.1 158 5/15/2023