Gdo.koryakinp 2.0.0

Gradient Descent Optimization algorithms for .NET Core

There is a newer version of this package available.
See the version list below for details.
Package Manager:  Install-Package Gdo.koryakinp -Version 2.0.0
.NET CLI:         dotnet add package Gdo.koryakinp --version 2.0.0
PackageReference: <PackageReference Include="Gdo.koryakinp" Version="2.0.0" />
Paket CLI:        paket add Gdo.koryakinp --version 2.0.0
F# Interactive:   #r "nuget: Gdo.koryakinp, 2.0.0"

For projects that support PackageReference, copy the XML node into the project file. For F# scripts that support #r syntax, copy the #r line into the source code.

Gdo

Gradient Descent Optimization algorithms for .NET Core

  1. Adagrad
  2. Adam
  3. Adadelta
  4. RMSprop

Installing

PM> Install-Package Gdo.koryakinp

Example

Compute a derivative dx and provide it to the Update() method of the optimizer:

var opt1 = new Adagrad(0.1); // 0.1 learning rate
opt1.SetValue(10);           // set the initial value
var dx = 2.5;                // derivative computed elsewhere (placeholder value)
opt1.Update(dx);             // apply one optimization step
var res1 = opt1.Value;       // read the updated value

The Adagrad optimizer will use a 0.1 learning rate to update the value.
Similarly, you can use the other optimizers:

Adam

var opt2 = new Adam(0.01, 100, 1000);
opt2.SetValue(10);
opt2.Update(dx);
var res2 = opt2.Value;

RMSprop

var opt3 = new RMSprop(0.1, 25);
opt3.SetValue(10);
opt3.Update(dx);
var res3 = opt3.Value;

Adadelta

var opt4 = new Adadelta(0.1, 25);
opt4.SetValue(10);
opt4.Update(dx);
var res4 = opt4.Value;
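
Putting the pieces together, below is a minimal sketch of a complete optimization loop that minimizes f(x) = x² with Adagrad by recomputing the derivative dx = 2x before every update. It assumes only the API shown above (SetValue(), Update(), and the Value property); the namespace, iteration count, and starting point are illustrative.

using System;
using Gdo; // assumed namespace; adjust to match the package

class Demo
{
    static void Main()
    {
        var opt = new Adagrad(0.1); // 0.1 learning rate
        opt.SetValue(10);           // start at x = 10

        for (var i = 0; i < 1000; i++)
        {
            var dx = 2 * opt.Value; // derivative of f(x) = x^2 at the current value
            opt.Update(dx);         // one optimization step
        }

        Console.WriteLine(opt.Value); // moves toward the minimum at x = 0
    }
}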

Authors

Pavel Koryakin <koryakinp@koryakinp.com>

License

This project is licensed under the MIT License - see the LICENSE.md file for details.

Acknowledgments

Release Notes

- Changed Compute() to Update()
- Added SetValue()
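
For code written against 1.0.0, the rename is mechanical. A hedged before/after sketch, assuming the old Compute() took the derivative the way Update() does now:

// 1.0.0 (assumed old API):
// opt.Compute(dx);

// 2.0.0:
opt.SetValue(10); // new in 2.0.0: set the value to be optimized
opt.Update(dx);   // renamed from Compute()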

Dependencies

  • .NETCoreApp 2.0

    • No dependencies.

NuGet packages (1)

One NuGet package depends on Gdo.koryakinp:

  • Ann - Machine Learning library for .NET Core

GitHub repositories

This package is not used by any popular GitHub repositories.

Version History

Version  Downloads  Last updated
2.1.6    467        12/7/2018
2.1.5    390        11/24/2018
2.1.4    395        7/31/2018
2.1.3    458        7/13/2018
2.1.2    445        7/13/2018
2.1.1    434        7/13/2018
2.1.0    433        7/12/2018
2.0.0    537        3/25/2018
1.0.0    484        3/25/2018