DiffSharp 0.7.7

DiffSharp is an automatic differentiation (AD) library.

AD allows exact and efficient calculation of derivatives by systematically invoking the chain rule of calculus at the elementary operator level during program execution. It differs from numerical differentiation, which is prone to truncation and round-off errors, and from symbolic differentiation, which suffers from expression swell and cannot fully handle algorithmic control flow.
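To make this concrete, here is a minimal, self-contained sketch of forward-mode AD using dual numbers (an illustration of the technique only, not DiffSharp's API): each value carries its derivative alongside it, and every elementary operation applies the chain rule to both components.

    // Minimal forward-mode AD with dual numbers: each value carries a
    // primal (P) and a tangent (T), and elementary ops apply the chain rule.
    type Dual = { P: float; T: float }

    let dual p t = { P = p; T = t }
    let mul a b = dual (a.P * b.P) (a.T * b.P + a.P * b.T)   // product rule
    let sinD a  = dual (sin a.P) (cos a.P * a.T)             // chain rule for sin

    // d/dx of x * sin x at x = 2; exact, with no truncation error
    let x = dual 2.0 1.0                                     // seed dx/dx = 1
    let y = mul x (sinD x)
    printfn "value = %f, derivative = %f" y.P y.T            // sin 2 + 2 cos 2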

Using the DiffSharp library, derivative calculations (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) can be incorporated with minimal change into existing algorithms. DiffSharp supports nested forward and reverse AD up to any level, meaning that you can compute exact higher-order derivatives or differentiate functions that internally make use of differentiation. Please see the API Overview page for a list of available operations.
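For a flavor of the API, the following is a brief sketch assuming the 0.7-series DiffSharp.AD.Float64 module (the definitive set of operations is listed on the API Overview page):

    open DiffSharp.AD.Float64

    // Derivative of a scalar-to-scalar function at x = 2
    let f (x: D) = sin (sqrt x)
    let df = diff f (D 2.)

    // Gradient and matrix-free Hessian-vector product of a
    // vector-to-scalar function at (1, 2)
    let g (x: DV) = exp (x.[0] * x.[1])
    let gg = grad g (toDV [1.; 2.])
    let hv = hessianv g (toDV [1.; 2.]) (toDV [1.; 0.])

    // Nested AD: differentiate a function that itself differentiates;
    // d/dx [x * d/dy (x * y)] = d/dx [x * x] = 2x = 6 at x = 3
    let dd = diff (fun x -> x * diff (fun y -> x * y) (D 2.)) (D 3.)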

The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth.

DiffSharp is implemented in F# and can be used from C# and other languages running on Mono or the .NET Framework, targeting the 64-bit platform. It is tested on Linux and Windows. We are working on interfaces/ports to other languages.

There is a newer prerelease version of this package available; see the version list below for details.

To install version 0.7.7, use one of the following:

Package Manager:  Install-Package DiffSharp -Version 0.7.7
.NET CLI:         dotnet add package DiffSharp --version 0.7.7
PackageReference: <PackageReference Include="DiffSharp" Version="0.7.7" /> (copy this XML node into the project file of any project that supports PackageReference)
Paket:            paket add DiffSharp --version 0.7.7

Release Notes

Please visit https://github.com/DiffSharp/DiffSharp/releases for the latest release notes.

NuGet packages (1)

One NuGet package depends on DiffSharp:

Hype
Hype is a proof-of-concept deep learning library, where you can perform optimization on compositional machine learning systems of many components, even when such components themselves internally perform optimization. This is enabled by nested automatic differentiation (AD) giving you access to the automatic exact derivative of any floating-point value in your code with respect to any other. Underlying computations are run by a BLAS/LAPACK backend (OpenBLAS by default).
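As a toy illustration of the kind of nesting Hype builds on (written directly against DiffSharp's 0.7 API rather than Hype's own, and a sketch rather than code from either library): an inner gradient-descent loop minimizes (x - c)^2, and an outer call differentiates the result of that inner optimization with respect to c.

    open DiffSharp.AD.Float64

    // Inner optimization: minimize (x - c)^2 by 100 steps of gradient descent
    let innerMin (c: D) =
        let mutable x = D 0.
        for _ in 1 .. 100 do
            x <- x - D 0.1 * diff (fun x' -> (x' - c) * (x' - c)) x
        x

    // Outer differentiation: how does the inner minimizer move with c?
    // The exact minimizer is x = c, so the derivative is approximately 1.
    let dInner = diff innerMin (D 3.)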

GitHub repositories

This package is not used by any popular GitHub repositories.

Version History

Version Downloads Last updated
0.8.4-beta 535 8/24/2019
0.8.3-beta 242 7/4/2019
0.8.2-beta 227 6/25/2019
0.8.1-beta 218 6/20/2019
0.8.0-beta 228 6/11/2019
0.7.7 2,994 12/25/2015
0.7.6 612 12/15/2015
0.7.5 606 12/6/2015
0.7.4 661 10/13/2015
0.7.3 581 10/6/2015
0.7.2 569 10/4/2015
0.7.1 565 10/4/2015
0.7.0 569 9/29/2015
0.6.3 1,127 7/18/2015
0.6.2 578 6/6/2015
0.6.1 574 6/2/2015
0.6.0 569 4/26/2015
0.5.10 591 3/27/2015
0.5.9 557 2/26/2015
0.5.8 640 2/23/2015
0.5.7 552 2/17/2015
0.5.6 592 2/13/2015
0.5.5 556 12/15/2014
0.5.4 712 11/23/2014
0.5.3 1,289 11/7/2014
0.5.2 1,283 11/4/2014
0.5.1 569 10/27/2014
0.5.0 593 10/2/2014