Tensor 0.4.11

.NET Standard 2.0
.NET CLI:
    dotnet add package Tensor --version 0.4.11

Package Manager:
    NuGet\Install-Package Tensor -Version 0.4.11
    This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
    <PackageReference Include="Tensor" Version="0.4.11" />
    For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
    paket add Tensor --version 0.4.11

Script & Interactive:
    #r "nuget: Tensor, 0.4.11"
    The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or into the source code of the script to reference the package.

Cake:
    // Install Tensor as a Cake Addin
    #addin nuget:?package=Tensor&version=0.4.11

    // Install Tensor as a Cake Tool
    #tool nuget:?package=Tensor&version=0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
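
     The core features above map onto a small F# surface. The following is a minimal sketch; the HostTensor creation functions, slicing syntax and Tensor reduction functions are taken from the package's documentation, but exact signatures should be treated as assumptions and checked against the Tensor docs:

       open Tensor

       // create a 3x4 tensor in host memory from an index function
       let a = HostTensor.init [3L; 4L] (fun idx -> 10.0 * float idx.[0] + float idx.[1])

       // element-wise arithmetic with broadcasting over the size-1 dimension
       let row = HostTensor.ones<float> [1L; 4L]
       let b = a + row

       // view of the second column (no copy)
       let col = a.[*, 1L]

       // reductions: sum over all elements and sum along axis 0
       let total   = Tensor.sum b
       let colSums = Tensor.sumAxis 0 b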

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
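
     A sketch of the data exchange features; HostTensor.ofList/toList and the HDF5.OpenWrite / HostTensor.write / HostTensor.read names follow the package documentation, but are assumptions to verify:

       open Tensor

       // build a tensor from a standard F# list and convert it back
       let v      = HostTensor.ofList [1.0; 2.0; 3.0]
       let asList = HostTensor.toList v

       // write the tensor to an HDF5 (.h5) file ...
       do
           use hdf = HDF5.OpenWrite "data.h5"
           HostTensor.write hdf "v" v

       // ... and read it back
       use hdfIn = HDF5.OpenRead "data.h5"
       let v2 = HostTensor.read<float> hdfIn "v"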

     Performance:
       - host: SIMD and BLAS accelerated operations
         - by default, Intel MKL is used (shipped with the NuGet package)
         - other BLAS libraries (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU, with cuBLAS used for matrix operations
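
     A sketch of running the same operations on a CUDA GPU; CudaTensor.transfer and HostTensor.transfer are the transfer functions assumed here, and a CUDA-capable GPU and driver are required:

       open Tensor

       let a = HostTensor.init [1000L; 1000L] (fun idx -> float (idx.[0] + idx.[1]))

       // copy to the GPU; subsequent operations execute on the device
       let ca = CudaTensor.transfer a
       let cb = ca * ca + ca          // element-wise, runs on the GPU
       let s  = Tensor.sum cb         // reduction, runs on the GPU

       // copy the result back to host memory
       let b = HostTensor.transfer cb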

     Requirements:
       - Linux, macOS or Windows on x64
       - Linux requires libgomp.so.1 to be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Compatible and additional computed target framework versions:
  .NET: net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows (all computed)
  .NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1 (all computed)
  .NET Standard: netstandard2.0 (compatible); netstandard2.1 (computed)
  .NET Framework: net461, net462, net463, net47, net471, net472, net48, net481 (all computed)
  MonoAndroid: monoandroid (computed)
  MonoMac: monomac (computed)
  MonoTouch: monotouch (computed)
  Tizen: tizen40, tizen60 (both computed)
  Xamarin.iOS: xamarinios (computed)
  Xamarin.Mac: xamarinmac (computed)
  Xamarin.TVOS: xamarintvos (computed)
  Xamarin.WatchOS: xamarinwatchos (computed)

NuGet packages (3)

Showing the top 3 NuGet packages that depend on Tensor:

DeepNet

Deep learning library for F#. Provides symbolic model differentiation, automatic differentiation and compilation to CUDA GPUs. Includes optimizers and model blocks used in deep learning. Make sure to set the platform of your project to x64.

RPlotTools

Tools for plotting using R from F#.

Tensor.Algorithm

Data types:
  - arbitrary precision rational numbers
Matrix algebra (integer, rational):
  - Row echelon form
  - Smith normal form
  - Kernel, cokernel and (pseudo-)inverse
Matrix decomposition (floating point):
  - Principal component analysis (PCA)
  - ZCA whitening
Misc:
  - Bezout's identity
  - Loading of NumPy's .npy and .npz files

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
0.4.11 6,152 5/8/2018
0.4.11-v0.4.11-215 606 5/8/2018
0.4.11-symtensor-core-242 944 11/15/2018
0.4.11-symtensor-core-241 938 11/15/2018
0.4.11-symtensor-core-240 915 11/15/2018
0.4.11-symtensor-core-239 904 11/15/2018
0.4.11-symtensor-core-238 895 11/15/2018
0.4.11-symtensor-core-237 975 11/15/2018
0.4.11-symtensor-core-236 845 11/14/2018
0.4.11-symtensor-core-235 915 11/14/2018
0.4.11-symtensor-core-234 868 11/14/2018
0.4.11-symtensor-core-231 976 11/9/2018
0.4.11-symtensor-core-230 945 11/9/2018
0.4.11-symtensor-core-229 905 11/8/2018
0.4.11-symtensor-core-228 932 11/8/2018
0.4.11-symtensor-core-227 953 10/30/2018
0.4.11-symtensor-core-226 1,000 10/30/2018
0.4.11-symtensor-core-225 910 10/30/2018
0.4.11-develop-216 1,107 5/8/2018
0.4.10-develop-213 1,146 5/8/2018
0.4.10-develop-212 1,101 5/7/2018
0.4.10-develop-211 1,154 5/7/2018
0.3.0.712-master 882 9/1/2017
0.3.0.711-master 888 9/1/2017
0.3.0.710-master 863 9/1/2017
0.3.0.709-master 865 8/31/2017
0.3.0.708-master 878 8/30/2017
0.3.0.707-master 866 8/30/2017
0.3.0.706-master 871 8/30/2017
0.3.0.701-master 915 6/26/2017
0.3.0.700-master 906 6/22/2017
0.3.0.699-master 890 6/22/2017
0.3.0.698-master 872 6/21/2017
0.3.0.697-master 880 6/21/2017
0.3.0.696-master 947 6/21/2017
0.3.0.695-master 904 6/21/2017
0.3.0.694-master 867 6/21/2017
0.3.0.693-master 893 6/20/2017
0.3.0.692-master 878 6/19/2017
0.3.0.691-master 914 6/19/2017
0.3.0.690-master 908 6/19/2017
0.3.0.689-master 889 5/14/2017
0.3.0.688 7,028 5/14/2017
0.3.0.686-master 901 5/14/2017
0.2.0.591-master 867 4/19/2017
0.2.0.565-master 864 4/11/2017
0.2.0.556-master 854 3/21/2017
0.2.0.551-master 920 3/17/2017
0.2.0.540-master 840 3/15/2017
0.2.0.536-master 844 3/14/2017
0.2.0.519-master 881 3/2/2017
0.2.0.516-master 860 3/2/2017
0.2.0.499-master 892 2/13/2017
0.2.0.494-master 864 2/7/2017
0.2.0.479-master 881 2/1/2017
0.2.0.463-master 881 1/17/2017
0.2.0.431-master 951 12/2/2016
0.2.0.422-master 1,239 11/9/2016
0.2.0.421-master 1,180 11/9/2016
0.2.0.411-master 924 10/26/2016
0.2.0.400-master 883 10/26/2016
0.2.0.394-master 888 10/25/2016
0.2.0.382-master 888 10/21/2016
0.2.0.377-master 874 10/20/2016
0.2.0.323-master 873 10/11/2016
0.2.0.262-master 908 9/29/2016
0.2.0.248-master 892 9/27/2016
0.2.0.174-master 887 9/16/2016
0.2.0.128-master 902 9/8/2016
0.2.0.122-master 899 9/8/2016
0.2.0.121-master 869 9/7/2016
0.2.0.111-master 880 9/7/2016
0.2.0.105-ci 922 9/5/2016
0.2.0.97-ci 935 8/30/2016
0.2.0.96-ci 892 8/29/2016
0.2.0.90-ci 896 8/25/2016
0.2.0.89-ci 854 8/24/2016
0.2.0.88-ci 894 8/24/2016
0.2.0.87-ci 894 8/24/2016
0.2.0.86-ci 889 8/23/2016
0.2.0.85-ci 895 8/22/2016
0.2.0.84-ci 905 8/22/2016
0.2.0.83-ci 918 8/22/2016
0.2.0.82 2,101 8/22/2016
0.2.0.81-ci 904 8/19/2016
0.2.0.80-ci 907 6/27/2016
0.2.0.79-ci 911 6/27/2016
0.2.0.77-ci 908 6/22/2016
0.2.0.76-ci 925 6/22/2016
0.2.0.75 1,575 6/15/2016
0.2.0.74-ci 1,260 6/15/2016
0.2.0.73 1,814 6/15/2016
0.2.0.72 1,806 6/15/2016
0.2.0.71 1,798 6/14/2016
0.2.0.70 1,687 6/9/2016
0.2.0.69 1,645 6/9/2016
0.2.0.68 1,480 6/9/2016
0.2.0.67 1,966 6/8/2016
0.2.0.66-ci 915 6/8/2016
0.2.0.65-ci 895 6/8/2016
0.2.0.64-ci 961 6/8/2016
0.2.0.63-ci 888 6/7/2016
0.2.0.62 1,461 6/7/2016
0.2.0.61 1,430 6/6/2016
0.2.0.60 1,451 6/6/2016
0.2.0.59 1,395 6/6/2016
0.2.0.57 1,468 6/3/2016
0.2.0.56 1,436 6/3/2016
0.2.0.55 1,517 6/3/2016
0.2.0.54 1,472 6/3/2016
0.2.0.53 1,788 6/3/2016
0.2.0.52-ci 886 6/2/2016
0.2.0.51-ci 913 6/2/2016
0.2.0.50-ci 903 6/2/2016
0.2.0.49 1,820 5/31/2016
0.2.0.48-ci 964 5/31/2016
0.2.0.46-ci 933 5/31/2016
0.2.0.45 1,666 5/31/2016
0.2.0.44 1,670 5/31/2016
0.2.0.43 1,637 5/31/2016
0.2.0.42 1,654 5/30/2016
0.2.0.41 1,644 5/30/2016
0.2.0.40 1,668 5/30/2016
0.2.0.39 1,695 5/30/2016
0.2.0.38 1,687 5/30/2016
0.2.0.37 1,614 5/30/2016
0.2.0.36 1,669 5/25/2016
0.2.0.35 1,667 5/24/2016
0.2.0.34 1,688 5/24/2016
0.2.0.33 2,484 5/24/2016
0.2.0.32-ci 888 5/24/2016
0.1.26-ci 924 5/24/2016
0.1.24-ci 916 5/24/2016
0.1.19-ci 892 5/24/2016