Tensor 0.4.11

Tensor (n-dimensional array) library for F#

     Core features (see the example after this list):
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
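
     A minimal F# sketch of the features above, using API names from this version's
     documentation (HostTensor.init, Tensor.sum, the .* dot-product operator, * in
     slicing); it is illustrative rather than a verbatim excerpt:

        open Tensor

        // create a 7x5 matrix in host memory, filled by an index function
        let a = HostTensor.init [7L; 5L] (fun [|i; j|] -> 5.0 * float i + float j)
        let b = HostTensor.ones<float> [7L; 5L]

        // element-wise operations apply to whole tensors
        let c = a + b
        let d = abs a

        // reduction: sum of all elements
        let total = Tensor.sum a

        // slicing creates a view (no copy): the second row of a
        let row = a.[1L, *]

        // broadcasting: a 1x5 row is repeated over the 7 rows of a
        let e = a + Tensor.reshape [1L; 5L] (HostTensor.ones<float> [5L])

        // dot product via the .* operator: matrix-vector product
        let mv = a .* HostTensor.ones<float> [5L]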

     Data exchange (see the example after this list):
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
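
     A minimal sketch of HDF5 round-tripping and F# interop, assuming the
     HDF5.OpenWrite/OpenRead handles and the HostTensor.write, HostTensor.read,
     ofList and toList helpers described in the documentation:

        open Tensor

        // a small host tensor to store
        let a = HostTensor.init [3L; 4L] (fun [|i; j|] -> float (i * 4L + j))

        // write it to an HDF5 file under the name "a"
        do
            use hdf = HDF5.OpenWrite "sample.h5"
            HostTensor.write hdf "a" a

        // read it back into host memory
        let b =
            use hdf = HDF5.OpenRead "sample.h5"
            HostTensor.read<float> hdf "a"

        // interop with standard F# types
        let fromList = HostTensor.ofList [1.0; 2.0; 3.0]
        let asList = HostTensor.toList fromList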

     Performance (see the sketch after this list):
       - host: SIMD- and BLAS-accelerated operations
         - by default, Intel MKL is used (shipped with the NuGet package)
         - other BLAS libraries (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations
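
     A minimal sketch of host/GPU data movement, assuming the CudaTensor.transfer
     and HostTensor.transfer helpers from the documentation; running it requires a
     CUDA-capable GPU:

        open Tensor

        // build a tensor in host memory
        let h = HostTensor.init [1000L; 1000L] (fun [|i; j|] -> float (i + j))

        // transfer to the GPU; subsequent operations execute on the device
        let d = CudaTensor.transfer h

        // matrix product on the GPU is dispatched to cuBLAS
        let p = d .* d

        // bring the result back into host memory
        let r = HostTensor.transfer p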

     Requirements:
       - Linux, macOS, or Windows on x64
       - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Package Manager:
    Install-Package Tensor -Version 0.4.11

.NET CLI:
    dotnet add package Tensor --version 0.4.11

PackageReference (for projects that support it, copy this XML node into the project file):
    <PackageReference Include="Tensor" Version="0.4.11" />

Paket CLI:
    paket add Tensor --version 0.4.11


Version History

Version Downloads Last updated
0.4.11 582 5/8/2018
0.4.11-v0.4.11-215 160 5/8/2018
0.4.11-symtensor-core-242 162 11/15/2018
0.4.11-symtensor-core-241 118 11/15/2018
0.4.11-symtensor-core-240 130 11/15/2018
0.4.11-symtensor-core-239 121 11/15/2018
0.4.11-symtensor-core-238 130 11/15/2018
0.4.11-symtensor-core-237 160 11/15/2018
0.4.11-symtensor-core-236 121 11/14/2018
0.4.11-symtensor-core-235 126 11/14/2018
0.4.11-symtensor-core-234 117 11/14/2018
0.4.11-symtensor-core-231 150 11/9/2018
0.4.11-symtensor-core-230 139 11/9/2018
0.4.11-symtensor-core-229 121 11/8/2018
0.4.11-symtensor-core-228 120 11/8/2018
0.4.11-symtensor-core-227 158 10/30/2018
0.4.11-symtensor-core-226 156 10/30/2018
0.4.11-symtensor-core-225 126 10/30/2018
0.4.11-develop-216 211 5/8/2018
0.4.10-develop-213 201 5/8/2018
0.4.10-develop-212 205 5/7/2018
0.4.10-develop-211 213 5/7/2018
0.3.0.712-master 235 9/1/2017
0.3.0.711-master 224 9/1/2017
0.3.0.710-master 222 9/1/2017
0.3.0.709-master 209 8/31/2017
0.3.0.708-master 233 8/30/2017
0.3.0.707-master 231 8/30/2017
0.3.0.706-master 225 8/30/2017
0.3.0.701-master 264 6/26/2017
0.3.0.700-master 285 6/22/2017
0.3.0.699-master 251 6/22/2017
0.3.0.698-master 253 6/21/2017
0.3.0.697-master 253 6/21/2017
0.3.0.696-master 277 6/21/2017
0.3.0.695-master 254 6/21/2017
0.3.0.694-master 245 6/21/2017
0.3.0.693-master 255 6/20/2017
0.3.0.692-master 250 6/19/2017
0.3.0.691-master 267 6/19/2017
0.3.0.690-master 249 6/19/2017
0.3.0.689-master 260 5/14/2017
0.3.0.688 488 5/14/2017
0.3.0.686-master 258 5/14/2017
0.2.0.591-master 261 4/19/2017
0.2.0.565-master 264 4/11/2017
0.2.0.556-master 259 3/21/2017
0.2.0.551-master 313 3/17/2017
0.2.0.540-master 250 3/15/2017
0.2.0.536-master 250 3/14/2017
0.2.0.519-master 265 3/2/2017
0.2.0.516-master 253 3/2/2017
0.2.0.499-master 267 2/13/2017
0.2.0.494-master 256 2/7/2017
0.2.0.479-master 273 2/1/2017
0.2.0.463-master 273 1/17/2017
0.2.0.431-master 343 12/2/2016
0.2.0.422-master 289 11/9/2016
0.2.0.421-master 271 11/9/2016
0.2.0.411-master 326 10/26/2016
0.2.0.400-master 279 10/26/2016
0.2.0.394-master 297 10/25/2016
0.2.0.382-master 283 10/21/2016
0.2.0.377-master 274 10/20/2016
0.2.0.323-master 278 10/11/2016
0.2.0.262-master 282 9/29/2016
0.2.0.248-master 284 9/27/2016
0.2.0.174-master 297 9/16/2016
0.2.0.128-master 286 9/8/2016
0.2.0.122-master 296 9/8/2016
0.2.0.121-master 283 9/7/2016
0.2.0.111-master 275 9/7/2016
0.2.0.105-ci 323 9/5/2016
0.2.0.97-ci 305 8/30/2016
0.2.0.96-ci 284 8/29/2016
0.2.0.90-ci 291 8/25/2016
0.2.0.89-ci 276 8/24/2016
0.2.0.88-ci 287 8/24/2016
0.2.0.87-ci 285 8/24/2016
0.2.0.86-ci 281 8/23/2016
0.2.0.85-ci 288 8/22/2016
0.2.0.84-ci 293 8/22/2016
0.2.0.83-ci 294 8/22/2016
0.2.0.82 382 8/22/2016
0.2.0.81-ci 294 8/19/2016
0.2.0.80-ci 306 6/27/2016
0.2.0.79-ci 304 6/27/2016
0.2.0.77-ci 308 6/22/2016
0.2.0.76-ci 302 6/22/2016
0.2.0.75 315 6/15/2016
0.2.0.74-ci 303 6/15/2016
0.2.0.73 302 6/15/2016
0.2.0.72 314 6/15/2016
0.2.0.71 353 6/14/2016
0.2.0.70 314 6/9/2016
0.2.0.69 282 6/9/2016
0.2.0.68 309 6/9/2016
0.2.0.67 306 6/8/2016
0.2.0.66-ci 294 6/8/2016
0.2.0.65-ci 292 6/8/2016
0.2.0.64-ci 324 6/8/2016
0.2.0.63-ci 292 6/7/2016
0.2.0.62 308 6/7/2016
0.2.0.61 296 6/6/2016
0.2.0.60 285 6/6/2016
0.2.0.59 286 6/6/2016
0.2.0.57 305 6/3/2016
0.2.0.56 295 6/3/2016
0.2.0.55 328 6/3/2016
0.2.0.54 305 6/3/2016
0.2.0.53 289 6/3/2016
0.2.0.52-ci 293 6/2/2016
0.2.0.51-ci 295 6/2/2016
0.2.0.50-ci 300 6/2/2016
0.2.0.49 312 5/31/2016
0.2.0.48-ci 305 5/31/2016
0.2.0.46-ci 288 5/31/2016
0.2.0.45 299 5/31/2016
0.2.0.44 297 5/31/2016
0.2.0.43 303 5/31/2016
0.2.0.42 317 5/30/2016
0.2.0.41 310 5/30/2016
0.2.0.40 299 5/30/2016
0.2.0.39 310 5/30/2016
0.2.0.38 301 5/30/2016
0.2.0.37 295 5/30/2016
0.2.0.36 303 5/25/2016
0.2.0.35 312 5/24/2016
0.2.0.34 308 5/24/2016
0.2.0.33 305 5/24/2016
0.2.0.32-ci 288 5/24/2016
0.1.26-ci 304 5/24/2016
0.1.24-ci 295 5/24/2016
0.1.19-ci 290 5/24/2016