Primo.ART.Linux 1.0.2

There is a newer version of this package available.
See the version list below for details.
.NET CLI:
dotnet add package Primo.ART.Linux --version 1.0.2

Package Manager:
NuGet\Install-Package Primo.ART.Linux -Version 1.0.2
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="Primo.ART.Linux" Version="1.0.2" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
paket add Primo.ART.Linux --version 1.0.2

Script & Interactive:
#r "nuget: Primo.ART.Linux, 1.0.2"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

Cake:
// Install Primo.ART.Linux as a Cake Addin
#addin nuget:?package=Primo.ART.Linux&version=1.0.2

// Install Primo.ART.Linux as a Cake Tool
#tool nuget:?package=Primo.ART.Linux&version=1.0.2

About

The Primo RPA robot uses the elements of this package to automate interaction with Primo AI Server, the intelligent data recognition server from Primo RPA.

How to Use

In Primo RPA Studio, create a project that will be executed by the Primo RPA robot. Install this package via the "Dependencies → Manage Dependencies" menu; the "AI" node with the package elements will then appear in the Elements tree.

The "Primo AI server" element is a container for all other package elements. This element is used to connect to the server and to authenticate.
In pure code projects use the element as follows:

//app - [Primo.AI.Server.PrimoAIApp] PrimoAIApp application
//server - Server: [String] Server URL
//login - Login: [String] User login
//password - Password: [String] User password
//spassword - Secure password: [SecureString] Secure password
//to - Time-out: [Int32] Server interactions time-out (ms)
Primo.AI.Server.PrimoAIApp app = Primo.AI.Server.PrimoAIApp.Init(wf, server, [to], [login], [spassword], [password]);
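
For illustration only, a connection in a pure code project might look like the sketch below. The server URL and credentials are placeholders, wf is assumed to be the workflow instance available in the project, and treating the bracketed arguments above as ordinary optional parameters is an assumption, not documented behaviour.

// Placeholder connection values; replace with your own server and credentials.
string server = "https://primo-ai.example.local";
string login = "robot01";
System.Security.SecureString spassword = new System.Security.SecureString();
foreach (char c in "s3cr3t") spassword.AppendChar(c);
int to = 30000; // 30-second time-out for server interactions

// wf is assumed to be the workflow instance provided by the studio.
Primo.AI.Server.PrimoAIApp app =
    Primo.AI.Server.PrimoAIApp.Init(wf, server, to, login, spassword);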

The "Smart OCR\Create request OCR" element sends a request for document recognition. In response, it receives the request key.
In pure code projects:

//app - [Primo.AI.Server.PrimoAIApp] PrimoAIApp application
//modelType - [String] Model type for document processing
//file - Document: [String] Path to document file
System.Guid requestKey = app.CreateInference(file, modelType);

The "Smart OCR\Get result OCR" element receives the processing result from the server.
In pure code:

//app - [Primo.AI.Server.PrimoAIApp] PrimoAIApp application
//requestKey - [System.Guid] Request key
Primo.AI.Server.Model.InferenceResult result = app.GetInference(requestKey);
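
Taken together, a minimal OCR round trip might look like the following sketch. The model type and file path are placeholders, and the fixed wait is an assumption: the package documentation does not state how long processing takes or how an unfinished request is reported, so in practice you may need to poll GetInference until the result is ready.

// Placeholder model type and document path.
string modelType = "invoice";
string file = @"/data/in/invoice_0001.pdf";

// Send the document for recognition and keep the request key.
System.Guid requestKey = app.CreateInference(file, modelType);

// Wait for the server to process the document (assumption: 5 s is enough here;
// poll GetInference if your documents take longer).
System.Threading.Thread.Sleep(5000);

Primo.AI.Server.Model.InferenceResult result = app.GetInference(requestKey);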

The "NLP\Create request NLP" element sends a request NLP. In response, it receives the request key.
In pure code projects:

//app - [Primo.AI.Server.PrimoAIApp] PrimoAIApp application
//modelType - Routing key: [String] Routing key for request
//prompt - Prompt: [String] Text to be processed
//responseLength - Response length: [Int32] Maximum length of model responses in tokens
//responseSchema - Response keys: [List<String>] Parameters specific to each NLP task
//temperature - Temperature: [Double] The lower the temperature, the less variability in generation
//minP - Min p: [Double] An additional parameter that defines the threshold in the range of token selection (0 is any, 1 is the most likely)
//contextFile - Context: [String] Path to context file (.json)
System.Guid requestKey = app.CreateRequestNlp(modelType, prompt, responseLength, [responseSchema], [temperature], [minP], [contextFile]);

The "NLP\Get result NLP" element receives the result of the request NLP from the server.
In pure code:

//app - [Primo.AI.Server.PrimoAIApp] PrimoAIApp application
//requestKey - [System.Guid] Request key
Primo.AI.Server.Model.NlpResult result = app.GetRequestNlp(requestKey);
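
In the same way, an NLP request can be issued and its result collected as in the sketch below. The routing key, prompt, response keys, and generation settings are placeholder values, and the fixed wait is again an assumption; poll GetRequestNlp if your requests take longer to complete.

// Placeholder routing key, prompt, and generation settings.
string modelType = "llm-extraction";
string prompt = "Extract the supplier name and the total amount from the attached contract.";
int responseLength = 256; // maximum response length in tokens
System.Collections.Generic.List<string> responseSchema =
    new System.Collections.Generic.List<string> { "supplier", "total_amount" };
double temperature = 0.2; // low temperature -> less variability in generation
double minP = 0.05;       // token-selection threshold (0 = any, 1 = most likely)

System.Guid requestKey = app.CreateRequestNlp(
    modelType, prompt, responseLength, responseSchema, temperature, minP);

// Wait for the model to finish (assumption), then fetch the result.
System.Threading.Thread.Sleep(5000);

Primo.AI.Server.Model.NlpResult result = app.GetRequestNlp(requestKey);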

Key Features

  • High performance
  • Supports .NET 8

Main Types

  • Primo.AI.Server.Elements.WFAttachPrimoAIServer
  • Primo.AI.Server.Elements.WFPrimoAICreateInference
  • Primo.AI.Server.Elements.WFPrimoAIGetInference
  • Primo.AI.Server.PrimoAIApp
  • Primo.AI.Server.Model.BoundingBox
  • Primo.AI.Server.Model.ImageTransforms
  • Primo.AI.Server.Model.RectangleItemModel
  • Primo.AI.Server.Model.InferenceResult
  • Primo.AI.Server.Model.InferenceResultContent
  • Primo.AI.Server.Model.InferenceResultFile
  • Primo.AI.Server.Model.InferenceResultItem
  • Primo.AI.Server.Model.InferenceResultItemRow
  • Primo.AI.Server.Model.NlpResult
  • Primo.AI.Server.Model.NlpResultContent

Feedback

Bug reports and contributions are welcome in the Primo RPA chat.

Compatible and additional computed target framework versions: .NET net8.0 is compatible; net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed.

This package has no dependencies.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version   Downloads   Last updated
1.0.3     42          12/18/2024
1.0.2     50          12/16/2024