BatchExtensions 1.0.16
dotnet add package BatchExtensions --version 1.0.16
NuGet\Install-Package BatchExtensions -Version 1.0.16
<PackageReference Include="BatchExtensions" Version="1.0.16" />
paket add BatchExtensions --version 1.0.16
#r "nuget: BatchExtensions, 1.0.16"
// Install BatchExtensions as a Cake Addin
#addin nuget:?package=BatchExtensions&version=1.0.16

// Install BatchExtensions as a Cake Tool
#tool nuget:?package=BatchExtensions&version=1.0.16
Azure OpenAI Global Batch Extensions
The Azure OpenAI Batch API is designed to handle large-scale, high-volume processing tasks efficiently. It processes asynchronous groups of requests with separate quota, a 24-hour target turnaround, and 50% less cost than global standard. With batch processing, rather than sending one request at a time, you send a large number of requests in a single file. Global batch requests have a separate enqueued token quota, avoiding any disruption to your online workloads.
This is a C# wrapper on top of the REST endpoints.
Without a JSONL file
var batchProcessingService = new BatchProcessingService(resourceName, apiKey);
var userPrompts = new List<string>()
{
"When was Microsoft founded?",
"When was the first XBOX released?",
"Who is the CEO of Microsoft?",
"What is Altair Basic?"
};
var uploadResponse = await batchProcessingService.UploadFileAsync("gpt-4o-mini",
"You are an AI assistant that helps people find information.",
[.. userPrompts]);
var fileInputId = uploadResponse.Id;
Console.WriteLine($"File uploaded successfully with file input id: {fileInputId}");
// Create a batch job for the uploaded file, then list jobs to check status.
var jobResponse = await batchProcessingService.CreateBatchJobAsync(fileInputId);
Console.WriteLine($"Job created successfully with job id: {jobResponse.Id}");
var jobs = await batchProcessingService.ListBatchJobsAsync();
foreach (var job in jobs.Data)
{
Console.WriteLine($"Job Id: {job.Id}, Status: {job.Status}");
if (job.Status == "completed")
{
var responses = await batchProcessingService.DownloadBatchResponseAsync(job.OutputFileId);
foreach (var response in responses)
{
response.Response.Body.Choices.ForEach(choice =>
{
Console.WriteLine(choice.Message.Content);
});
Console.WriteLine();
}
}
}
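Batch jobs are asynchronous, so a single pass over the job list may find them still `validating` or `in_progress`. A simple polling loop built from the same `ListBatchJobsAsync` call could look like the sketch below (the set of terminal status names and the 5-minute delay are assumptions, not part of the library):

```csharp
// Statuses assumed terminal for an Azure OpenAI batch job.
var terminalStatuses = new HashSet<string> { "completed", "failed", "expired", "cancelled" };

while (true)
{
    var jobs = await batchProcessingService.ListBatchJobsAsync();
    if (jobs.Data.All(job => terminalStatuses.Contains(job.Status)))
        break;

    // Global batch targets a 24-hour turnaround, so poll infrequently.
    await Task.Delay(TimeSpan.FromMinutes(5));
}
```

Once the loop exits, completed jobs can be downloaded exactly as in the example above.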
With a JSONL file
var batchProcessingService = new BatchProcessingService(resourceName, apiKey);
var uploadResponse = await batchProcessingService.UploadFileAsync(@"C:\BatchInput\test.jsonl");
var fileInputId = uploadResponse.Id;
Console.WriteLine($"File uploaded successfully with file input id: {fileInputId}");
var uploadFileStatus = await batchProcessingService.GetUploadFileStatusAsync(fileInputId);
Console.WriteLine($"File upload status: {uploadFileStatus.Status}");
var jobResponse = await batchProcessingService.CreateBatchJobAsync(fileInputId);
Console.WriteLine($"Job created successfully with job id: {jobResponse.Id}");
var jobs = await batchProcessingService.ListBatchJobsAsync();
foreach (var job in jobs.Data)
{
Console.WriteLine($"Job Id: {job.Id}, Status: {job.Status}");
if (job.Status == "completed")
{
var output = await batchProcessingService.DownloadFileAsync(job.OutputFileId);
File.WriteAllBytes(@$"C:\BatchOutput\{job.Id}.jsonl", output);
}
}
For an example file, check out Prepare your batch file.
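The input is a JSON Lines file with one request per line. A minimal illustrative line for a chat-completions request is shown below; the field layout follows the Azure OpenAI Global Batch format, and the `custom_id` value and `gpt-4o-mini` deployment name are placeholders:

```jsonl
{"custom_id": "task-1", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "system", "content": "You are an AI assistant that helps people find information."}, {"role": "user", "content": "When was Microsoft founded?"}]}}
```

Each line's `custom_id` must be unique within the file so responses in the output file can be matched back to their requests.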
Product | Compatible versions | Additional computed target framework versions
---|---|---
.NET | net8.0 | net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows
Dependencies

net8.0
- No dependencies.
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.