Workleap.Extensions.Mongo.Ephemeral
1.6.1-preview.6
dotnet add package Workleap.Extensions.Mongo.Ephemeral --version 1.6.1-preview.6
NuGet\Install-Package Workleap.Extensions.Mongo.Ephemeral -Version 1.6.1-preview.6
<PackageReference Include="Workleap.Extensions.Mongo.Ephemeral" Version="1.6.1-preview.6" />
paket add Workleap.Extensions.Mongo.Ephemeral --version 1.6.1-preview.6
#r "nuget: Workleap.Extensions.Mongo.Ephemeral, 1.6.1-preview.6"
// Install Workleap.Extensions.Mongo.Ephemeral as a Cake Addin
#addin nuget:?package=Workleap.Extensions.Mongo.Ephemeral&version=1.6.1-preview.6&prerelease

// Install Workleap.Extensions.Mongo.Ephemeral as a Cake Tool
#tool nuget:?package=Workleap.Extensions.Mongo.Ephemeral&version=1.6.1-preview.6&prerelease
Workleap.Extensions.Mongo
Workleap.Extensions.Mongo is a convenient set of .NET libraries designed to enhance and streamline the MongoDB C# driver integration into your C# projects.
Value proposition and features overview
Integrating the MongoDB C# driver into your C# projects can often lead to several questions:
- What's the optimal way to configure a MongoDB client from my app configuration? Should I use appsettings.json, environment variables, custom option classes, plain C# code?
- How can I expose the MongoDB client to the rest of my code? Should I use dependency injection?
- What are the best practices for configuring the MongoDB client? How should MongoDB C# driver static settings be handled?
- If I want to support multiple MongoDB clusters and/or databases, won't that require a significant refactor of my existing code?
- What's the most effective way to manage my indexes? Should they be created manually, or within my C# code? How can I ensure that indexes are synchronized with my code?
- If I have multiple C# applications, how can I prevent MongoDB setup code duplication?
- How can I instrument my code, considering the MongoDB C# driver doesn't natively support OpenTelemetry?
- How can I execute integration tests in an isolated environment? Using a shared database requires cleanup, leading to unreliable test results.
Workleap.Extensions.Mongo was developed to address these challenges. We offer a straightforward, flexible, and standard approach to adding and configuring the MongoDB C# driver in your C# projects. Here is an overview of the features of Workleap.Extensions.Mongo:
- Support for dependency injection: We use Microsoft's modern dependency injection (IServiceCollection) to expose MongoDB's classes and interfaces.
- Standardized configuration: We leverage Microsoft's configuration, enabling easy configuration of MongoDB settings via diverse configuration providers of your choice. The options pattern simplifies overriding and extending any setting, including static MongoDB settings (custom serializers and convention packs).
- Support for multiple MongoDB clusters and/or databases: You won't need to refactor your entire codebase to support multiple MongoDB data sources - it's supported by default.
- Elimination of boilerplate and duplicated code: Remove redundant, copy-pasted MongoDB C# code from your codebase, enabling you to focus on actually utilizing the driver.
- Built-in instrumentation: We provide built-in support for OpenTelemetry instrumentation, adhering to OpenTelemetry's semantic conventions for MongoDB. Additionally, we offer an extra NuGet package for Application Insights .NET SDK support.
- Optional index management: Declare indexes in your C# code, and then use our C# API to automatically create and update indexes based on what's declared in your code. Built-in Roslyn analyzers will assist developers in considering new indexes.
- Optional async enumerables support: You can simplify your code by using our extension methods that employ IAsyncEnumerable rather than more verbose MongoDB cursors.
- Optional field-level encryption: Implement your own encrypt and decrypt methods, which can then automatically encrypt annotated MongoDB document fields for at-rest security.
- Optional ephemeral database for integration tests: Each of your integration test methods can have its own new MongoDB database operating locally.
Getting started
We offer three main NuGet packages:
Firstly, Workleap.Extensions.Mongo is the package that you'd ideally install in your startup project, where your main method resides. This is where you would incorporate our library into your dependency injection services and link your configuration to our option classes:
// Method 1: Directly set the options values with C# code
services.AddMongo(options =>
{
options.ConnectionString = "[...]";
options.DefaultDatabaseName = "marketing";
});
// Method 2: Bind the options values to a configuration section
services.AddMongo(configuration.GetRequiredSection("Mongo").Bind);
// Method 3: Lazily bind the options values to a configuration section
services.AddMongo();
services.AddOptions<MongoClientOptions>().Bind(configuration.GetRequiredSection("Mongo"));
// appsettings.json (or any other configuration source such as environment variables or Azure KeyVault)
{
"Mongo": {
"ConnectionString": "[...]",
"DefaultDatabaseName": "marketing"
}
}
// Method 4: Implement IConfigureNamedOptions<MongoClientOptions>:
// https://learn.microsoft.com/en-us/dotnet/core/extensions/options#use-di-services-to-configure-options
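A minimal sketch of what method 4 could look like (the ConfigureMongoClientOptions class name and the configuration keys are illustrative assumptions, not part of the library):
public sealed class ConfigureMongoClientOptions : IConfigureNamedOptions<MongoClientOptions>
{
    private readonly IConfiguration _configuration;

    public ConfigureMongoClientOptions(IConfiguration configuration)
    {
        this._configuration = configuration;
    }

    public void Configure(MongoClientOptions options) => this.Configure(Options.DefaultName, options);

    public void Configure(string name, MongoClientOptions options)
    {
        // Hypothetical configuration keys, adjust to your own configuration layout
        options.ConnectionString = this._configuration["Mongo:ConnectionString"];
        options.DefaultDatabaseName = this._configuration["Mongo:DefaultDatabaseName"];
    }
}

// Registered alongside AddMongo()
services.AddMongo();
services.ConfigureOptions<ConfigureMongoClientOptions>();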
The second NuGet package, Workleap.Extensions.Mongo.Abstractions, only provides abstractions and extension methods, aligning with the principles of Microsoft's extension libraries such as Microsoft.Extensions.Logging.Abstractions and Microsoft.Extensions.Configuration.Abstractions. You would typically install this package in your domain-specific .NET projects to avoid unnecessary NuGet dependencies. This package already includes the MongoDB C# driver, so there's no need to install it separately (unless you need a specific version).
// 1) Directly inject a collection bound to the default database
var people = serviceProvider.GetRequiredService<IMongoCollection<PersonDocument>>();
// 2) You can inject the default database
var people = serviceProvider.GetRequiredService<IMongoDatabase>().GetCollection<PersonDocument>();
// 3) You can inject the default client
var people = serviceProvider.GetRequiredService<IMongoClient>()
.GetDatabase("marketing").GetCollection<PersonDocument>();
// 4) Finally, you can inject a specific client for a specific registered MongoDB cluster
// More on that later in this document
var people = serviceProvider.GetRequiredService<IMongoClientProvider>()
.GetClient("mycluster").GetDatabase("marketing").GetCollection<PersonDocument>();
[MongoCollection("people")]
public class PersonDocument : IMongoDocument // IMongoDocument is an empty marker interface (required)
{
// [...]
}
The third NuGet package, Workleap.Extensions.Mongo.Ephemeral, is designed for your integration tests. It can be utilized whenever you require a real yet ephemeral MongoDB cluster with a single node replica set. Through dependency injection, each integration test method can have access to a unique and isolated database.
Adding and configuring MongoDB clients
When registering your dependency injection services, you can invoke services.AddMongo(...) as demonstrated in the previous section. This action registers MongoDB dependencies and the main MongoDB cluster by providing the connection string and default (primary) database name.
It is also possible to register multiple additional MongoDB clusters:
services.AddMongo(options => { /* [...] */ })
.AddNamedClient("anotherCluster", options => { /* [...] */ })
.AddNamedClient("andAnotherOne", options => { /* [...] */ });
// There are many ways to configure these named options
services.AddOptions<MongoClientOptions>("anotherCluster").Bind(configuration.GetRequiredSection("Mongo:AnotherCluster"));
services.AddOptions<MongoClientOptions>("andAnotherOne").Bind(configuration.GetRequiredSection("Mongo:AndAnotherOne"));
The MongoClientOptions option class further permits you to configure the MongoClientSettings for a cluster:
services.AddMongo(options =>
{
options.MongoClientSettingsConfigurator = settings =>
{
settings.ApplicationName = "myapp";
settings.ClusterConfigurator = cluster =>
{
// [...]
};
};
});
While MongoClientOptions is the option class for configuring a specific MongoDB cluster connection, MongoStaticOptions is available to customize MongoDB static options such as BSON serializers and convention packs for the entire application:
services.AddMongo().ConfigureStaticOptions(options =>
{
// There are built-in serializers and conventions registered, but you can remove or override them
// ⚠ Caution, these are objects that will live for the entire lifetime of the application (singleton) as the MongoDB C# driver
// uses static properties to configure its behavior and serialization
options.GuidRepresentationMode = GuidRepresentationMode.V2; // V3 is the default
options.BsonSerializers[typeof(Guid)] = new GuidSerializer(GuidRepresentation.Standard);
options.ConventionPacks.Add(new MyConventionPack());
});
You can also use different options patterns to configure static options:
services.AddOptions<MongoStaticOptions>().Configure(options => { /* [...] */ });
When using the AddMongo() method, multiple conventions are added automatically:
- IgnoreExtraElementsConvention(ignoreExtraElements: true)
- EnumRepresentationConvention(BsonType.String), so changing an enum member name is a breaking change
- DateTime and DateTimeOffset are serialized as DateTime instead of the default Ticks or (Ticks, Offset). In MongoDB, DateTime only supports precision up to the milliseconds. If you need more precision, you need to set the serializer at the property level (see the sketch below).
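Here is a hedged sketch of such a property-level override (the AuditDocument class and the Int64/ticks representation are illustrative assumptions, not part of the library):
[MongoCollection("audits")]
public class AuditDocument : IMongoDocument
{
    // Stored as a BSON date by the default convention (millisecond precision)
    public DateTime CreatedAt { get; set; }

    // Assumption: storing raw ticks as an Int64 keeps full .NET precision,
    // at the cost of losing the native BSON date representation
    [BsonRepresentation(BsonType.Int64)]
    public DateTime PreciseTimestamp { get; set; }
}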
Declaring and using MongoDB documents and collections
The process doesn't deviate much from the standard way of declaring and using MongoDB collections in C#. However, there are two additional steps:
- You must decorate the document class with the MongoCollectionAttribute to specify the collection name,
- The document class must implement the empty marker interface IMongoDocument for generic constraints purposes.
[MongoCollection("people")]
public class PersonDocument : IMongoDocument
{
// [...]
}
Refer back to the getting started section to learn how to resolve IMongoCollection<TDocument> from the dependency injection services.
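For instance, a consumer class can receive the collection directly through constructor injection (the PersonRepository name and the Id-based query are illustrative assumptions):
public class PersonRepository
{
    private readonly IMongoCollection<PersonDocument> _people;

    public PersonRepository(IMongoCollection<PersonDocument> people)
    {
        this._people = people;
    }

    public Task<PersonDocument> GetByIdAsync(string id, CancellationToken cancellationToken)
    {
        // Assumes PersonDocument exposes a string Id property, as shown later in this document
        return this._people.Find(x => x.Id == id).FirstOrDefaultAsync(cancellationToken);
    }
}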
We also provide IAsyncEnumerable<TDocument> extensions on IAsyncCursor<TDocument> and IAsyncCursorSource<TDocument>, eliminating the need to deal with cursors. For example:
var people = await collection.Find(FilterDefinition<PersonDocument>.Empty).ToAsyncEnumerable();
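The resulting sequence can then be consumed with await foreach, as in this short sketch (assuming people is an IAsyncEnumerable<PersonDocument>):
await foreach (var person in people)
{
    // Documents are streamed from the underlying cursor as they become available
    Console.WriteLine(person.Name);
}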
Logging and distributed tracing
Workleap.Extensions.Mongo supports modern logging with ILogger and log level filtering. MongoDB commands can be logged at the Debug level, optionally with their BSON content if you set MongoClientOptions.Telemetry.CaptureCommandText to true.
Distributed tracing with OpenTelemetry is also integrated. We follow the semantic conventions for MongoDB. You can simply observe activities (traces) originating from our Workleap.Extensions.Mongo assembly.
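For example, with the OpenTelemetry SDK you could subscribe to that source as follows (a sketch assuming the ActivitySource is named after the assembly; the console exporter is only for illustration):
services.AddOpenTelemetry().WithTracing(tracing => tracing
    .AddSource("Workleap.Extensions.Mongo") // assumed ActivitySource name
    .AddConsoleExporter());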
We also support distributed tracing with the Application Insights .NET SDK. To enable this feature, you need to install the additional Workleap.Extensions.Mongo.ApplicationInsights NuGet package, then call the .AddApplicationInsights() method on the builder object returned by services.AddMongo():
services.AddMongo().AddApplicationInsights();
By default, some commands such as isMaster, buildInfo, saslStart, etc., are ignored by our instrumentation. You can ignore additional commands, or stop ignoring some of these, by modifying the MongoClientOptions.Telemetry.DefaultIgnoredCommandNames collection.
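A minimal sketch, assuming DefaultIgnoredCommandNames is a mutable collection on the telemetry options passed to AddMongo:
services.AddMongo(options =>
{
    // Additionally ignore "ping" commands in traces and logs
    options.Telemetry.DefaultIgnoredCommandNames.Add("ping");

    // Start instrumenting "isMaster" commands again
    options.Telemetry.DefaultIgnoredCommandNames.Remove("isMaster");
});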
Index management
We provide a mechanism for you to declare your collection indexes and ensure they are applied to your database. To do this, declare your indexes by implementing a custom MongoIndexProvider<TDocument>:
[MongoCollection("people", IndexProviderType = typeof(PersonDocumentIndexes))]
public class PersonDocument : IMongoDocument
{
[BsonId]
[BsonRepresentation(BsonType.ObjectId)]
public string Id { get; set; }
public string Name { get; set; } = string.Empty;
}
public class PersonDocumentIndexes : MongoIndexProvider<PersonDocument>
{
public override IEnumerable<CreateIndexModel<PersonDocument>> CreateIndexModels()
{
// Index name is mandatory
yield return new CreateIndexModel<PersonDocument>(
Builders<PersonDocument>.IndexKeys.Combine().Ascending(x => x.Name),
new CreateIndexOptions { Name = "name" });
}
}
At this stage, nothing will happen. To actually create or update the index, you need to inject our IMongoIndexer service and then call one of its UpdateIndexesAsync() method overloads, for example:
var indexer = this.Services.GetRequiredService<IMongoIndexer>();
await indexer.UpdateIndexesAsync(new[] { typeof(PersonDocument) });
It is up to you to decide when and where to run the process of creating and updating the indexes. You could do it at the start of your application, in a separate application that runs in a continuous delivery pipeline, etc.
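For instance, here is a sketch of running the indexer once at application startup with a typical ASP.NET Core host (a deployment-pipeline console app would follow the same pattern; the builder variable is assumed to be a WebApplicationBuilder):
var app = builder.Build();

using (var scope = app.Services.CreateScope())
{
    // Create or update all indexes declared in the assembly containing PersonDocument
    var indexer = scope.ServiceProvider.GetRequiredService<IMongoIndexer>();
    await indexer.UpdateIndexesAsync(typeof(PersonDocument).Assembly);
}

app.Run();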
Our indexation engine handles:
- Discovering the CreateIndexModel<TDocument> declared in your code using reflection.
- Computing a unique index name based on the model (we append a hash to the provided index name, for example, name_512cbbb935626e2b4b7c44972597c4a8).
- Discovering existing indexes in the MongoDB collection.
- Comparing the names and hashes of both sides to determine if:
  - We need to create a missing index.
  - We need to drop and recreate an updated index (the hashes don't match).
  - We need to drop an index that is no longer declared in your code.
- Leaving any other index intact. We only manage the indexes that have a name ending with a generated hash.
- Handling distributed race conditions. If many instances of an application call indexer.UpdateIndexesAsync() at the same time, only one will actually succeed (we use a distributed lock).
Note:
We do not recommend running multiple UpdateIndexesAsync tasks at the same time, given that only one process can update indexes at a time through the use of a distributed lock.
For example, in the code below, once one task has acquired the distributed lock, the other will wait until the lock is released before acquiring it and running the index update process.
In the end, you're not saving any time by running multiple tasks at the same time.
// ⚠️ Updating indexes in parallel is not recommended
var indexer = this.Services.GetRequiredService<IMongoIndexer>();
await Task.WhenAll(
indexer.UpdateIndexesAsync(AssemblyHandle.Assembly),
indexer.UpdateIndexesAsync(new[] { typeof(PersonDocument) })
);
Ideally, if all your indexes are in the same assembly, then you only have to call UpdateIndexesAsync once.
But if you really do need to call it multiple times, then the code above should be rewritten as follows:
var indexer = this.Services.GetRequiredService<IMongoIndexer>();
await indexer.UpdateIndexesAsync(AssemblyHandle.Assembly);
await indexer.UpdateIndexesAsync(new[] { typeof(PersonDocument) });
We include a Roslyn analyzer, detailed in a section below, that encourages developers to adorn classes that consume MongoDB collections with attributes (IndexByAttribute or NoIndexNeededAttribute). The aim is to increase awareness about which indexes should be used (or created) when querying MongoDB collections.
Support for inheritance
The indexer mechanism supports document inheritance and different index providers for the same collection. For example:
[BsonKnownTypes(typeof(DogPersonDocument), typeof(CatPersonDocument))]
[MongoCollection("people", IndexProviderType = typeof(PersonDocumentIndexes))]
public class PersonDocument : IMongoDocument
{
[BsonId]
[BsonRepresentation(BsonType.ObjectId)]
public string Id { get; set; }
public string Name { get; set; } = string.Empty;
}
public class PersonDocumentIndexes : MongoIndexProvider<PersonDocument>
{
public override IEnumerable<CreateIndexModel<PersonDocument>> CreateIndexModels()
{
yield return new CreateIndexModel<PersonDocument>(
Builders<PersonDocument>.IndexKeys.Combine().Ascending(x => x.Name),
new CreateIndexOptions { Name = "name" });
}
}
// No special indexer for this class
[BsonDiscriminator("Dog")]
public class DogPersonDocument : PersonDocument
{
public int DogCount { get; set; }
}
// Need to redefine MongoCollectionAttribute to use a different indexer
[BsonDiscriminator("Cat")]
[MongoCollection("people", IndexProviderType = typeof(CatDocumentIndexes))]
public class CatPersonDocument : PersonDocument
{
public int CatCount { get; set; }
}
public class CatDocumentIndexes : MongoIndexProvider<CatPersonDocument>
{
    public override IEnumerable<CreateIndexModel<CatPersonDocument>> CreateIndexModels()
    {
        yield return new CreateIndexModel<CatPersonDocument>(
            Builders<CatPersonDocument>.IndexKeys.Combine().Ascending(x => x.CatCount),
            new CreateIndexOptions { Name = "cat_count" });
    }
}
Field encryption
The Workleap.Extensions.Mongo library supports field-level encryption at rest, which means you can specify in your C# code which document fields should be encrypted in your MongoDB database. Any C# property can be encrypted, as long as you provide the logic for encrypting and decrypting the data. These properties then become binary data in your documents.
To enable field-level encryption, simply decorate the sensitive properties with the [SensitiveInformationAttribute]:
[MongoCollection("people")]
public class PersonDocument : IMongoDocument
{
// [...]
[SensitiveInformation(SensitivityScope.User)] // Other values are "Tenant" and "Application"
public string Address { get; set; } = string.Empty;
}
Next, create a class that implements IMongoValueEncryptor:
public class MyMongoValueEncryptor : IMongoValueEncryptor
{
public byte[] Encrypt(byte[] bytes, SensitivityScope sensitivityScope)
{
// return protected bytes using the method of your choice
}
public byte[] Decrypt(byte[] bytes, SensitivityScope sensitivityScope)
{
// return unprotected bytes using the method of your choice
}
}
Finally, register this class in the dependency injection services:
// This ends up registered using the singleton service lifetime
services.AddMongo().AddEncryptor<MyMongoValueEncryptor>();
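As one possible implementation, here is a sketch based on ASP.NET Core Data Protection (purely illustrative; the DataProtectionMongoValueEncryptor name is not part of the library, and it assumes services.AddDataProtection() has been registered):
public sealed class DataProtectionMongoValueEncryptor : IMongoValueEncryptor
{
    private readonly IDataProtectionProvider _dataProtectionProvider;

    public DataProtectionMongoValueEncryptor(IDataProtectionProvider dataProtectionProvider)
    {
        this._dataProtectionProvider = dataProtectionProvider;
    }

    public byte[] Encrypt(byte[] bytes, SensitivityScope sensitivityScope)
    {
        // One protector per sensitivity scope so that protected payloads are segregated by purpose
        return this._dataProtectionProvider.CreateProtector(sensitivityScope.ToString()).Protect(bytes);
    }

    public byte[] Decrypt(byte[] bytes, SensitivityScope sensitivityScope)
    {
        return this._dataProtectionProvider.CreateProtector(sensitivityScope.ToString()).Unprotect(bytes);
    }
}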
Keep in mind that encrypted values become binary data, which can make querying them more difficult. You'll need to take this into account when designing your database schema and queries.
Ephemeral MongoDB databases for integration tests
When creating integration tests, instead of using a shared MongoDB database for all your tests, you could assign a brand new ephemeral database to each individual test method. This approach reduces test flakiness, prevents the state of one test from impacting others, and removes the need for manual or automatic cleanup.
This is what our NuGet package Workleap.Extensions.Mongo.Ephemeral does when you invoke its UseEphemeralRealServer() method:
services.AddMongo().UseEphemeralRealServer();
When this method is called, each time a database or collection is requested within the scope of an individual IServiceProvider:
- A MongoDB server starts (we use EphemeralMongo),
- A randomly named database is provided to your code.
When you dispose of the IServiceProvider, the related resources are destroyed. We leverage internal caching to avoid running multiple instances of MongoDB servers concurrently, opting instead to reuse a single instance. This method allows you to run multiple concurrent tests, each with their own MongoDB database. If your test runner crashes, the MongoDB process will be terminated, preventing orphaned processes from consuming unnecessary resources.
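An isolated integration test could then look like the following sketch (the xUnit usage, test names, and option values are illustrative assumptions, not prescribed by the library):
public sealed class PeopleTests : IDisposable
{
    private readonly ServiceProvider _serviceProvider;

    public PeopleTests()
    {
        var services = new ServiceCollection();
        services.AddMongo(options => options.DefaultDatabaseName = "test").UseEphemeralRealServer();
        this._serviceProvider = services.BuildServiceProvider();
    }

    [Fact]
    public async Task Can_Insert_And_Read_A_Person()
    {
        // This collection lives in a database that is unique to this test instance
        var people = this._serviceProvider.GetRequiredService<IMongoCollection<PersonDocument>>();

        await people.InsertOneAsync(new PersonDocument { Name = "Alice" });
        var count = await people.CountDocumentsAsync(FilterDefinition<PersonDocument>.Empty);

        Assert.Equal(1, count);
    }

    public void Dispose()
    {
        // Disposing the service provider tears down the ephemeral database
        this._serviceProvider.Dispose();
    }
}
Since xUnit creates a new instance of the test class for each test method, every [Fact] gets its own service provider and therefore its own database.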
Included Roslyn analyzers
Rule ID | Category | Severity | Description |
---|---|---|---|
GMNG01 | Design | Warning | Add 'IndexBy' or 'NoIndexNeeded' attributes on the containing type |
To modify the severity of one of these diagnostic rules, you can use a .editorconfig file. For example:
## Disable analyzer for test files
[**Tests*/**.cs]
dotnet_diagnostic.GMNG01.severity = none
To learn more about configuring or suppressing code analysis warnings, refer to this documentation.
License
Copyright © 2023, Workleap. This code is licensed under the Apache License, Version 2.0. You may obtain a copy of this license at https://github.com/gsoft-inc/gsoft-license/blob/master/LICENSE.
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 is compatible. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETFramework 4.7.2
- EphemeralMongo6 (>= 1.1.3)
- System.Memory (>= 4.5.5)
- Workleap.Extensions.Mongo (>= 1.6.1-preview.6)

.NETStandard 2.0
- EphemeralMongo6 (>= 1.1.3)
- System.Memory (>= 4.5.5)
- Workleap.Extensions.Mongo (>= 1.6.1-preview.6)

net6.0
- EphemeralMongo6 (>= 1.1.3)
- Workleap.Extensions.Mongo (>= 1.6.1-preview.6)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
1.11.2-preview.12 | 39 | 11/29/2024 |
1.11.2-preview.11 | 36 | 11/29/2024 |
1.11.2-preview.10 | 39 | 11/24/2024 |
1.11.2-preview.9 | 40 | 10/30/2024 |
1.11.2-preview.8 | 40 | 10/28/2024 |
1.11.2-preview.7 | 40 | 10/28/2024 |
1.11.2-preview.6 | 42 | 10/28/2024 |
1.11.2-preview.5 | 44 | 10/16/2024 |
1.11.2-preview.4 | 40 | 10/7/2024 |
1.11.2-preview.3 | 48 | 10/1/2024 |
1.11.2-preview.2 | 45 | 9/30/2024 |
1.11.2-preview.1 | 53 | 9/27/2024 |
1.11.1 | 2,625 | 9/20/2024 |
1.11.1-preview.2 | 41 | 9/20/2024 |
1.11.1-preview.1 | 39 | 9/20/2024 |
1.11.0 | 887 | 8/28/2024 |
1.10.1-preview.17 | 65 | 8/27/2024 |
1.10.1-preview.16 | 48 | 8/27/2024 |
1.10.1-preview.15 | 46 | 8/27/2024 |
1.10.1-preview.14 | 74 | 8/16/2024 |
1.10.1-preview.13 | 50 | 8/1/2024 |
1.10.1-preview.12 | 44 | 8/1/2024 |
1.10.1-preview.11 | 46 | 8/1/2024 |
1.10.1-preview.10 | 39 | 7/31/2024 |
1.10.1-preview.9 | 41 | 7/31/2024 |
1.10.1-preview.8 | 44 | 7/30/2024 |
1.10.1-preview.7 | 48 | 7/30/2024 |
1.10.1-preview.6 | 51 | 7/30/2024 |
1.10.1-preview.5 | 53 | 7/29/2024 |
1.10.1-preview.4 | 53 | 7/15/2024 |
1.10.1-preview.3 | 47 | 7/15/2024 |
1.10.1-preview.1 | 49 | 6/25/2024 |
1.10.0 | 1,366 | 6/20/2024 |
1.9.3-preview.6 | 57 | 6/20/2024 |
1.9.3-preview.5 | 55 | 6/17/2024 |
1.9.3-preview.4 | 55 | 6/10/2024 |
1.9.3-preview.3 | 59 | 5/31/2024 |
1.9.3-preview.2 | 54 | 5/31/2024 |
1.9.3-preview.1 | 60 | 5/29/2024 |
1.9.2 | 626 | 5/27/2024 |
1.9.2-preview.2 | 55 | 5/27/2024 |
1.9.1 | 139 | 5/16/2024 |
1.9.1-preview.2 | 63 | 5/16/2024 |
1.9.1-preview.1 | 60 | 5/10/2024 |
1.9.0 | 2,735 | 5/7/2024 |
1.8.1-preview.1 | 74 | 5/7/2024 |
1.8.0 | 290 | 5/6/2024 |
1.7.1-preview.1 | 73 | 5/6/2024 |
1.7.0 | 105 | 4/29/2024 |
1.6.1-preview.10 | 52 | 4/29/2024 |
1.6.1-preview.9 | 65 | 4/22/2024 |
1.6.1-preview.8 | 73 | 4/16/2024 |
1.6.1-preview.7 | 62 | 4/15/2024 |
1.6.1-preview.6 | 69 | 4/12/2024 |
1.6.1-preview.5 | 59 | 4/11/2024 |
1.6.1-preview.4 | 66 | 4/3/2024 |
1.6.1-preview.3 | 63 | 3/8/2024 |
1.6.1-preview.2 | 66 | 3/7/2024 |
1.6.1-preview.1 | 69 | 3/4/2024 |
1.6.0 | 258 | 2/29/2024 |
1.5.3-preview.8 | 65 | 2/29/2024 |
1.5.3-preview.7 | 55 | 2/19/2024 |
1.5.3-preview.5 | 59 | 2/16/2024 |
1.5.3-preview.4 | 76 | 2/12/2024 |
1.5.3-preview.3 | 70 | 2/7/2024 |
1.5.3-preview.2 | 73 | 2/6/2024 |
1.5.3-preview.1 | 64 | 2/5/2024 |
1.5.2 | 376 | 1/29/2024 |
1.5.2-preview.7 | 59 | 1/29/2024 |
1.5.2-preview.6 | 56 | 1/26/2024 |
1.5.2-preview.5 | 61 | 1/24/2024 |
1.5.2-preview.4 | 57 | 1/22/2024 |
1.5.2-preview.3 | 64 | 1/12/2024 |
1.5.2-preview.2 | 61 | 1/12/2024 |
1.5.2-preview.1 | 68 | 12/18/2023 |
1.5.1 | 923 | 12/5/2023 |
1.5.1-preview.4 | 82 | 12/4/2023 |
1.5.1-preview.3 | 85 | 11/26/2023 |
1.5.1-preview.2 | 73 | 11/21/2023 |
1.5.1-preview.1 | 68 | 11/17/2023 |
1.5.0 | 209 | 11/16/2023 |
1.4.1-preview.9 | 70 | 11/15/2023 |
1.4.1-preview.8 | 66 | 11/13/2023 |
1.4.1-preview.7 | 69 | 11/7/2023 |
1.4.1-preview.6 | 75 | 10/30/2023 |
1.4.1-preview.5 | 79 | 10/13/2023 |
1.4.1-preview.4 | 81 | 10/13/2023 |
1.4.1-preview.3 | 78 | 10/6/2023 |
1.4.1-preview.2 | 81 | 9/27/2023 |
1.4.1-preview.1 | 84 | 9/26/2023 |
1.4.0 | 365 | 9/25/2023 |
1.3.3-preview.6 | 77 | 9/25/2023 |
1.3.3-preview.5 | 78 | 9/25/2023 |
1.3.3-preview.4 | 85 | 9/21/2023 |
1.3.3-preview.3 | 74 | 9/20/2023 |
1.3.3-preview.2 | 78 | 9/20/2023 |
1.3.3-preview.1 | 86 | 9/11/2023 |
1.3.2 | 626 | 9/6/2023 |
1.3.2-preview.1 | 61 | 9/6/2023 |
1.3.1 | 170 | 7/20/2023 |
1.3.1-preview.4 | 94 | 7/20/2023 |
1.3.1-preview.3 | 75 | 7/20/2023 |