RobotsTxtCore 1.1.0

Robots.txt middleware with fluent interface.


RobotsTxtMiddleware

A robots.txt middleware for ASP.NET Core. Why is this needed? Because if you need to serve dynamic values (such as a URL configured in your CMS), you need code to generate the file, and this middleware makes that easy.

Installation

NuGet

PM> Install-Package RobotsTxtCore

.NET CLI

> dotnet add package RobotsTxtCore
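
PackageReference

For projects that support PackageReference, add this node to the project file:

<PackageReference Include="RobotsTxtCore" Version="1.1.0" />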

https://www.nuget.org/packages/RobotsTxtCore/

Usage

The fluent interface makes it easy to specify multiple rules.

    app.UseRobotsTxt(builder =>
        builder
            .AddSection(section => 
                section
                    .AddComment("Allow Googlebot")
                    .AddUserAgent("Googlebot")
                    .Allow("/")
                )
            .AddSection(section => 
                section
                    .AddComment("Disallow the rest")
                    .AddUserAgent("*")
                    .AddCrawlDelay(TimeSpan.FromSeconds(10))
                    .Disallow("/")
                )
            .AddSitemap("https://example.com/sitemap.xml")
    );

Output

# Allow Googlebot
User-agent: Googlebot
Allow: /

# Disallow the rest
User-agent: *
Disallow: /
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
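
Because the rules are built in code, dynamic values slot in naturally. Here is a minimal sketch, inside Startup.Configure, that reads the sitemap URL from configuration instead of hard-coding it; the "Seo:SitemapUrl" key is just an illustrative name, use whatever your CMS or settings expose:

    public void Configure(IApplicationBuilder app, IConfiguration config)
    {
        app.UseRobotsTxt(builder =>
            builder
                .AddSection(section =>
                    section
                        .AddUserAgent("*")
                        .Allow("/"))
                // Hypothetical configuration key; the value ends up in the Sitemap line
                .AddSitemap(config["Seo:SitemapUrl"]));
    }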

Or, if you just want to deny everyone:

    app.UseRobotsTxt(builder =>
        builder
            .DenyAll()
    );

Output

User-agent: *
Disallow: /
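
DenyAll pairs well with an environment check, so non-production deployments never get indexed. A sketch, assuming IWebHostEnvironment is injected into Configure (on older ASP.NET Core versions the equivalent is IHostingEnvironment):

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        if (env.IsProduction())
        {
            app.UseRobotsTxt(builder =>
                builder
                    .AddSection(section =>
                        section
                            .AddUserAgent("*")
                            .Allow("/"))
                    .AddSitemap("https://example.com/sitemap.xml"));
        }
        else
        {
            // Keep staging and dev sites out of search indexes
            app.UseRobotsTxt(builder => builder.DenyAll());
        }
    }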



Version History

Version        Downloads  Last updated
1.1.0          186        8/11/2019
1.0.1          2,030      2/27/2018
1.0.0          217        2/24/2018
1.0.0-beta-2   313        1/2/2018
1.0.0-beta-1   300        12/29/2017