RobotsTxtCore 1.1.0

Robots.txt middleware with fluent interface.


RobotsTxtMiddleware

A Robots.txt middleware for ASP.NET Core. Why is this needed? If you need to include dynamic values (such as a URL configured in your CMS), you need code to generate the file, and this middleware makes that easy; see the configuration-driven sketch at the end of the Usage section.

Installation

NuGet

PM> Install-Package RobotsTxtCore

.NET CLI

> dotnet add package RobotsTxtCore

https://www.nuget.org/packages/RobotsTxtCore/

Usage

The fluent interface makes it easy to specify multiple rules.

    app.UseRobotsTxt(builder =>
        builder
            .AddSection(section => 
                section
                    .AddComment("Allow Googlebot")
                    .AddUserAgent("Googlebot")
                    .Allow("/")
                )
            .AddSection(section => 
                section
                    .AddComment("Disallow the rest")
                    .AddUserAgent("*")
                    .AddCrawlDelay(TimeSpan.FromSeconds(10))
                    .Disallow("/")
                )
            .AddSitemap("https://example.com/sitemap.xml")
    );

Output

# Allow Googlebot
User-agent: Googlebot
Allow: /

# Disallow the rest
User-agent: *
Disallow: /
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml

Or, if you just want to deny all crawlers:

    app.UseRobotsTxt(builder =>
        builder
            .DenyAll()
    );

Output

User-agent: *
Disallow: /

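As a sketch of the dynamic-values scenario from the introduction, anything you can compute in Configure can be fed into the builder. The example below is illustrative only: it assumes a hypothetical "Site:BaseUrl" configuration key (set from your CMS or appsettings.json, for instance); the UseRobotsTxt extension and builder calls are the same ones shown above.

    // Illustrative sketch: build robots.txt from a configured base URL.
    // "Site:BaseUrl" is a hypothetical key, e.g. "https://example.com".
    public void Configure(IApplicationBuilder app, IConfiguration configuration)
    {
        var baseUrl = configuration["Site:BaseUrl"];

        app.UseRobotsTxt(builder =>
            builder
                .AddSection(section =>
                    section
                        .AddUserAgent("*")
                        .Allow("/")
                    )
                .AddSitemap($"{baseUrl}/sitemap.xml")
        );
    }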

NuGet packages (2)

Showing the top 2 NuGet packages that depend on RobotsTxtCore:

Apprio.Enablement.Web.Api
Kros.CqrsTemplate

GitHub repositories

This package is not used by any popular GitHub repositories.

Version History

Version Downloads Last updated
1.1.0 58,817 8/11/2019
1.0.1 4,960 2/27/2018
1.0.0 516 2/24/2018
1.0.0-beta-2 624 1/2/2018
1.0.0-beta-1 603 12/29/2017