10 packages returned for Tags:"robots.txt"

A robots.txt parser for .NET. Supports:
- Allow directives.
- Crawl-delay directives.
- Sitemap declarations.
- * and $ wildcards.
See https://bitbucket.org/cagdas/robotstxt for usage examples; a minimal sketch also follows below.
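The sketch below shows how the parser might be called, assuming the Robots.Load and IsPathAllowed entry points from the project's examples; verify the exact API against the linked repository.

    using System;
    using RobotsTxt;

    // Parse a robots.txt document and query it. The content below is made up;
    // Robots.Load / IsPathAllowed follow the project's published examples.
    string content =
        "User-agent: *\n" +
        "Disallow: /private/\n" +
        "Allow: /private/public.html\n" +
        "Crawl-delay: 5\n" +
        "Sitemap: https://example.com/sitemap.xml\n";

    Robots robots = Robots.Load(content);

    Console.WriteLine(robots.IsPathAllowed("MyBot", "/private/secret.html")); // False: Disallow matches
    Console.WriteLine(robots.IsPathAllowed("MyBot", "/private/public.html")); // True: Allow overrides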
SimpleSitemap is a lightweight library that helps you create web sitemaps for collections or lists of items. These sitemaps follow the Sitemap Protocol: both sitemapindex and urlset links are generated, based on the size of the data collection and the chosen 'page size'. Examples of this could be a list of your users,...
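The sitemapindex/urlset split follows from the Sitemap Protocol itself, which caps a urlset at 50,000 URLs. The sketch below illustrates that paging idea with plain XDocument calls and placeholder URLs; it is not SimpleSitemap's own API.

    using System.Linq;
    using System.Xml.Linq;

    XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";
    string[] urls = Enumerable.Range(1, 120_000)
        .Select(i => $"https://example.com/users/{i}") // example.com is a placeholder
        .ToArray();
    const int pageSize = 50_000; // the protocol caps a urlset at 50,000 URLs

    // One <urlset> file per page of URLs.
    var pages = urls.Chunk(pageSize).ToArray();
    for (int p = 0; p < pages.Length; p++)
    {
        var urlset = new XDocument(new XElement(ns + "urlset",
            pages[p].Select(u => new XElement(ns + "url", new XElement(ns + "loc", u)))));
        urlset.Save($"sitemap-{p + 1}.xml");
    }

    // When the collection spans multiple pages, a <sitemapindex> points at each one.
    if (pages.Length > 1)
    {
        var index = new XDocument(new XElement(ns + "sitemapindex",
            Enumerable.Range(1, pages.Length).Select(p =>
                new XElement(ns + "sitemap",
                    new XElement(ns + "loc", $"https://example.com/sitemap-{p}.xml")))));
        index.Save("sitemap-index.xml");
    }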
- It adds a robots.txt and a robots_closed.txt to the root of the website.
- It adds a rewrite rule to the web.config that rewrites robots.txt to robots_closed.txt for all URLs ending with tamtam.nl (a sketch of such a rule is shown below).
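In web.config terms, a rewrite rule of that kind might look like the following; this is a sketch, and the rule name and exact patterns are assumptions rather than the package's actual output.

    <system.webServer>
      <rewrite>
        <rules>
          <rule name="ServeClosedRobots" stopProcessing="true">
            <match url="^robots\.txt$" />
            <conditions>
              <!-- Only applies when the host ends with tamtam.nl -->
              <add input="{HTTP_HOST}" pattern="tamtam\.nl$" />
            </conditions>
            <action type="Rewrite" url="robots_closed.txt" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>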
A small utility used to check a web app's robots.txt and make sure that it is valid. The check is based on https://www.nuget.org/packages/RobotsTxt; since that project is no longer maintained, I copied it for myself and introduced some improvements I consider useful.
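A minimal sketch of such a check, under the assumption that the utility fetches /robots.txt and runs it through the RobotsTxt parser; whether Robots.Load throws on malformed input is also an assumption.

    using System;
    using System.Net.Http;
    using RobotsTxt;

    // Fetch a site's robots.txt and try to parse it; treat a parse failure as invalid.
    using var http = new HttpClient();
    string content = await http.GetStringAsync("https://example.com/robots.txt"); // placeholder host

    try
    {
        Robots.Load(content);
        Console.WriteLine("robots.txt parsed successfully.");
    }
    catch (Exception ex)
    {
        Console.WriteLine($"robots.txt is invalid: {ex.Message}");
    }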
A simple middleware that creates a Sitemap.xml and/or Robots.txt for your search-engine-optimized ASP.NET web app. Simply add app.UseSitemap() or app.UseRobots(), or both. UseSitemap asks for a parameter if you would like it to parse the controllers; you can add the [NoSiteMap] attribute to any class or...
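Wiring it into a minimal ASP.NET Core app might look like the sketch below; app.UseRobots(), app.UseSitemap(), and [NoSiteMap] are named in the description above, but the flag passed to UseSitemap (whether to parse controllers) is a guess at the parameter's shape.

    using Microsoft.AspNetCore.Builder;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Extensions.DependencyInjection;

    var builder = WebApplication.CreateBuilder(args);
    builder.Services.AddControllersWithViews();
    var app = builder.Build();

    app.UseRobots();      // serves the generated Robots.txt
    app.UseSitemap(true); // serves the generated Sitemap.xml; true = parse controllers (assumed flag)

    app.MapDefaultControllerRoute();
    app.Run();

    // A controller excluded from the generated sitemap via the package's attribute:
    [NoSiteMap]
    public class AdminController : Controller
    {
        public IActionResult Index() => Content("Not listed in the sitemap.");
    }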