robots.txt Rerouting on Development Servers


Every website should have a robots.txt file.  Some bots hit sites so often that they slow down performance, and other bots simply aren't desirable.  robots.txt files can also be used to communicate the sitemap location and to limit request rate.  It's important that the correct robots.txt file is served on development servers too, and that file is usually very different from your production robots.txt file.  Here's a quick .htaccess snippet you can use to make that happen:

# On the dev host, serve robots-go-away.txt instead of the real robots.txt
RewriteEngine On
RewriteCond %{HTTP_HOST} devdomain
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
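
For reference, the sitemap location and crawl-rate hints mentioned above belong in your normal production robots.txt.  A minimal sketch with hypothetical values (Crawl-delay is only honored by some crawlers):

# Allow everything, advertise the sitemap, and ask well-behaved bots to slow down
User-agent: *
Disallow:
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml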

The robots-go-away.txt file most likely directs robots not to index anything, unless you want your dev server to be indexed for some reason (hint:  you really don't want this).
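
A minimal robots-go-away.txt along those lines might look like this, blocking every crawler from the entire site:

# Dev server: keep all crawlers out
User-agent: *
Disallow: /

The filename only needs to match the target of the RewriteRule above, so use whatever name fits your setup.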


Discussion

  1. Here’s an example showing how to include multiple development domains:

    RewriteCond %{HTTP_HOST} ^localhost [OR]
    RewriteCond %{HTTP_HOST} ^example\.dev [OR]
    RewriteCond %{HTTP_HOST} ^test\.example\.com [OR]
    RewriteCond %{HTTP_HOST} ^staging\.example\.com
    RewriteRule ^robots\.txt$ robots-disallow.txt [L]
    
  2. Steve

    Use Vagrant.
