robots.txt Rerouting on Development Servers


Every website should have a robots.txt file. Some bots hit sites so often that they degrade performance, while other bots simply aren't welcome. robots.txt files can also be used to communicate sitemap location and limit request rate. It's important that development servers serve the correct robots.txt file, though, and that file is usually very different from your production robots.txt. Here's a quick .htaccess snippet you can use to make that happen:

RewriteCond %{HTTP_HOST} devdomain
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
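
For context, here's a sketch of how that rule might sit in a full .htaccess file, assuming mod_rewrite is enabled and "devdomain" stands in for your development hostname:

# Serve the "keep out" robots file when the request comes in on the dev hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} devdomain [NC]
RewriteRule ^robots\.txt$ robots-go-away.txt [L]

The [NC] flag makes the hostname match case-insensitive, and [L] stops rewrite processing once the rule matches.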

The robots-go-away.txt file should direct robots not to index anything, unless you want your dev server to be indexed for some reason (hint: you really don't want this).
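
The robots-go-away.txt file itself can be dead simple; assuming you want to block every well-behaved crawler, something like this does the job:

User-agent: *
Disallow: /

Your production robots.txt, by contrast, generally allows crawling and can advertise your sitemap; a minimal sketch (with example.com as a placeholder) might look like:

User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml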

Discussion

  1. Here’s an example showing how to include multiple development domains:

    RewriteCond %{HTTP_HOST} ^localhost [OR]
    RewriteCond %{HTTP_HOST} ^example\.dev [OR]
    RewriteCond %{HTTP_HOST} ^test\.example\.com [OR]
    RewriteCond %{HTTP_HOST} ^staging\.example\.com
    RewriteRule ^robots\.txt$ robots-disallow.txt [L]
    
  2. Steve

    use vagrant
