Prevent Robot Indexing with Response Headers

Every so often you have parts of your website that would be better off not indexed by search engines.  API calls, search result pages, PDF documents -- all are examples of responses that may not have value outside of the current user.  Now, we all know we can signal to search engines not to index a page using a META tag, but service calls and documents often don't get the luxury of a META tag.  Luckily you can add a response header to prevent these responses from being indexed.

The header is named X-Robots-Tag and should be easy to add using whichever server-side language you prefer.  For example, adding this header with PHP may look like:

header('X-Robots-Tag: noindex');

If you're running a Django-based Python site, the code would look like:

response['X-Robots-Tag'] = 'noindex'
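
If several URL patterns should never be indexed, you could also centralize this in a small piece of Django middleware so every matching response gets the header.  Here's a minimal sketch -- the /api/ and /search/ prefixes are hypothetical placeholders, so adjust them to the parts of your site you want kept out of the index:

# A minimal middleware sketch; the prefixes below are placeholder examples.
NOINDEX_PREFIXES = ('/api/', '/search/')

class XRobotsTagMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        # Tag only responses for paths we don't want search engines to index.
        if request.path.startswith(NOINDEX_PREFIXES):
            response['X-Robots-Tag'] = 'noindex'
        return response

You'd then reference the middleware class in your MIDDLEWARE setting so it wraps every request.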

This header can also be set within your .htaccess or httpd configuration files:

<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex"
</Files>

The truth is that there's no guarantee that something your server serves won't be indexed by a search engine, but small tweaks like this can help improve your search engine standing and keep users from finding their way to "dead" parts of your site via search results.
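
If you want to confirm the header is actually coming through, a quick check with Python's standard library works (the URL below is just a placeholder -- point it at one of your own resources):

# Rough sketch: print the X-Robots-Tag header a given URL responds with.
from urllib.request import urlopen

with urlopen('https://example.com/docs/report.pdf') as response:
    print(response.headers.get('X-Robots-Tag'))  # expect something like "noindex"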

Discussion

  1. Chris

    I have a big problem with spam registration on an ExpressionEngine site I help manage. Could this help? I have no development experience, fyi…

  2. This is a really old post, but for those who are using Nginx instead of Apache, you can do:

    location ~* \.(doc|pdf)$ {
        add_header  X-Robots-Tag "noindex, noarchive, nosnippet";
    }
    

Wrap your code in <pre class="{language}"></pre> tags, link to a GitHub gist, JSFiddle fiddle, or CodePen pen to embed!