Prevent Robot Indexing with Response Headers


Every so often you have parts of your website that would be better off not indexed by search engines.  API calls, search result pages, PDF documents -- all examples of responses that may not have value outside of the current user.  Now, we all know we can signal to search engines not to index a page using a META tag like <meta name="robots" content="noindex">, but oftentimes service calls and documents don't get the luxury of a META tag.  Luckily you can add a response header to prevent these responses from being indexed.

The header is named X-Robots-Tag and should be easy to add using the server-side language you prefer.  For example, adding this header with PHP may look like:

header('X-Robots-Tag: noindex');

If you're using a Django-based Python site, the code would look like:

response['X-Robots-Tag'] = 'noindex'
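
For a bit more context, here's a minimal sketch of a complete Django view with that line in place (the view name and payload are placeholders, not from the original post):

from django.http import HttpResponse

def api_status(request):
    # Build the response as usual...
    response = HttpResponse('{"status": "ok"}', content_type='application/json')
    # ...then add the header so search engines skip this endpoint
    response['X-Robots-Tag'] = 'noindex'
    return response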

This header can also be set within your .htaccess or httpd configuration files:

<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex"
</Files>
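
Note that the Header directive comes from Apache's mod_headers module, so that module needs to be enabled for the snippet above to work.  To sanity-check that the header is actually coming back, you could request one of the files and print the header -- a quick Python sketch (the URL is a placeholder, swap in one of your own documents):

from urllib.request import urlopen

response = urlopen('https://example.com/files/report.pdf')
# Should print "noindex" if the configuration took effect
print(response.headers.get('X-Robots-Tag'))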

The truth is that there's no guarantee that something your server serves won't be indexed by a search engine, but small tweaks like this can help improve your search engine standing and keep users from finding their way to "dead" parts of your site via search engines.

Discussion

  1. Chris

    I have a big problem with spam registration on an ExpressionEngine site I help manage. Could this help? I have no development experience, fyi…

  2. This is a really old post, but for those who are using Nginx instead of Apache, you can do:

    location ~* \.(doc|pdf)$ {
        add_header  X-Robots-Tag "noindex, noarchive, nosnippet";
    }
    
