Prevent Robot Indexing with Response Headers


Every so often you have parts of your website that would be better off not indexed by search engines. API calls, search result pages, PDF documents -- all are examples of responses that have no value outside of the current user. Now, we all know we can signal to search engines not to index a page using a META tag, but service calls and documents often don't get the luxury of a META tag. Luckily you can add a response header to prevent those responses from being indexed.
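For reference, the META tag version of that signal on a normal HTML page looks like:

<meta name="robots" content="noindex">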

The header is named X-Robots-Tag and should be easy to add using whichever server-side language you prefer. For example, adding this header with PHP may look like:

header('X-Robots-Tag: noindex');

If you're using a Django-based Python site, the code would look like:

response['X-Robots-Tag'] = 'noindex'
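If you're serving documents from a view, a minimal sketch might look like the following (the view and file names here are purely illustrative):

from django.http import FileResponse

def serve_report(request):
    # Stream the PDF back to the user...
    response = FileResponse(open("report.pdf", "rb"), content_type="application/pdf")
    # ...but ask search engines not to index it
    response['X-Robots-Tag'] = 'noindex'
    return response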

This header can also be set within your .htaccess or httpd configuration files:

<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex"
</Files>
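You can confirm the header is actually being sent by inspecting the response, for example with curl (the URL below is just a placeholder):

curl -I https://example.com/docs/report.pdf

...and looking for X-Robots-Tag: noindex among the returned headers.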

The truth is that there's no guarantee that something your server serves won't be indexed by a search engine, but small tweaks like this can improve your search engine standing and keep users from finding their way to "dead" parts of your site via search results.

Discussion

  1. Chris

    I have a big problem with spam registration on an ExpressionEngine site I help manage. Could this help? I have no development experience, fyi…

  2. This is a really old post, but for those who are using Nginx instead of Apache, you can do:

    location ~* \.(doc|pdf)$ {
        add_header  X-Robots-Tag "noindex, noarchive, nosnippet";
    }
    