robots.txt Rerouting on Development Servers
Every website should have a robots.txt file. Some bots hit sites so often that they hurt performance, while other bots simply aren't desirable. robots.txt files can also be used to communicate sitemap location and to limit request rate. It's important that the correct robots.txt file is served on development servers too, and that file is usually very different from your production robots.txt file. Here's a quick .htaccess snippet you can use to make that happen:
```apache
# When the request comes in on the development hostname,
# serve robots-go-away.txt in place of robots.txt
RewriteCond %{HTTP_HOST} devdomain
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
```
The robots-go-away.txt file most likely directs robots not to index anything, unless you want your dev server to be indexed for some reason (hint: you really don't want this).
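For reference, a minimal robots-go-away.txt that tells all well-behaved crawlers to stay out could look like the following; the production file, by contrast, is where you'd add things like Sitemap and Crawl-delay directives:

```
# robots-go-away.txt -- served in place of robots.txt on the dev server
# Block all compliant crawlers from indexing anything
User-agent: *
Disallow: /
```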
Here’s an example showing how to include multiple development domains:
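One way to sketch it, assuming placeholder hostnames like dev.example.com and staging.example.com: chain multiple RewriteCond directives with the [OR] flag so a match on any development hostname triggers the rewrite ([NC] makes the match case-insensitive).

```apache
# Serve robots-go-away.txt on any of the development hostnames
# (hostnames below are placeholders -- substitute your own)
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} ^staging\.example\.com$ [NC]
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
```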