robots.txt Rerouting on Development Servers
Every website should have a robots.txt file. Some bots hit sites so often that they degrade performance, while other bots simply aren't welcome at all. A robots.txt file can also communicate sitemap location and limit crawl rate. It's important that the correct robots.txt file is served on development servers, and that file is usually very different from your production robots.txt. Here's a quick .htaccess snippet you can use to make that happen:
RewriteEngine On
# If the host contains "devdomain", serve the dev robots file instead
RewriteCond %{HTTP_HOST} devdomain
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
The robots-go-away.txt file most likely directs robots not to index anything, unless you want your dev server to be indexed for some reason (hint: you really don't want this).
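For reference, a disallow-everything robots-go-away.txt needs only a couple of lines. A minimal sketch:

```
# robots-go-away.txt: tell all crawlers to stay out of the dev site
User-agent: *
Disallow: /
```

A production robots.txt, by contrast, might advertise a sitemap and suggest a crawl rate. The sitemap URL below is a placeholder, and note that not every crawler honors Crawl-delay:

```
# Production robots.txt: allow crawling and point bots at the sitemap
User-agent: *
Disallow:
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
```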
Here’s an example showing how to include multiple development domains:
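A sketch using regex alternation in the RewriteCond; the hostnames are placeholders for your own dev and staging domains, and the [NC] flag makes the match case-insensitive:

```
RewriteEngine On
# Match any of the listed development hosts
RewriteCond %{HTTP_HOST} (devdomain|stagingdomain|localhost) [NC]
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
```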