robots.txt Rerouting on Development Servers
Every website should have a robots.txt file. Some bots hit sites so often that they slow down performance; other bots simply aren't desirable. robots.txt files can also be used to communicate a sitemap location and to limit the request rate. It's important to serve the correct robots.txt file on development servers as well, and that file is usually quite different from your production robots.txt file. Here's a quick .htaccess snippet you can use to make that happen:
```
# Serve the "go away" robots file when the request comes from the dev host
RewriteEngine On
RewriteCond %{HTTP_HOST} devdomain
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
```
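Your production robots.txt, by contrast, is where the sitemap location and crawl-rate hints mentioned above belong. A minimal sketch might look like this (the sitemap URL is a placeholder, and `Crawl-delay` is a non-standard directive that only some crawlers honor):

```
# Advertise the sitemap location (standalone directive)
Sitemap: https://example.com/sitemap.xml

# Ask crawlers that honor Crawl-delay to wait 10 seconds between requests
User-agent: *
Crawl-delay: 10
```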
The `robots-go-away.txt` file most likely directs robots not to index anything, unless you want your dev server to be indexed for some reason (hint: you really don't want this).
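For reference, a "disallow everything" file is about as short as it gets; here's a sketch of what `robots-go-away.txt` might contain:

```
# Tell all well-behaved crawlers to stay away from the entire dev site
User-agent: *
Disallow: /
```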
Here’s an example showing how to include multiple development domains:
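A sketch of one approach, assuming a second placeholder host fragment called `stagingdomain`, chains `RewriteCond` directives with the `[OR]` flag:

```
# Match any of the development hosts (case-insensitive)
RewriteCond %{HTTP_HOST} devdomain [NC,OR]
RewriteCond %{HTTP_HOST} stagingdomain [NC]
# Serve the "go away" rules instead of the real robots.txt
RewriteRule ^robots\.txt$ robots-go-away.txt [L]
```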