Apache / Server Tutorials

  • How to Create and Manage CRON Jobs

    Interval or scheduled task execution is used all over computer science, a classic example being transaction batching.  For web developers like myself, the most obvious use case is executing CRON jobs for this blog, including polling for scheduled blog post publishing and a variety...
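
    The excerpt cuts off before any commands, but a minimal crontab sketch looks like the following (the script path and schedule are placeholders, not this blog's actual job):

        # open the current user's crontab for editing
        crontab -e

        # hypothetical entry: run a publishing script every 15 minutes
        */15 * * * * /usr/bin/php /var/www/site/publish-scheduled-posts.php

        # confirm the installed jobs
        crontab -l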

  • Limit Download Speed with Apache

    My adventures into retro gaming have brought me back into the semi-seedy world of piracy websites and the technology considerations that dictate their business model.  Annoying popups and pornographic advertisements aside, the most obvious technological observation I made was that each of these sites used bandwidth...
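
    The excerpt doesn't show which Apache module the article settles on; assuming mod_ratelimit (Apache 2.4+), a rough sketch in the server or vhost config would be:

        <IfModule mod_ratelimit.c>
            # throttle everything under /downloads to roughly 400 KB/s
            <Location "/downloads">
                SetOutputFilter RATE_LIMIT
                SetEnv rate-limit 400
            </Location>
        </IfModule>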

  • How to Make Email a Powerful Part of Your Web Application

    Giving your customers a way to access your application from their email account is a major way to boost their activity and engagement on your website. One of my favorite productivity tools, iDoneThis, gives me a simple way to record and share the...

  • Check GZip Encoding with curl

    Last week I detailed how I enabled gzip encoding on nginx servers, the same server software I use on this site.  Enabling gzip on your server dramatically improves the site load time, thus improving user experience and (hopefully) Google page ranks.  I implemented said strategy and used...
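
    A quick way to verify the encoding from the command line is a curl request that advertises gzip support and dumps the response headers (example.com stands in for your own domain):

        # fetch the page with gzip allowed, discard the body, print the headers
        curl -s -H "Accept-Encoding: gzip,deflate" -o /dev/null -D - https://example.com/

        # a gzip-enabled server should answer with:
        #   Content-Encoding: gzip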

  • Serve SVG as an Image on Apache with .htaccess

    I've been a massive fan of SVG since my days creating charts and animations with the Dojo Toolkit.  SVG has been around forever, even has IE support now, and is ultra-flexible.  When creating this site's redesign, I used SVG within an IMG tag and...
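
    The usual .htaccess fix is simply making sure Apache serves SVG with the correct MIME type; a minimal sketch:

        # serve SVG (and gzipped SVG) as image/svg+xml
        AddType image/svg+xml .svg .svgz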

  • robots.txt Rerouting on Development Servers

    Every website should have a robots.txt file.  Some bots hit sites so often that they slow down performance, while other bots simply aren't desirable.  robots.txt files can also be used to communicate sitemap location and limit request rate.  It's important that the correct robots.txt file is served...
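
    One common approach, assuming mod_rewrite and a hypothetical dev hostname, is to reroute robots.txt to a "disallow everything" file on development servers:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            # dev.example.com and robots_dev.txt are placeholder names
            RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
            RewriteRule ^robots\.txt$ robots_dev.txt [L]
        </IfModule>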

  • Serving Fonts from CDN

    For maximum performance, we all know we must put our assets on a CDN (another domain).  Along with those assets are custom web fonts.  Unfortunately, custom web fonts via CDN (or any cross-domain font request) don't work in Firefox or Internet Explorer (correctly so, by spec) though...
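
    The standard remedy is sending a CORS header from the host that serves the fonts; a rough .htaccess sketch (tighten the origin as needed):

        <IfModule mod_headers.c>
            <FilesMatch "\.(eot|ttf|otf|woff|woff2)$">
                # allow cross-domain font requests; replace * with your site's origin in production
                Header set Access-Control-Allow-Origin "*"
            </FilesMatch>
        </IfModule>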

  • Prepend and Append Files with .htaccess

    One of the lesser known and used capabilities of .htaccess files is the ability to prepend and append includes to every page request.  Doing so avoids needing to code <?php require('footer.php'); ?> in every template file you want to use them in.  Here's the .htaccess code: Now...
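
    The code block is cut from this excerpt; the usual way to do this for PHP under Apache is the auto_prepend_file / auto_append_file directives (the paths below are placeholders, and php_value requires PHP running as an Apache module):

        php_value auto_prepend_file "/var/www/site/header.php"
        php_value auto_append_file "/var/www/site/footer.php"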

  • Prevent Robot Indexing with Response Headers

    Every so often you have parts of your website that would be better off not indexed by search engines.  API calls, search result pages, PDF documents -- all examples of responses which may not have value outside of the current user.  Now we all know we...
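
    A typical header-based approach, sketched here for PDFs with mod_headers, is the X-Robots-Tag response header:

        <IfModule mod_headers.c>
            <FilesMatch "\.pdf$">
                # ask crawlers not to index this response or follow its links
                Header set X-Robots-Tag "noindex, nofollow"
            </FilesMatch>
        </IfModule>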

  • Create a Virtual Host in OSX

    As someone who had always developed on PCs, I found that switching over to Mac OS X was like going from peasant to prince.  Compared to Windows-based machines, Mac's OS X operating system is light years better.  One OS X feature I make much use of is the integrated...
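
    A bare-bones sketch of that setup (the hostname and paths are made up, and the Require line assumes Apache 2.4):

        # /etc/hosts -- point a local hostname at this machine
        127.0.0.1   myproject.local

        # httpd-vhosts.conf -- minimal virtual host for that hostname
        <VirtualHost *:80>
            ServerName myproject.local
            DocumentRoot "/Users/you/Sites/myproject"
            <Directory "/Users/you/Sites/myproject">
                Require all granted
            </Directory>
        </VirtualHost>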