Use FURL to Retrieve Website Headers

It's important to know what headers your website and its files are communicating. For example, if your website is returning a 404 status, you're probably streaking toward your computer to fix the problem. Using the furl utility, you can retrieve website headers from the command line.

The Shell Script

furl https://davidwalsh.name

Simple and quick -- just like every good shell command.

The Sample Response

HTTP/1.1 200 OK
Date: Thu, 25 Jun 2009 01:50:50 GMT
Server: Apache/2.2.3 (CentOS)
X-Powered-By: PHP/5.2.6
X-Pingback: https://davidwalsh.name/xmlrpc.php
Cache-Control: max-age=1, private, must-revalidate
Expires: Thu, 25 Jun 2009 01:50:51 GMT
Vary: Accept-Encoding
Connection: close
Content-Type: text/html; charset=UTF-8

Don't have furl? Install it via MacPorts by running:

sudo port install furl

How is this useful? I would use this to periodically (cron) check my website to make sure it was up. What would you use this for?
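As a sketch of that cron idea, here's a minimal uptime-check script built on `curl -I` (the URL and warning message are placeholders; adapt them to your own site and alerting setup):

```shell
#!/bin/sh
# Minimal uptime check, suitable for running from cron.
# Fetches only the response headers and warns when the status isn't 200.

URL="https://davidwalsh.name"

# Pull the status code out of the first header line, e.g. "HTTP/1.1 200 OK"
status_of() {
  printf '%s\n' "$1" | awk 'NR==1 {print $2}'
}

check_site() {
  headers=$(curl -sI "$1") || { echo "WARNING: $1 unreachable"; return 1; }
  code=$(status_of "$headers")
  if [ "$code" != "200" ]; then
    echo "WARNING: $1 returned status $code"
    return 1
  fi
  echo "OK: $1 is up"
}

# Uncomment to run the check when invoked from crontab:
# check_site "$URL"
```

A crontab entry like `*/5 * * * * /path/to/check.sh` would run it every five minutes; cron mails you any output by default, so you could also silence the "OK" line and only print on failure.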

Discussion

  1. I’d use it to retrieve the X-Pingback value and if it was included, I’d send a trackback. ;-)

  2. adamnfish

Or, if you don’t fancy installing furl for this, you can do the same with curl (a powerful and flexible utility for performing requests) with the -I flag:

    eg.
    curl -I http://davidwalsh.name

    (you probably have curl installed already)

    to see the headers and the full response, use the verbose flag
    curl -v http://davidwalsh.name

  3. @adamnfish: Thanks for sharing that. On a side note, “adamnfish” sounds like a wacky morning FM radio show.

  4. Not sure where sources are but the Debian package is at http://bertorello.ns0.it/debian/furl/

  5. As already mentioned,

    curl -I HOSTNAME

    has the same functionality but without installing anything extra.

  6. curl -I is good. This is another suggestion…

    lwp-request -ed "http://lindesk.com/"

  7. Marco

    another trick is:

    lynx -head http://davidwalsh.name

    lynx is a text-based browser for the Linux terminal

  8. Dang! I should have read this sooner. I was itching to jump all over the “curl -I” suggestion. Everyone got here first!

  9. Rex

    alias furl='curl -i -X HEAD'
