Use FURL to Retrieve Website Headers


It's important to know what headers your website and its files are communicating. For example, if your website is returning a 404 status, you're probably streaking toward your computer to fix the problem. Using the FURL library, you can retrieve website headers from the command line.

The Shell Script

furl https://davidwalsh.name

Simple and quick -- just like every good shell command.

The Sample Response

HTTP/1.1 200 OK
Date: Thu, 25 Jun 2009 01:50:50 GMT
Server: Apache/2.2.3 (CentOS)
X-Powered-By: PHP/5.2.6
X-Pingback: https://davidwalsh.name/xmlrpc.php
Cache-Control: max-age=1, private, must-revalidate
Expires: Thu, 25 Jun 2009 01:50:51 GMT
Vary: Accept-Encoding
Connection: close
Content-Type: text/html; charset=UTF-8

Don't have FURL? Install it by running:

sudo port install furl

How is this useful? I would use this to periodically (cron) check my website to make sure it's up. What would you use this for?
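If you want to try the cron idea yourself, here's a rough sketch of what the check script could look like. Treat it as a starting point: the URL, the email address, and the schedule are placeholders, and it assumes furl prints a status line like the one in the sample response above (curl -I would work just as well).

#!/bin/sh
# check-site.sh -- crude uptime check meant to be run from cron
URL="https://davidwalsh.name"

# furl prints "HTTP/1.1 200 OK" as its first line; grab the status code
STATUS=$(furl "$URL" | head -n 1 | awk '{print $2}')

if [ "$STATUS" != "200" ]; then
    # swap in whatever alert you prefer (mail, SMS gateway, etc.)
    echo "$(date): $URL returned status '$STATUS'" | mail -s "Site check failed" you@example.com
fi

And a crontab entry to run it every five minutes (adjust the path and schedule to taste):

*/5 * * * * /path/to/check-site.sh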

Discussion

  1. I’d use it to retrieve the X-Pingback value and if it was included, I’d send a trackback. ;-)

  2. adamnfish

Or, if you don’t fancy installing furl for this, you can do the same with curl (a powerful and flexible utility for performing requests) with the -I flag:

e.g.
    curl -I http://davidwalsh.name

    (you probably have curl installed already)

    to see the headers and the full response, use the verbose flag
    curl -v http://davidwalsh.name

  3. @adamnfish: Thanks for sharing that. On a side note, “adamnfish” sounds like a wacky morning FM radio show.

  4. Not sure where sources are but the Debian package is at http://bertorello.ns0.it/debian/furl/

  5. As already mentioned,

    curl -I HOSTNAME

has the same functionality but without installing something extra.

  6. curl -I is good. This is another suggestion…

lwp-request -ed "http://lindesk.com/"

  7. Marco

    another trick is:

    lynx -head http://davidwalsh.name

    lynx is a linux textual browser

  8. Dang! I should have read this sooner. I was itching to jump all over the “curl -I” suggestion. Everyone got here first!

  9. Rex

alias furl='curl -i -X HEAD'
