Use FURL to Retrieve Website Headers


It's important to know what headers your website and its files are communicating. For example, if your website is returning a 404 status, you're probably streaking toward your computer to fix the problem. Using the furl utility, you can retrieve website headers right from the command line.

The Shell Script

furl https://davidwalsh.name

Simple and quick, just like any good shell command.

The Sample Response

HTTP/1.1 200 OK
Date: Thu, 25 Jun 2009 01:50:50 GMT
Server: Apache/2.2.3 (CentOS)
X-Powered-By: PHP/5.2.6
X-Pingback: https://davidwalsh.name/xmlrpc.php
Cache-Control: max-age=1, private, must-revalidate
Expires: Thu, 25 Jun 2009 01:50:51 GMT
Vary: Accept-Encoding
Connection: close
Content-Type: text/html; charset=UTF-8

Don't have furl? Install it with MacPorts:

sudo port install furl

How is this useful? I would use this to periodically (via cron) check my website and make sure it's up; a rough sketch of that idea follows below. What would you use this for?
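For example, here is a minimal cron-driven sketch of that idea. It assumes furl prints the status line first (as in the sample response above); the URL, log path, alert address, and script path are placeholders to swap for your own, and the mail command is just one way to send an alert.

#!/bin/sh
# check-site.sh -- hypothetical uptime check built on furl's header output
URL="https://davidwalsh.name"

# Grab just the status line, e.g. "HTTP/1.1 200 OK"
STATUS=$(furl "$URL" | head -n 1)

case "$STATUS" in
  *" 200 "*)
    # Site answered with a 200; nothing to do
    ;;
  *)
    # Anything else: log it and fire off an alert email
    echo "$(date): $URL returned: $STATUS" >> "$HOME/site-check.log"
    echo "$URL returned: $STATUS" | mail -s "Site check failed" you@example.com
    ;;
esac

Then let cron run it every five minutes:

*/5 * * * * /path/to/check-site.sh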


Discussion

  1. I’d use it to retrieve the X-Pingback value and if it was included, I’d send a trackback. ;-)

  2. adamnfish

Or, if you don’t fancy installing furl for this, you can do the same with curl (a powerful and flexible utility for performing requests) using the -I flag:

e.g.
    curl -I http://davidwalsh.name

    (you probably have curl installed already)

To see both the headers and the full response, use the verbose flag:
    curl -v http://davidwalsh.name

  3. @adamnfish: Thanks for sharing that. On a side note, “adamnfish” sounds like a wacky morning FM radio show.

  4. Not sure where sources are but the Debian package is at http://bertorello.ns0.it/debian/furl/

  5. As already mentioned,

    curl -I HOSTNAME

has the same functionality without installing anything extra.

  6. curl -I is good. This is another suggestion…

lwp-request -ed "http://lindesk.com/"

  7. Marco

    another trick is:

    lynx -head http://davidwalsh.name

lynx is a text-based browser for Linux

  8. Dang! I should have read this sooner. I was itching to jump all over the “curl -I” suggestion. Everyone got here first!

  9. Rex

alias furl='curl -i -X HEAD'
