Use FURL to Retrieve Website Headers
It's important to know what headers your website and its files are communicating. For example, if your website is returning a 404 status, you're probably sprinting toward your computer to fix the problem. Using the furl utility, you can retrieve website headers right from the command line.
The Shell Script
furl https://davidwalsh.name
Simple and quick -- just like any good shell command.
The Sample Response
HTTP/1.1 200 OK
Date: Thu, 25 Jun 2009 01:50:50 GMT
Server: Apache/2.2.3 (CentOS)
X-Powered-By: PHP/5.2.6
X-Pingback: https://davidwalsh.name/xmlrpc.php
Cache-Control: max-age=1, private, must-revalidate
Expires: Thu, 25 Jun 2009 01:50:51 GMT
Vary: Accept-Encoding
Connection: close
Content-Type: text/html; charset=UTF-8
Don't have furl? Install it by running:
sudo port install furl
How is this useful? I would use this to periodically (cron) check my website to make sure it was up. What would you use this for?
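Here's a rough sketch of that cron idea (untested, and using curl -I style status checking since curl is usually already installed -- the URL and email address are just placeholders):

#!/bin/sh
# check-site.sh -- drop into cron, e.g. */5 * * * * /path/to/check-site.sh
URL="https://davidwalsh.name"

# ask curl for just the status code; "000" means the request failed outright
STATUS=$(curl -s -o /dev/null -w "%{http_code}" "$URL")

if [ "$STATUS" != "200" ]; then
    # placeholder alert -- swap in mail, a webhook, or whatever you prefer
    echo "$URL returned status $STATUS" | mail -s "Site check failed" you@example.com
fi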
I’d use it to retrieve the X-Pingback value and if it was included, I’d send a trackback. ;-)
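Something like this, roughly (untested -- just grepping the header out of a curl -I response; the actual trackback request is left out):

# pull the X-Pingback value, if the site advertises one
PINGBACK=$(curl -sI https://davidwalsh.name | grep -i '^X-Pingback:' | awk '{print $2}' | tr -d '\r')

if [ -n "$PINGBACK" ]; then
    echo "would send the trackback to $PINGBACK"
fi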
Or, if you don’t fancy installing furl for this, you can do the same with curl (a powerful and flexible utility for performing requests) using the -I flag, e.g.:
curl -I http://davidwalsh.name
(you probably have curl installed already)
To see the headers and the full response, use the verbose flag:
curl -v http://davidwalsh.name
@adamnfish: Thanks for sharing that. On a side note, “adamnfish” sounds like a wacky morning FM radio show.
Not sure where sources are but the Debian package is at http://bertorello.ns0.it/debian/furl/
As already mentioned,
curl -I HOSTNAME
has the same functionality without installing anything extra.
curl -I is good. This is another suggestion…
lwp-request -ed "http://lindesk.com/"
another trick is:
lynx -head http://davidwalsh.name
lynx is a text-mode browser that ships with most Linux distributions
Dang! I should have read this sooner. I was itching to jump all over the “curl -I” suggestion. Everyone got here first!
alias furl='curl -i -X HEAD'