Use FURL to Retrieve Website Headers
It's important to know what headers your website and its files are sending. For example, if your website is returning a 404 status, you're probably streaking toward your computer to fix the problem. Using the FURL utility, you can retrieve website headers from the command line.
The Shell Script
furl https://davidwalsh.name
Simple and quick -- just like every shell command should be.
The Sample Response
HTTP/1.1 200 OK
Date: Thu, 25 Jun 2009 01:50:50 GMT
Server: Apache/2.2.3 (CentOS)
X-Powered-By: PHP/5.2.6
X-Pingback: https://davidwalsh.name/xmlrpc.php
Cache-Control: max-age=1, private, must-revalidate
Expires: Thu, 25 Jun 2009 01:50:51 GMT
Vary: Accept-Encoding
Connection: close
Content-Type: text/html; charset=UTF-8
Don't have FURL? Install it with MacPorts:
sudo port install furl
How is this useful? I would use this to periodically check my website via cron to make sure it was up (a rough sketch is below). What would you use this for?
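For example, a crontab entry along those lines could check the site every 15 minutes and send an alert when the status line isn't 200. This is only a sketch: the schedule, alert text, and email address are placeholders, and it assumes furl's output looks like the sample response above.

*/15 * * * * furl https://davidwalsh.name | head -1 | grep -q " 200 " || echo "davidwalsh.name appears to be down" | mail -s "Site check failed" you@example.com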
I’d use it to retrieve the X-Pingback value and if it was included, I’d send a trackback. ;-)
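A rough sketch of grabbing that header value, assuming furl's output matches the sample response above:

pingback=$(furl https://davidwalsh.name | grep -i "^X-Pingback:" | awk '{print $2}')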
Or, if you don't fancy installing furl for this, you can do the same with curl (a powerful and flexible utility for performing requests) using the -I flag:
e.g.
curl -I http://davidwalsh.name
(you probably have curl installed already)
To see the headers and the full response, use the verbose flag:
curl -v http://davidwalsh.name
@adamnfish: Thanks for sharing that. On a side note, “adamnfish” sounds like a wacky morning FM radio show.
Not sure where the sources are, but the Debian package is at http://bertorello.ns0.it/debian/furl/
As already mentioned,
curl -I HOSTNAME
has the same functionality without installing anything extra.
curl -I is good. This is another suggestion…
lwp-request -ed "http://lindesk.com/"
another trick is:
lynx -head http://davidwalsh.name
Lynx is a text-based browser for Linux.
Dang! I should have read this sooner. I was itching to jump all over the “curl -I” suggestion. Everyone got here first!
alias furl='curl -i -X HEAD'