Retrieve Headers with cURL


We all know that cURL is incredibly useful.  We can retrieve remote content with curl, post to a remote URL, and perform hundreds of other tasks.  One simple task is retrieving basic response headers.  To test the robot indexing prevention header I added to the Mozilla Developer Network, I used one simple cURL command to grab all of the headers from an address.

The Shell

The cURL command is short and sweet:

curl -I <url>

Said command provides a list that looks similar to:

HTTP/1.1 200 OK
Date: Fri, 14 Sep 2012 21:51:17 GMT
Server: Apache/2.2.3 (CentOS)
Last-Modified: Fri, 14 Sep 2012 21:51:00 GMT
Accept-Ranges: bytes
Content-Length: 10910
Cache-Control: max-age=1, private, must-revalidate
Expires: Fri, 14 Sep 2012 22:51:00 GMT
Vary: Accept-Encoding,Cookie
X-Powered-By: W3 Total Cache/
Pragma: public
Connection: close
Content-Type: text/html; charset=UTF-8
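
If you're only interested in one header -- say, a robots directive like X-Robots-Tag -- you can pipe the output through grep.  A quick sketch, with <url> standing in for whatever address you're testing (the -s flag just hides curl's progress output):

curl -sI <url> | grep -i x-robots-tag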

This command is helpful for ensuring a given header has been set correctly by your code, as well as for seeing where a given short URL redirects:

$ curl -I <short-url>

HTTP/1.1 301 Moved
Server: nginx
Date: Fri, 14 Sep 2012 21:53:14 GMT
Content-Type: text/html; charset=utf-8
Connection: keep-alive
Set-Cookie: _bit=5053a74a-0011d-0688d-311cf10a;;expires=Wed Mar 13 21:53:14 2013;path=/; HttpOnly
Cache-control: private; max-age=90
MIME-Version: 1.0
Content-Length: 115
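
If you'd rather see the whole redirect chain instead of stopping at the first 301, curl's -L flag follows Location headers and prints each hop's headers in turn.  Again, <short-url> is just a placeholder:

curl -I -L <short-url>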

It's also useful for seeing the server name, expiration information, and more.  I also appreciate that it's a clean list with no other information mixed into the output.  If you get some time, cURL out to different popular domains and see what headers they send back -- you might be surprised!
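
If you want to survey a handful of sites in one go, a small shell loop does the trick; the domains below are only examples:

for domain in developer.mozilla.org github.com google.com; do
  echo "== $domain =="
  curl -sI "https://$domain"
done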

  1. Thanks for this great tip for cURL.
    I didn't know before that I could get headers with cURL.

  2. Alex

    I’m using Charles (on the desktop); it can show both the request and response headers. Handy even when not trying to cheat for water on FarmVille 2 :P

    I highly recommend it to other devs as it has many cool features (such as throttling, charts and request/response manipulation).

  3. That’s a useful tip – thanks! I’ve always done lynx -head -dump, but this is a lot quicker and easier.

    One thing to be aware of is that it makes a HEAD request. I would think it will normally return the same header values as a GET or POST, but it's still something worth thinking about depending on what you are using it for.

  4. iamzesh

    It’s important to note that some servers are set to respond differently to HEAD requests than to GET requests. For example, a HEAD request might return a 200 OK while a GET request returns a 301 Moved Permanently… It can be quite confusing and unreliable depending on what you’re testing (see the sketch after these comments for forcing a GET).

  5. Shyam Chathuranga

    The Connection header is set to close; however, checking via Firebug or other tools such as Pingdom shows keep-alive. Do you know why that is?

    Thank you,
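
As a couple of the comments above point out, curl -I sends a HEAD request, and some servers answer HEAD differently than GET.  If that matters for what you're testing, you can make a real GET request and still see only the headers by dumping them to stdout and discarding the body.  A minimal sketch, with <url> again a placeholder:

curl -s -D - -o /dev/null <url>

Here -D - writes the received headers to standard output, -o /dev/null throws away the body, and -s hides the progress meter.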
