Going Async With ES6 Generators


Now that you've seen ES6 generators and are more comfortable with them, it's time to really put them to use for improving our real-world code.

The main strength of generators is that they provide a single-threaded, synchronous-looking code style, while allowing you to hide the asynchronicity away as an implementation detail. This lets us express the flow of our program's steps/statements in a very natural way, without having to navigate asynchronous syntax and gotchas at the same time.

In other words, we achieve a nice separation of capabilities/concerns, by splitting up the consumption of values (our generator logic) from the implementation detail of asynchronously fulfilling those values (the next(..) of the generator's iterator).

The result? All the power of asynchronous code, with all the ease of reading and maintainability of synchronous(-looking) code.

So how do we accomplish this feat?

Simplest Async

At their simplest, generators don't need anything extra to handle async capabilities that your program doesn't already have.

For example, let's imagine you have this code already:

function makeAjaxCall(url,cb) {
    // do some ajax fun
    // call `cb(result)` when complete
}

makeAjaxCall( "http://some.url.1", function(result1){
    var data = JSON.parse( result1 );

    makeAjaxCall( "http://some.url.2/?id=" + data.id, function(result2){
        var resp = JSON.parse( result2 );
        console.log( "The value you asked for: " + resp.value );
    });
} );

To use a generator (without any additional decoration) to express this same program, here's how you do it:

function request(url) {
    // this is where we're hiding the asynchronicity,
    // away from the main code of our generator
    // `it.next(..)` is the generator's iterator-resume
    // call
    makeAjaxCall( url, function(response){
        it.next( response );
    } );
    // Note: nothing returned here!
}

function *main() {
    var result1 = yield request( "http://some.url.1" );
    var data = JSON.parse( result1 );

    var result2 = yield request( "http://some.url.2?id=" + data.id );
    var resp = JSON.parse( result2 );
    console.log( "The value you asked for: " + resp.value );
}

var it = main();
it.next(); // get it all started

Let's examine how this works.

The request(..) helper basically wraps our normal makeAjaxCall(..) utility to make sure its callback invokes the generator iterator's next(..) method.

With the request("..") call, you'll notice it has no return value (in other words, it's undefined). This is no big deal, but it's something important to contrast with how we approach things later in this article: we effectively yield undefined here.

So then we call yield .. (with that undefined value), which essentially does nothing but pause our generator at that point. It's going to wait until the it.next(..) call is made to resume, which we've queued up (as the callback) to happen after our Ajax call finishes.

But what happens to the result of the yield .. expression? We assign that to the variable result1. How does that have the result of the first Ajax call in it?

Because when it.next(..) is called as the Ajax callback, it's passing the Ajax response to it, which means that value is getting sent back into our generator at the point where it's currently paused, which is in the middle of the result1 = yield .. statement!

That's really cool and super powerful. In essence, result1 = yield request(..) is asking for the value, but it's (almost!) completely hidden from us -- at least, we don't have to worry about it here -- that the implementation under the covers causes this step to be asynchronous. It accomplishes that asynchronicity by hiding the pause capability in yield, and separating out the resume capability of the generator to another function, so that our main code is just making a synchronous(-looking) value request.
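
To see that two-way messaging in isolation, consider this tiny standalone sketch (just an illustration, not part of the Ajax example), where the value passed to next(..) becomes the result of the paused yield expression:

function *demo() {
    var answer = yield "the question";  // pauses here
    console.log( answer );              // 42
}

var d = demo();
console.log( d.next().value );  // "the question" -- yielded out
d.next( 42 );                   // resumes; `42` comes back as the `yield` result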

The exact same goes for the second result2 = yield request(..) statement: it transparently pauses & resumes, and gives us the value we asked for, all without bothering us about any details of asynchronicity at that point in our coding.

Of course, yield is present, so there is a subtle hint that something magical (aka async) may occur at that point. But yield is a pretty minor syntactic signal/overhead compared to the hellish nightmares of nested callbacks (or even the API overhead of promise chains!).

Notice also that I said "may occur". That's a pretty powerful thing in and of itself. The program above always makes an async Ajax call, but what if it didn't? What if we later changed our program to have an in-memory cache of previous (or prefetched) Ajax responses? Or some other complexity in our application's URL router could in some cases fulfill an Ajax request right away, without needing to actually go fetch it from a server?

We could change the implementation of request(..) to something like this:

var cache = {};

function request(url) {
    if (cache[url]) {
        // "defer" cached response long enough for current
        // execution thread to complete
        setTimeout( function(){
            it.next( cache[url] );
        }, 0 );
    }
    else {
        makeAjaxCall( url, function(resp){
            cache[url] = resp;
            it.next( resp );
        } );
    }
}

Note: A subtle, tricky detail here is the need for the setTimeout(..0) deferral in the case where the cache has the result already. If we had just called it.next(..) right away, it would have created an error, because (and this is the tricky part) the generator is not technically in a paused state yet. Our function call request(..) is being fully evaluated first, and then the yield pauses. So, we can't call it.next(..) again yet immediately inside request(..), because at that exact moment the generator is still running (yield hasn't been processed). But we can call it.next(..) "later", immediately after the current thread of execution is complete, which our setTimeout(..0) "hack" accomplishes. We'll have a much nicer answer for this down below.
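
To see why the immediate resume fails, here's a tiny standalone sketch (a made-up gen()/step() pair, just for illustration) of the error you'd hit without the deferral:

function *gen() {
    // `step()` is evaluated *before* the `yield` actually pauses the generator
    yield step();
}

function step() {
    // the generator is still running right now, so this throws
    // something like: "TypeError: Generator is already running"
    it.next();
}

var it = gen();
it.next();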

Now, our main generator code still looks like:

var result1 = yield request( "http://some.url.1" );
var data = JSON.parse( result1 );
..

See!? Our generator logic (aka our flow control) didn't have to change at all from the non-cache-enabled version above.

The code in *main() still just asks for a value, and pauses until it gets it back before moving on. In our current scenario, that "pause" could be relatively long (making an actual server request, which might take 300-800ms) or it could be almost immediate (the setTimeout(..0) deferral hack). But our flow control doesn't care.

That's the real power of abstracting away asynchronicity as an implementation detail.

Better Async

The above approach is quite fine for simple async generator work. But it will quickly become limiting, so we'll need a more powerful async mechanism to pair with our generators, one that's capable of handling a lot more of the heavy lifting. That mechanism? Promises.

If you're still a little fuzzy on ES6 Promises, I wrote an extensive 5-part blog post series all about them. Go take a read. I'll wait for you to come back. <chuckle, chuckle>. Subtle, corny async jokes ftw!

The earlier Ajax code examples here suffer from all the same Inversion of Control issues (aka "callback hell") as our initial nested-callback example. Some observations of where things are lacking for us so far:

  1. There's no clear path for error handling. As we learned in the previous post, we could have detected an error with the Ajax call (somehow), passed it back to our generator with it.throw(..), and then used try..catch in our generator logic to handle it. But that's just more manual work to wire up in the "back-end" (the code handling our generator iterator), and it may not be code we can re-use if we're doing lots of generators in our program.
  2. If the makeAjaxCall(..) utility isn't under our control, and it happens to call the callback multiple times, or signal both success and error simultaneously, etc, then our generator will go haywire (uncaught errors, unexpected values, etc). Handling and preventing such issues is lots of repetitive manual work, also possibly not portable.
  3. Often times we need to do more than one task "in parallel" (like two simultaneous Ajax calls, for instance). Since generator yield statements are each a single pause point, two or more cannot run at the same time -- they have to run one-at-a-time, in order. So, it's not very clear how to fire off multiple tasks at a single generator yield point, without wiring up lots of manual code under the covers.

As you can see, all of these problems are solvable, but who really wants to reinvent these solutions every time? We need a more powerful pattern that's designed specifically as a trustable, reusable solution for our generator-based async coding.

That pattern? yielding out promises, and letting them resume the generator when they fulfill.

Recall above that we did yield request(..), and that the request(..) utility didn't have any return value, so it was effectively just yield undefined?

Let's adjust that a little bit. Let's change our request(..) utility to be promises-based, so that it returns a promise, and thus what we yield out is actually a promise (and not undefined).

function request(url) {
    // Note: returning a promise now!
    return new Promise( function(resolve,reject){
        makeAjaxCall( url, resolve );
    } );
}

request(..) now constructs a promise that will be resolved when the Ajax call finishes, and we return that promise, so that it can be yielded out. What next?

We'll need a utility that controls our generator's iterator, that will receive those yielded promises and wire them up to resume the generator (via next(..)). I'll call this utility runGenerator(..) for now:

// run (async) a generator to completion
// Note: simplified approach: no error handling here
function runGenerator(g) {
    var it = g(), ret;

    // asynchronously iterate over generator
    (function iterate(val){
        ret = it.next( val );

        if (!ret.done) {
            // poor man's "is it a thenable/promise?" test
            // (guard against primitives, which would break an `in` check)
            if (ret.value && typeof ret.value.then === "function") {
                // wait on the promise
                ret.value.then( iterate );
            }
            // immediate value: just send right back in
            else {
                // avoid synchronous recursion
                setTimeout( function(){
                    iterate( ret.value );
                }, 0 );
            }
        }
    })();
}

Key things to notice:

  1. We automatically initialize the generator (creating its it iterator), and we asynchronously will run it to completion (done:true).
  2. We look for a promise to be yielded out (aka the return value from each it.next(..) call). If so, we wait for it to complete by registering then(..) on the promise.
  3. If any immediate (aka non-promise) value is returned out, we simply send that value back into the generator so it keeps going immediately.

Now, how do we use it?

runGenerator( function *main(){
    var result1 = yield request( "http://some.url.1" );
    var data = JSON.parse( result1 );

    var result2 = yield request( "http://some.url.2?id=" + data.id );
    var resp = JSON.parse( result2 );
    console.log( "The value you asked for: " + resp.value );
} );

Bam! Wait... that's the exact same generator code as earlier? Yep. Again, this is the power of generators being shown off. The fact that we're now creating promises, yielding them out, and resuming the generator on their completion -- ALL OF THAT IS "HIDDEN" IMPLEMENTATION DETAIL! It's not really hidden, it's just separated from the consumption code (our flow control in our generator).

By waiting on the yielded out promise, and then sending its completion value back into it.next(..), the result1 = yield request(..) gets the value exactly as it did before.

But now that we're using promises for managing the async part of the generator's code, we solve all the inversion/trust issues from callback-only coding approaches. We get all these solutions to our above issues for "free" by using generators + promises:

  1. We now have built-in error handling which is easy to wire up. We didn't show it above in our runGenerator(..), but it's not hard at all to listen for errors from a promise, and wire them to it.throw(..) -- then we can use try..catch in our generator code to catch and handle errors.
  2. We get all the control/trustability that promises offer. No worries, no fuss.
  3. Promises have lots of powerful abstractions on top of them that automatically handle the complexities of multiple "parallel" tasks, etc.

    For example, yield Promise.all([ .. ]) would take an array of promises for "parallel" tasks, and yield out a single promise (for the generator to handle), which waits on all of the sub-promises to complete (in whichever order) before proceeding. What you'd get back from the yield expression (when the promise finishes) is an array of all the sub-promise responses, in order of how they were requested (so it's predictable regardless of completion order).

First, let's explore error handling:

// assume: `makeAjaxCall(..)` now expects an "error-first style" callback (omitted for brevity)
// assume: `runGenerator(..)` now also handles error handling (omitted for brevity)

function request(url) {
    return new Promise( function(resolve,reject){
        // pass an error-first style callback
        makeAjaxCall( url, function(err,text){
            if (err) reject( err );
            else resolve( text );
        } );
    } );
}

runGenerator( function *main(){
    try {
        var result1 = yield request( "http://some.url.1" );
    }
    catch (err) {
        console.log( "Error: " + err );
        return;
    }
    var data = JSON.parse( result1 );

    try {
        var result2 = yield request( "http://some.url.2?id=" + data.id );
    } catch (err) {
        console.log( "Error: " + err );
        return;
    }
    var resp = JSON.parse( result2 );
    console.log( "The value you asked for: " + resp.value );
} );

If a promise rejection (or any other kind of error/exception) happens while the URL fetching is happening, the promise rejection will be mapped to a generator error (using the -- not shown -- it.throw(..) in runGenerator(..)), which will be caught by the try..catch statements.
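
For a rough idea of what that wiring might look like, here's one possible sketch (not the full implementation the article omits; a real utility would cover more edge cases, like rejections whose reason is undefined, or the generator's final return value) that routes promise rejections back into the generator via it.throw(..), so the try..catch blocks above can catch them:

// a sketch only: `runGenerator(..)` extended with basic error handling
function runGenerator(g) {
    var it = g(), ret;

    // asynchronously iterate over generator
    (function iterate(val,err){
        try {
            // if the previous step failed, throw that error back
            // into the generator so its `try..catch` can handle it
            ret = (err !== undefined) ? it.throw( err ) : it.next( val );
        }
        catch (e) {
            // the generator didn't catch it; surface it and bail
            console.error( "runGenerator error:", e );
            return;
        }

        if (!ret.done) {
            // thenable/promise?
            if (ret.value && typeof ret.value.then === "function") {
                ret.value.then(
                    // fulfillment resumes the generator normally
                    function(v){ iterate( v ); },
                    // rejection gets thrown back in on the next pass
                    function(e){ iterate( undefined, e ); }
                );
            }
            // immediate value: just send right back in
            else {
                setTimeout( function(){
                    iterate( ret.value );
                }, 0 );
            }
        }
    })();
}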

Now, let's see a more complex example that uses promises for managing even more async complexity:

function request(url) {
    return new Promise( function(resolve,reject){
        makeAjaxCall( url, resolve );
    } )
    // do some post-processing on the returned text
    .then( function(text){
        // did we just get a (redirect) URL back?
        if (/^https?:\/\/.+/.test( text )) {
            // make another sub-request to the new URL
            return request( text );
        }
        // otherwise, assume text is what we expected to get back
        else {
            return text;
        }
    } );
}

runGenerator( function *main(){
    var search_terms = yield Promise.all( [
        request( "http://some.url.1" ),
        request( "http://some.url.2" ),
        request( "http://some.url.3" )
    ] );

    var search_results = yield request(
        "http://some.url.4?search=" + search_terms.join( "+" )
    );
    var resp = JSON.parse( search_results );

    console.log( "Search results: " + resp.value );
} );

Promise.all([ .. ]) constructs a promise that's waiting on the three sub-promises, and it's that main promise that's yielded out for the runGenerator(..) utility to listen to for generator resumption. The sub-promises can receive a response that looks like another URL to redirect to, and chain off another sub-request promise to the new location. To learn more about promise chaining, read this article section.

For any kind of capability/complexity that promises can handle asynchronously, you can gain the sync-looking code benefits by using generators that yield out promises (of promises of promises of ...). It's the best of both worlds.

runGenerator(..): Library Utility

We had to define our own runGenerator(..) utility above to enable and smooth out this generator+promise awesomeness. We even omitted (for brevity's sake) the full implementation of such a utility, as there are more nuanced details related to error handling to deal with.

But, you don't want to write your own runGenerator(..), do you?

I didn't think so.

A variety of promise/async libs provide just such a utility. I won't cover them here, but you can take a look at Q.spawn(..), the co(..) lib, etc.

I will however briefly cover my own library's utility: asynquence's runner(..) plugin, as I think it offers some unique capabilities over the others out there. I wrote an in-depth 2-part blog post series on asynquence if you're interested in learning more than the brief exploration here.

First off, asynquence provides utilities for automatically handling the "error-first style" callbacks from the above snippets:

function request(url) {
    return ASQ( function(done){
        // pass an error-first style callback
        makeAjaxCall( url, done.errfcb );
    } );
}

That's much nicer, isn't it!?

Next, asynquence's runner(..) plugin consumes a generator right in the middle of an asynquence sequence (asynchronous series of steps), so you can pass message(s) in from the preceding step, and your generator can pass message(s) out, onto the next step, and all errors automatically propagate as you'd expect:

// first call `getSomeValues()` which produces a sequence/promise,
// then chain off that sequence for more async steps
getSomeValues()

// now use a generator to process the retrieved values
.runner( function*(token){
    // token.messages will be prefilled with any messages
    // from the previous step
    var value1 = token.messages[0];
    var value2 = token.messages[1];
    var value3 = token.messages[2];

    // make all 3 Ajax requests in parallel, wait for
    // all of them to finish (in whatever order)
    // Note: `ASQ().all(..)` is like `Promise.all(..)`
    var msgs = yield ASQ().all(
        request( "http://some.url.1?v=" + value1 ),
        request( "http://some.url.2?v=" + value2 ),
        request( "http://some.url.3?v=" + value3 )
    );

    // send this message onto the next step
    yield (msgs[0] + msgs[1] + msgs[2]);
} )

// now, send the final result of previous generator
// off to another request
.seq( function(msg){
    return request( "http://some.url.4?msg=" + msg );
} )

// now we're finally all done!
.val( function(result){
    console.log( result ); // success, all done!
} )

// or, we had some error!
.or( function(err) {
    console.log( "Error: " + err );
} );

The asynquence runner(..) utility receives (optional) messages to start the generator, which come from the previous step of the sequence, and are accessible in the generator in the token.messages array.

Then, similar to what we demonstrated above with the runGenerator(..) utility, runner(..) listens for either a yielded promise or yielded asynquence sequence (in this case, an ASQ().all(..) sequence of "parallel" steps), and waits for it to complete before resuming the generator.

When the generator finishes, the final value it yields out passes along to the next step in the sequence.

Moreover, if any error happens anywhere in this sequence, even inside the generator, it will bubble out to the single or(..) error handler registered.

asynquence tries to make mixing and matching promises and generators as dead-simple as it could possibly be. You have the freedom to wire up any generator flows alongside promise-based sequence step flows, as you see fit.

ES7 async

There is a proposal for the ES7 timeline, which looks fairly likely to be accepted, to create yet another kind of function: an async function, which is like a generator that's automatically wrapped in a utility like runGenerator(..) (or asynquence's runner(..)). That way, you can send out promises and the async function automatically wires them up to resume itself on completion (no need even for messing around with iterators!).

It will probably look something like this:

async function main() {
    var result1 = await request( "http://some.url.1" );
    var data = JSON.parse( result1 );

    var result2 = await request( "http://some.url.2?id=" + data.id );
    var resp = JSON.parse( result2 );
    console.log( "The value you asked for: " + resp.value );
}

main();

As you can see, an async function can be called directly (like main()), with no need for a wrapper utility like runGenerator(..) or ASQ().runner(..) to wrap it. Inside, instead of using yield, you'll use await (another new keyword) that tells the async function to wait for the promise to complete before proceeding.
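
Error handling should carry over just as directly: when an awaited promise rejects, the rejection would surface as an exception inside the async function. Assuming the proposal lands roughly as described, and reusing the promise-returning request(..) from earlier, the try..catch pattern from our generator example would look something like this (a sketch, not final syntax):

async function main() {
    try {
        var result1 = await request( "http://some.url.1" );
        var data = JSON.parse( result1 );
        console.log( "The id you asked for: " + data.id );
    }
    catch (err) {
        console.log( "Error: " + err );
    }
}

main();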

Basically, we'll have most of the capability of library-wrapped generators, but directly supported by native syntax.

Cool, huh!?

In the meantime, libraries like asynquence give us these runner utilities to make it pretty darn easy to get the most out of our asynchronous generators!

Summary

Put simply: a generator + yielded promise(s) combines the best of both worlds to get really powerful and elegant sync(-looking) async flow control expression capabilities. With simple wrapper utilities (which many libraries are already providing), we can automatically run our generators to completion, including sane and sync(-looking) error handling!

And in ES7+ land, we'll probably see async functions that let us do that stuff even without a library utility (at least for the base cases)!

The future of async in JavaScript is bright, and only getting brighter! I gotta wear shades.

But it doesn't end here. There's one last horizon we want to explore:

What if you could tie 2 or more generators together, let them run independently but "in parallel", and let them send messages back and forth as they proceed? That would be some super powerful capability, right!?! This pattern is called "CSP" (communicating sequential processes). We'll explore and unlock the power of CSP in the next article. Keep an eye out!

About Kyle Simpson

Kyle Simpson is a web-oriented software engineer, widely acclaimed for his "You Don't Know JS" book series and nearly 1M hours viewed of his online courses. Kyle's superpower is asking better questions, and he deeply believes in maximally using the minimally-necessary tools for any task. As a "human-centric technologist", he's passionate about bringing humans and technology together, evolving engineering organizations toward solving the right problems in simpler ways. Kyle will always fight for the people behind the pixels.


Discussion

  1. sean

    Hi master;

    “I got Method Generator.prototype.next called on incompatible receiver undefined” in the first demo, and I have to use bind to fix its context. We pass it.next as the function to the callback, so it was called as a function instead of a method. Maybe my assumption is wrong, could you please help?

    • You’re absolutely correct, that was an oversight on my part when I was cleaning up the final code examples for the post. I’ve corrected the code in the post now.

      Side note: it.next.bind(it) is also a valid solution, though it’s IMO a little more clumsy looking. So I chose just to show a wrapping function. Either way, the end result is that next() is called on the proper iterator object.

    • sean

      oh, sorry, I didn’t notice it was just a draft at that moment, thanks for your reply, your articles helped me a lot in understanding Generators, thanks for sharing

    • Nah, it wasn’t just a draft, you found a mistake I had made in the published article, and so thanks for that! :)

  2. I really appreciate your effort to provide code for ES6 Generators. It is very beneficial.

  3. Why the poor man’s promise test – isn’t it possible to just perform an instanceof check?

    ret.value instanceof Promise
    • @Christoph-

      Unfortunately, it’s not that simple. If you happened to be using only a genuine ES6 promise, it would be OK.

      But there’s lots of other libraries that generate “promises” which aren’t actually from that core ES6 Promise constructor. In promise-land, this concept is actually called a “thenable”, to refer to any promise-conforming object with a then() method on it. Promise implementations and libraries are all supposed to handle/consume all thenables, not just ES6-vended promises.

      Consider running libraries in older pre-ES6 browsers, where (sans polyfill) no such Promise constructor would exist. Even though we’re talking here about generators, which are also an ES6 only feature, generators can be transpiled to run in pre-ES6 browsers, and in that case, the promises you’d use would also probably be transpiled or polyfilled, and thus might not pass such a narrow check as p instanceof Promise.

      Make sense?

    • Ariel Jakobovits

      Just trying to practice my es6 here, so forgive me if I’m totally off base, but any chance a Symbol, if adopted by the promise libraries, could more reliably replace the ‘then’ property check?

  4. Braden

    In the first demo, why do I encounter Error: Generator is already running when I execute the code?

    My demo: http://jsfiddle.net/x97sopj1/1/

    • The reason is because you’ve made your makeAjaxRequest(..) function synchronous, which means you’re trying to resume the generator before it’s actually been paused. You need to make it async to see the code work correctly.

      Try this version: http://jsfiddle.net/x97sopj1/2/

  5. Shouldn’t you be referencing the C# language’s implementation of async/await here for more reading? The implementation is the same, and it seems disingenuous to pretend you came up with this in the shower.

    • @Oisin-

      1. I didn’t invent async / await for JavaScript, that’s the TC39 committee. All I did was quote that they’re planning to implement it. I never claimed I “came up with [it] in the shower”.

      2. The TC39 committee has openly admitted they drew direct influence from C# for the proposal for async / await, so I think more than enough due credit has already been secured.

      3. Let’s keep things cordial around here, OK? No reason to jump to offense irrationally. I didn’t make the claims you seem to have assumed I did.

  6. Robert

    It would have totally cracked me up even more if the joke had gone like…

    … I PROMISE, I’ll *wait* for you to come back. . …

    Maybe you should roll that in… up the humor ante. ;)

    Good post.

  7. Thanks for the detailed post! I put together a TL;DR version similar to this post for those looking to get right down to business: Tame Async JavaScript With ES6. Hope it’s helpful for those scanning.

  8. Rich

    Fantastic post. Really made a few key ideas click with respect to combining Generators and Promises. Thanks.

  9. BT

    Why are you omitting error handling in your code examples? That’s the main thing I care about – how can you properly get errors thrown if you’re doing asynchronous stuff this way. Could you please add that in instead of omitting it?

  10. camilo

    I will go reactive combining some of these toys

  11. Jonas

    Is there a reason as to why you just don’t do:

    Promise.resolve(ret.value).then(iterate)
    

    and just skip the check for a thenable?

  12. I used your code (slightly changed) to process 1000 records read from CouchDB, in chunks of 100 records (using startkey), delivering each record from the cache[] object and getting another chunk from CouchDB when empty.
    Seems that the push-pull mechanism of generators, from yield to next(), especially with setTimeout( {.. that next() }, 0) in order to release the thread, is very slow; the job was done in 4300 ms.
    The exact same task, done with a simple callback function, was done in 428 ms. Generators are interesting but in my case they perform badly. In your example there are only two calls that need to be “returned synchronously” and that doesn’t count too much when the request hits the cache. But in my case, when 100 requests hit the record cache array, that setTimeout is slow.
    Is there a chance to change the pattern in order to issue a setTimeout only when the buffer is empty? I tried something, inverting yield/next roles, and yield cannot be called in closures! :-(

  13. Nick

    Hi Kyle, it seems clumsy to me that we have to reference “it” in the makeAjaxCall callback function before it is instantiated… Is there a better way to do this?

  14. Brian

    People find this simpler than callbacks? To me, this is wildly more complex, harder to follow, harder to debug, harder to get right. I can’t imagine trying to maintain code written this way.

    I also really don’t get the claims about its power. The caching example would be *exactly* the same using callbacks: the callback doesn’t care if it’s invoked via xhr or setTimeout. The generator/promise hasn’t contributed any useful abstraction in that example. By requiring a setTimeout it’s probably slower, as another reader noted.

    For high-level async abstractions, is there an argument for this pattern over Observable, which seems like a far, far simpler abstraction than generator/promise?

    • Unus

      same feeling here… I am a developer for about 17 years… and sometimes I lose track of the new things going on cause I do not have time to look at them been caught in my day to day life and projects; but then when I find time to take a look… oh boy instead of seeing things are getting better (thus also simpler) I See the opposite. why?! people are afraid of losing their jobs and try to have job security? :)
      seriously. this is crazy.
      I hope it is not just me…

    • elgs

      Yes and no. I have been a developer for about 17 years, too. I have been coding in Javascript since my earliest code. Javascript has completely changed, in ways that I don’t like (CoffeeScript, TypeScript) and in ways I like (passing functions as parameters). I can clearly see all these are a slowly struggling, evolving process. New stuff keeps coming out in order to solve a problem, or to make one thing easier and better, but it could create more problems and new complexities. It’s just an evolving process driven by smart people. Those too-complicated things without enough rewards will die out in this process. My $.02.

  15. Sergey

    Farewell callback hell, welcome generator hell.

  16. Unus

    I am looking at these promises and this new ways of doing async calls via promises + generators… man, I was expecting things to get better thus simpler… but no way, I see things are getting more complicated. why?
    why we do not keep things simple?
    is it just me that finds these new way way too complicated?

  17. i agree with all the neg comments. generators are a step backwards and simply unmaintainable when used in an async fashion

  18. gnodar

    Hi, really enjoyed this article (and your books). One question, when calling runGenerator on *main():

        runGenerator( function *main(){
            var result1 = yield request( "http://some.url.1" );
            var data = JSON.parse( result1 );
    
            var result2 = yield request( "http://some.url.2?id=" + data.id );
            var resp = JSON.parse( result2 );
            console.log( "The value you asked for: " + resp.value );
        } );
    

    Why are you logging out resp.value, rather than just resp? It seems to be assuming that resp is the returned object from an iterator, when should already contain the value right?

    • > Why are you logging out resp.value, rather than just resp? It seems to be assuming that resp is the returned object from an iterator, when should already contain the value right?

      the .value here is not from the iterator, it’s a property from the fictional JSON object.

  19. Sangeet Agarwal

    Hi Kyle,

    In one of your earlier examples under the “Better Async” section, in the runGenerator IIFE, you check whether the iterator’s done is set to false

    if (!ret.done)

    , but this seems to be a redundant check since the iterate function will only be called when the generator yields, and each time it yields the done is set to false. I think the only time done is set to true is if you were to go one step past the last yield.

    Here’s a codepen to illustrate this – btw, superb writeup – hoping to clarify this point to make sure I was understanding things correctly. thanks

    http://codepen.io/Sangeet/pen/QEQdoW

  20. Sangeet Agarwal

    quick correction on the above comment.

    “but this seems to be a redundant check since the iterate function will only be”

    should be

    “but this seems to be a redundant check since the iterator’s next() will only be”

  21. Kyle, great stuff as usual. And, thanks for pointing out that Q has a few methods relating to this type of functionality. I tend to use Q as a hold-over from Angular (and its $q service). I can’t believe I’m still discovering parts of the Q API – should probably just learn it already :)

  22. Thanks for the good article, but with your runGenerator, when I reject the promise it just gets ignored, not caught by the catch statement in the *main function.

  23. Andrew

    Why not just use regular synchronous functions? What’s the benefit of designing an async function to work like a synchronous function?

  24. Andrew

    The problem with all these methods of delaying response is that these language constructs assume I don’t need the thing I’m requesting. I don’t think humans think that way. If we really don’t need the thing, we can explicitly push it to a queue, but if we’re writing a function (with a return value) it’s likely we want some meaningful value to be returned (not the promise of one).

    I think callback/generator/async hell exists because people need the thing they request “immediately” (thus write a callback for every response).

    • Andrew

      *push our request to a queue

  25. Dominic Mayers

    The following does not seem correct:

    “Because when it.next(..) is called as the Ajax callback, it’s passing the Ajax response to it, which means that value is getting sent back into our generator at the point where it’s currently paused, which is in the middle of the result1 = yield .. statement! ”

    We don’t know at which yield statement the generator will be paused when the it.next() statement will be executed. It depends how long it takes.

    function* generator()
    {
      let res;  
      res = yield cb(); 
      console.log(res);
      res = yield
      console.log(res);
      res = yield
      console.log(res);
      res = yield
      console.log(res);
    }
    
    function cb ()
    {
      setTimeout(() => g.next('In cb'), 20);
    }
    
    var g = generator();
    
    g.next(); 
    setTimeout(() => g.next('Second'), 30); 
    setTimeout(() => g.next('Third'), 10); 
    setTimeout(() => g.next('Fourth'), 40); 
    
    // The result is
    // Third 
    // In cb
    // Second
    // Fourth
    
  26. Dominic Mayers

    But in your case, it works because there is no other next statement executed until after the queued one is executed, and so on.

  27. Nathan

    Thank you for this post. It really helped me understand how generators work.

  28. Blink

    Seems like the link to your previous posts on promises is dead (I got a DNS error).

  29. Olga

    Looks like this link https://blog.getify.com/promises-part-1/ is dead.

  30. Jason

    Thanks for your awesome article. I have some questions that need your help. The code below makes me confused.

    function request(url) {
        // this is where we're hiding the asynchronicity,
        // away from the main code of our generator
        // it.next(..) is the generator's iterator-resume
        // call
        makeAjaxCall( url, function(response){
            it.next( response );
        } );
        // Note: nothing returned here!
    }
    
    function *main() {
        var result1 = yield request( "http://some.url.1" );
        var data = JSON.parse( result1 );
    
        var result2 = yield request( "http://some.url.2?id=" + data.id );
        var resp = JSON.parse( result2 );
        console.log( "The value you asked for: " + resp.value );
    }
    
    var it = main();
    it.next(); // get it all started
    

    In the function request, the iterator object “it” is not defined when you call it. Is this code working properly?
