7

It's been a few days since I learned about generator functions in JavaScript, which were introduced in the ES6 version.

Generators are explained in various places, but what I find most intriguing is the fact that everywhere it is said that

A generator function is a way to write async code synchronously.

The question I want to raise is: "Why was there so much need to introduce a completely different programming strategy just for the sake of it?"

I understand that the async nature of JS code makes it difficult for a newbie to understand and debug, but does it require a complete change in coding style altogether?

I may be wrong, or may not completely understand the concept behind its introduction, but being inquisitive about it all is what compelled me to ask this question.

6
  • 1
    They're about more than writing async code: generators provide a function that can run forever, pausing at times, and allowing values to be exchanged both out of and into the function across iterations. Commented Jan 5, 2016 at 5:28
  • 1
    Oh I agree on that but can you please give me a use-case where I would need that kind of function? Commented Jan 5, 2016 at 5:33
  • 1
    You could use them to generate GUIDs, access a data store, or more. I even built a text-based adventure game on top of one once, sending user commands in, and receiving player-status on each yield. That use was primarily driven by curiosity :) Commented Jan 5, 2016 at 5:35
  • If you look at the "pretend top-down" code that one can now make with generators, it's not exactly "newbie friendly", so I don't think that's a motivation of the implementation crafters. The real power will likely be apparent later when we start using custom iteration on huge/infinite/slow objects. Commented Jan 5, 2016 at 5:43
  • 3
    Generators alone don't allow you to write asynchronous code "synchronously" (that sounds very misleading btw). Generators + Promises allow you to do that. But it's still weird, and hence we will get async/await in the future. I think it's beautiful that async/await is nothing but generators + promises, but I always saw this as a (maybe well-planned) side effect of generators. Commented Jan 5, 2016 at 5:56
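
To make that last comment concrete, here is a minimal sketch of the generators + promises pattern (run and delay are illustrative names written for this example, not library functions, and error handling is omitted):

function delay(ms, value) {
    // stand-in for any promise-returning async operation
    return new Promise(function (resolve) {
        setTimeout(function () { resolve(value); }, ms);
    });
}

function run(genFn) {
    var it = genFn();
    function step(prev) {
        var result = it.next(prev);                        // resume the generator, sending the resolved value in
        if (result.done) return Promise.resolve(result.value);
        return Promise.resolve(result.value).then(step);   // wait for the yielded promise, then resume
    }
    return step();
}

run(function* () {
    // reads top to bottom like synchronous code, but nothing here blocks
    var a = yield delay(100, 1);
    var b = yield delay(100, 2);
    console.log(a + b);                                    // 3
});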

3 Answers

4

Because closures are less convenient for simple iteration; simplifying syntax for reasonably common tasks can be worth it, even if the language supported the same pattern before. Just compare:

function chain() {
    var args = Array.from(arguments);
    return function() {
        if (args.length === 0) return undefined; // Or some other sentinel
        var nextval = args[0].shift(); // Destructive to avoid copies or more closure vars
        if (args[0].length === 0) args.shift();
        return nextval;
    };
}
var x;
// a, b and c must be indexable, e.g. Arrays; we can't handle other closures without
// requiring some API specific protocol for generation
for (var nextchain = chain(a, b, c); (x = nextchain()) !== undefined;) {
    // do stuff with current value
}

to:

function* chain() {
    for (var i = 0; i < arguments.length; ++i)
        yield* arguments[i];
}
// a, b and c can be any iterable object; yield* can handle
// strings, Arrays, other generators, etc., all with no special handling
for (var x of chain(a, b, c)) {
    // do stuff with current value
}

Sure, the savings in lines of code aren't incredible. It's mostly just reducing boilerplate and unnecessary names, removing the need to deal with closures for simple cases, and with the for...of syntax, providing a common mechanism to iterate arbitrary iterable things, rather than requiring the user to explicitly construct the initial closure and advance it by name. But if the pattern is common enough, that's useful enough.

As noted in the comments, a, b and c must be Array-like for the closure-based approach (or you'd need a different closure-based approach in which the writer of chain imposes arbitrary requirements on whatever is passed in, with special cases for Array-like values vs. generator-like closures), and processing is destructive (making it non-destructive would require more closure state or copies, i.e. more complexity or less speed). The generator-based approach with yield* needs no special cases at all. This makes generators composable without complex specs; they can build on one another easily.
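
As a rough illustration of that composability, the chain generator above can consume other generators directly (take and naturals are ad-hoc helpers written for this example):

function* naturals() {
    for (var i = 0; ; ++i) yield i;        // infinite, but fine: consumers pull values lazily
}

function* take(n, iterable) {
    // yield at most n values from any iterable, including another generator
    for (var x of iterable) {
        if (n-- <= 0) return;
        yield x;
    }
}

for (var x of chain(take(3, naturals()), [10, 20], "ab")) {
    console.log(x);                        // 0, 1, 2, 10, 20, "a", "b"
}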

2

Scenario 1 (Async): You might have heard of the importance of writing "non-blocking" JavaScript. When we do an I/O operation, we use callbacks or promises to keep the code non-blocking.
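
For example, a tiny sketch of the two non-blocking styles (delayedDouble is a made-up stand-in for a real I/O operation):

// callback style: the work is scheduled and the rest of the program keeps running
function delayedDouble(n, callback) {
    setTimeout(function () { callback(n * 2); }, 100);
}
delayedDouble(21, function (result) { console.log('callback result:', result); });

// promise style: same non-blocking behaviour, with the continuation chained via .then()
function delayedDoubleP(n) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve(n * 2); }, 100);
    });
}
delayedDoubleP(21).then(function (result) { console.log('promise result:', result); });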

Scenario 2 (Sync): Running an infinite loop in JavaScript, e.g. node -e 'while(true) {}', will block the event loop and possibly freeze your machine.

With all of this in mind, ES6 generators allow us to effectively "pause" execution in the middle of a function and resume it at some point in the future (async code written synchronously).

Use case: Imagine you need to work with an infinite sequence of values. Arrays won't be helpful in that case; instead we can use an ES6 generator function:

// suppose generateRandoms is a generator function which loops through an infinite sequence, for example:
function* generateRandoms() {
    while (true) yield Math.random();
}

var iterator = generateRandoms();   // calling a generator function returns an iterator

// the iterator's next() method can be called at any time to pull the next value from the sequence
console.log(iterator.next());     // { value: 0.4900301224552095, done: false }
console.log(iterator.next());     // { value: 0.8244022422935814, done: false }

And as far as complexity is concerned, it's new syntax, but it doesn't take long to grasp.

For further reading:

  1. http://x-team.com/2015/04/generators-work/
  2. https://msdn.microsoft.com/en-in/library/dn858237(v=vs.94).aspx

0

Actually, generator functions are not so much about asynchrony; they are about functions whose execution can be interrupted. How the flow is interrupted is determined by the caller of the generator, through the iterator -- more explanation below.

function* ab3() {
  console.log(1);
  var c = yield 2;   // pauses here; the argument of the next next() call becomes c
  console.log(2);
  return 1 + c;
  console.log(3);    // never reached: the generator has already returned
}
  1. A generator function is a function that can be paused partway through its execution.

  2. Calling a generator function does not run its body; it returns an iterator. Execution starts when iterator.next() is called.

  3. The first call to iterator.next() runs the generator's statements up to the first yield and returns the yielded value, here { value: 2, done: false }.

  4. The second call, iterator.next(3), runs until the next yield or return statement. The value passed in (3 in this example) is stored in the variable the yield is assigned to, so var c = yield 2; stores 3 in c. Execution then continues to the return statement, and the call returns { value: 4, done: true } because we have reached the end of the generator function.

  5. Any further call to iterator.next() (the 3rd here) returns { value: undefined, done: true }, since the generator has nothing more to produce.
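
Putting those steps together, driving ab3 from the code above looks like this (a short walkthrough, using next(3) as in step 4):

var iterator = ab3();           // nothing runs yet; we just get an iterator

console.log(iterator.next());   // logs 1, pauses at "yield 2"       -> { value: 2, done: false }
console.log(iterator.next(3));  // 3 is assigned to c, logs 2        -> { value: 4, done: true }
console.log(iterator.next());   // the generator is already finished -> { value: undefined, done: true }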
