DEV Community

Aaron Rose
The Secret Life of JavaScript: The Async Generator

How to handle streams of data with for await...of.


Timothy was rubbing his temples. On his screen was a function that looked like it had been fighting a losing battle.

async function getAllUsers() {
    let url = '/api/users?page=1';
    const allUsers = [];

    while (url) {
        const response = await fetch(url);
        const data = await response.json();

        // Add this page's users to our big list
        allUsers.push(...data.users);

        // Prepare for the next loop... if there is one
        url = data.nextPage; 
    }

    return allUsers;
}


"I'm trying to download all the user data," Timothy explained to Margaret. "But there are 50,000 users. If I wait for all the pages to download before I start processing them, the user waits for 20 seconds. It feels... stuck."

Margaret nodded. "You are treating a Stream like a Bucket," she said.

"You are trying to collect every single drop of water before you let anyone drink," she continued. "Why not let them drink from the hose?"

The Hybrid

Margaret wrote a new syntax on the board. It combined the two most powerful keywords in the language.

async function* fetchUsers() { ... }


"Async meets Generator," she said. "The async allows us to wait for the network. The * allows us to yield data one piece at a time."

She rewrote Timothy's code, adding a safety net.

async function* fetchUsers() {
    let url = '/api/users?page=1';

    while (url) {
        try {
            const response = await fetch(url);
            const data = await response.json();

            // Instead of building a massive array, we deliver this page immediately
            for (const user of data.users) {
                yield user;
            }

            url = data.nextPage;
        } catch (error) {
            console.error("Stream interrupted", error);
            return; // Stop the stream safely
        }
    }
}


Timothy looked at the code. "It looks similar," he admitted. "But how do I use it? The data isn't all there yet."

The Magic Loop (for await...of)

"This is where the magic happens," Margaret said. "We need a loop that knows how to wait."

She wrote the consumer code:

const userStream = fetchUsers();

for await (const user of userStream) {
    console.log("Processing:", user.name);
    // This loop automatically PAUSES while the next page downloads!
}


Timothy watched the console simulation.

  1. The loop prints 10 users instantly.
  2. The loop pauses (while the network fetches Page 2).
  3. The loop wakes up and prints 10 more users.

"The pause is invisible," Timothy whispered.

"Exactly," Margaret said. "The code inside the loop doesn't know it is waiting. It just asks for the next user, and JavaScript handles the pause. You aren't processing a Memory Snapshot; you are processing Time."

The Emergency Brake

"One last thing," Margaret added, lowering her voice to a whisper. "In the real world, streams can be endless. Sometimes the user navigates away before you are done."

"What do I do?"

"You use an AbortController," she said. "It allows you to cut the hose. Always design your streams so they can be stopped."

The Conclusion

Timothy deleted his allUsers array. He didn't need the bucket anymore.

"It feels lighter," Timothy said. "I'm not hoarding data."

"That is the Zen of the Async Generator," Margaret smiled. "Don't carry the weight of the future. Just handle what is in front of you, right now."


Aaron Rose is a software engineer and technology writer at tech-reader.blog and the author of Think Like a Genius.

Top comments (10)

alptekin I.

Great and mind-opening post. Thank you for sharing. I learned something, again.

Aaron Rose

πŸ™β€

Bhavin Sheth

This is a great explanation. I ran into this exact problem when working with a paginated API β€” I was waiting for everything to load before showing anything, and the UI felt slow. Switching to an async generator made a huge difference because I could process and show data immediately. Also +1 for mentioning AbortController β€” stopping the stream is something many examples forget, but it’s very important in real apps.

Aaron Rose

βœ¨πŸ™

Kai Alder

The "bucket vs stream" analogy really clicked for me. I used this exact pattern recently when building a log viewer that needed to tail thousands of entries from a remote API. One thing worth mentioning though - if you're yielding individual items from each page (like the for (const user of data.users) loop), you might want to consider yielding the whole page as a chunk instead, especially if your consumer can handle batches. Yielding one-by-one is cleaner but can add overhead when you've got thousands of items per page. Also curious if anyone's combined this with ReadableStream? Since Node 18+ has web streams built in, you can pipe an async generator straight into a ReadableStream which opens up some nice composition patterns.

Aaron Rose

βœ¨πŸ™β€οΈ

Nadim Mahmud

I have a question: does async data rerender when I click a button or on page load? It would be helpful if you could give me an example!

Aaron Rose

Great question, Nadim! πŸš€

You are touching on the most important part: How do we get this onto the screen?

The Async Generator doesn't update the screen automatically. Instead, it hands you data piece-by-piece inside the loop. You write the code inside that loop to update the UI (or "rerender") the moment a new piece of data arrives.

Here is a simple example of how you would hook this up to a Button Click:

const button = document.getElementById('load-btn');
const list = document.getElementById('user-list');

button.addEventListener('click', async () => {
  // 1. Start the stream when user clicks
  const userStream = fetchUsers();

  // 2. Loop through the stream
  for await (const user of userStream) {
    // 3. Render THIS user immediately!
    // We don't have to wait for the whole list.
    const item = document.createElement('li');
    item.innerText = user.name;
    list.appendChild(item);
  }
});


The Result: When the user clicks, they see the first item appear instantly, then the second, then the third. They don't have to wait for all 50,000 users to load. The "rerender" happens incrementally!

Pro Tip: For very fast streams (hundreds of items per second), you might want to batch updates using requestAnimationFrame or a micro-batching technique to avoid overwhelming the renderer.
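That micro-batching tip might look something like this (renderUsers is a made-up helper; it assumes a browser environment with requestAnimationFrame):

```javascript
// Collect streamed users and flush them to the DOM once per
// animation frame instead of once per item.
async function renderUsers(userStream, list) {
    let pending = [];
    let scheduled = false;

    const flush = () => {
        for (const user of pending) {
            const item = document.createElement('li');
            item.innerText = user.name;
            list.appendChild(item);
        }
        pending = [];
        scheduled = false;
    };

    for await (const user of userStream) {
        pending.push(user);
        if (!scheduled) {
            scheduled = true;                // at most one flush per frame
            requestAnimationFrame(flush);
        }
    }

    flush(); // render anything still pending when the stream ends
}
```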

Gregory Starr

I'll be adding this to my toolbox again, thanks for the refresher.

Aaron Rose

βœ¨πŸ™β€οΈ