JavaScript Event Loop: What I Got Wrong
The Moment Everything Clicked
I've been writing JavaScript for a while now. Async code, Promises, setTimeout — I use them almost every day. I thought I understood how they work.
Then I went down a rabbit hole.
I was watching Philip Roberts' talk "What the heck is the event loop anyway?" and somewhere around the 15-minute mark, something caught me off guard. I had always assumed the event loop was part of V8 — the JavaScript engine. Turns out, it's not. V8 doesn't know what an event loop is. It just executes code and manages the call stack. The event loop is the runtime's job — the browser's, or Node's.
That one thing unraveled a bunch of assumptions I didn't even know I was carrying.
Like — I thought setTimeout(fn, 0) runs basically immediately after the current code. It doesn't. I thought Promise callbacks and setTimeout callbacks were in the same queue. They're not. I thought I had a clear mental model of how async JS works. I didn't.
So I decided to actually sit down, go through the resources properly, and write it all out — because the best way I know to solidify something is to explain it.
This is that attempt. If we've had the same fuzzy understanding, hopefully this helps.
So — What Actually Is the Event Loop?
Let me start with the thing that tripped me up first.
When I thought "event loop," I thought "V8." It felt like it should be part of the engine — the thing that runs JavaScript. But V8 is actually much more focused than that.
What V8 Actually Does
V8's entire job is to take our JavaScript, compile it, and execute it on the Call Stack. It also manages memory through the Heap. That's the full scope of what it does.
It doesn't know what setTimeout is. It doesn't know what fetch is. It has no concept of timers, network requests, or DOM events. If we ran V8 completely standalone, none of those APIs would exist.
Where the Runtime Takes Over
All of that comes from the runtime — the environment V8 is embedded in. In the browser, that runtime is managed by the browser itself (Chrome uses something called Blink alongside V8). In Node.js, the runtime is built on top of a C library called libuv, which handles all the async I/O and the event loop mechanics.
So the actual picture: V8 owns the Call Stack and the Heap, and everything async — timers, network, the Event Loop itself — lives in the runtime around it.
The Event Loop's only job is to watch the Call Stack. The moment it's empty, the Event Loop steps in and decides what runs next. And this is where the rule that governs all async JavaScript lives.
Microtasks (Promise callbacks, queueMicrotask) always get priority over macrotasks (setTimeout, setInterval, I/O callbacks). Always. Even if the macrotask was queued first.
This is why this code does what it does:
```javascript
console.log('1');
setTimeout(() => console.log('2'), 0); // macrotask
Promise.resolve().then(() => console.log('3')); // microtask
console.log('4');
// Output: 1 → 4 → 3 → 2
```

setTimeout with a 0ms delay feels like it should run right away. But "0ms" just means "queue it as soon as possible" — it still goes into the Macrotask Queue. The Promise callback, being a microtask, jumps ahead of it.
I had definitely written code that depended on the wrong mental model of this ordering without realizing it.
Macrotasks and Microtasks — They're Not the Same Queue
Once I understood that V8 and the runtime are separate, the next thing that clicked was that there isn't just one queue. There are two — and they have very different priorities.
The Macrotask Queue is where callbacks from setTimeout, setInterval, and I/O operations land after their async work completes.
The Microtask Queue is where Promise callbacks live — anything we chain with .then(), .catch(), .finally(), or write after an await. queueMicrotask() and MutationObserver callbacks go here too.
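A quick way to see the two queues side by side is to put one callback in each and record the order they run in. A sketch (the labels are mine):

```javascript
const order = [];

setTimeout(() => order.push('macrotask (setTimeout)'), 0);      // Macrotask Queue
queueMicrotask(() => order.push('microtask (queueMicrotask)')); // Microtask Queue
Promise.resolve().then(() => order.push('microtask (.then)'));  // Microtask Queue
order.push('sync');

// Report after everything above has run.
setTimeout(() => console.log(order), 0);
// → ['sync', 'microtask (queueMicrotask)', 'microtask (.then)', 'macrotask (setTimeout)']
```

Both microtasks jump ahead of the setTimeout callback, even though it was queued first — and within each queue, callbacks run in the order they were queued.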
Step-by-Step Tracer
Let's trace through a slightly bigger example step by step: two setTimeout callbacks queued first, then a two-step Promise chain.
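For reference, the traced example looks roughly like this (a sketch; the labels are mine), with the run order collected into an array:

```javascript
const order = [];

setTimeout(() => order.push('timeout 1'), 0); // macrotask, queued first
setTimeout(() => order.push('timeout 2'), 0); // macrotask, queued second

Promise.resolve()
  .then(() => order.push('promise 1'))  // microtask
  .then(() => order.push('promise 2')); // microtask, queued by the one above

setTimeout(() => console.log(order), 0);
// → ['promise 1', 'promise 2', 'timeout 1', 'timeout 2']
```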
Both setTimeout callbacks were queued before the Promise chain even resolved — yet the Promise callback ran first. Because microtasks always drain completely before any macrotask gets a turn.
And it's not just one pass through the microtask queue. If a microtask queues another microtask, that also runs before any macrotask. The event loop doesn't move to macrotasks until the microtask queue is completely empty.
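A small sketch of that: a microtask that queues a second microtask still beats a timer that was queued earlier.

```javascript
const log = [];

setTimeout(() => log.push('macrotask'), 0); // queued before either microtask

Promise.resolve().then(() => {
  log.push('microtask 1');
  // Queued from inside a microtask — still runs before any macrotask.
  queueMicrotask(() => log.push('microtask 2'));
});

setTimeout(() => console.log(log), 0);
// → ['microtask 1', 'microtask 2', 'macrotask']
```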
The Starvation Problem
This has a consequence most people never think about. If a microtask keeps queuing another microtask, the macrotask queue never gets a turn:
```javascript
// This will STARVE the macrotask queue
function keepGoing() {
  Promise.resolve().then(keepGoing);
}
keepGoing();

setTimeout(() => console.log('this never runs'), 0);
```

async/await Is Just Microtasks
If we use async/await, we're already using microtasks — just with cleaner syntax. Everything after an await is the equivalent of a .then() callback:
```javascript
async function run() {
  console.log('A');
  await Promise.resolve();
  console.log('B'); // this is a microtask
}

console.log('1');
run();
console.log('2');
// Output: 1 → A → 2 → B
```

'B' doesn't run immediately after 'A' even though the Promise resolves instantly. The await suspends run(), lets the synchronous code ('2') finish, and then 'B' runs as a microtask. Once we see await as "pause here, queue the rest as a microtask," the output becomes completely predictable.
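One way to convince ourselves that await is just sugar: run() can be rewritten with an explicit .then(), and the ordering doesn't change. A sketch (runDesugared is my name, and I collect the order into an array instead of logging):

```javascript
const order = [];

// run() from above, with the await replaced by an explicit .then().
function runDesugared() {
  order.push('A');
  return Promise.resolve().then(() => {
    order.push('B'); // the microtask that await would have queued
  });
}

order.push('1');
runDesugared();
order.push('2');

// Log once the microtask queue has drained.
Promise.resolve().then(() => console.log(order)); // → ['1', 'A', '2', 'B']
```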
Why Any of This Matters in Real Code
Understanding the event loop isn't just an interview exercise. I've run into each of these in actual projects — and each time, having the right mental model was the difference between a 5-minute fix and two hours of confusion.
1. The Spinner That Never Spins
This one is classic. We want to show a loading indicator before running something expensive:
```javascript
setIsLoading(true);
const result = heavyComputation(); // takes 2 seconds
setIsLoading(false);
```

The spinner never appears. We stare at it, add a console.log, confirm setIsLoading(true) is being called — and it is. So why doesn't it show?
Because the call stack never empties between those two lines. The browser only gets a chance to paint between macrotasks. Our heavy computation is blocking the single thread entirely, so the UI has no opportunity to reflect the state change.
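We can even measure the blocking. In this sketch, a busy-wait loop stands in for heavyComputation(), and a 0ms timer can't fire until the loop releases the call stack:

```javascript
const start = Date.now();
let elapsed;

setTimeout(() => {
  elapsed = Date.now() - start;
  // Fires well after 0ms — the stack was never empty until the loop ended.
  console.log('0ms timer actually fired after ~' + elapsed + 'ms');
}, 0);

// Stand-in for heavyComputation(): block the thread for ~100ms.
while (Date.now() - start < 100) { /* busy wait */ }
```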
The fix is to deliberately yield to the browser:
```javascript
setIsLoading(true);
setTimeout(() => {
  const result = heavyComputation();
  setIsLoading(false);
}, 0);
```

That setTimeout(fn, 0) isn't really "run after 0ms" — it's "queue this as a macrotask, which lets the browser render first." Once we know how the event loop works, this stops being a weird trick and starts being an obvious tool.
2. async forEach — The Silent Bug
This one has caught a lot of developers off guard, including me:
```javascript
[1, 2, 3].forEach(async (num) => {
  await fetch(`/api/${num}`);
  console.log(num);
});
console.log('done');
// 'done' logs first — forEach doesn't await anything
```

forEach doesn't know anything about the Promise our async callback returns. It calls the function, gets a Promise back, and immediately moves on — it never awaits it. Each await fetch(...) suspends that specific callback's microtask chain, but forEach has already finished by then.
If we need to wait for all operations:
```javascript
await Promise.all([1, 2, 3].map(async (num) => {
  await fetch(`/api/${num}`);
  console.log(num);
}));
console.log('done'); // now this actually waits
```

3. Sequential await Is Silently Killing Our Performance
This is less of a bug and more of a habit that costs us without realizing:
```javascript
// Sequential — total time: 3 seconds
const user = await getUser();         // 1s
const posts = await getPosts();       // 1s
const comments = await getComments(); // 1s
```

Each await suspends the function and queues the rest as a microtask after the Promise resolves. The next await doesn't even start until the previous one is completely done. If these three calls don't depend on each other, we're wasting time:
```javascript
// Parallel — total time: 1 second
const [user, posts, comments] = await Promise.all([
  getUser(),
  getPosts(),
  getComments()
]);
```

Same result, a third of the wait time. This only clicks properly once we understand that await is just a microtask checkpoint — not a magic "go do this in the background" instruction.
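The difference is easy to measure with a timer-backed stand-in for the network calls (sleep is a hypothetical helper, not from the original; 50ms stands in for 1s):

```javascript
// Stand-in for getUser()/getPosts()/getComments(): resolves after ms.
const sleep = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

async function sequential() {
  const start = Date.now();
  await sleep(50, 'user');
  await sleep(50, 'posts');
  await sleep(50, 'comments');
  return Date.now() - start; // roughly 3 × 50ms
}

async function parallel() {
  const start = Date.now();
  await Promise.all([sleep(50, 'user'), sleep(50, 'posts'), sleep(50, 'comments')]);
  return Date.now() - start; // roughly 50ms — the waits overlap
}

sequential().then((s) =>
  parallel().then((p) => console.log({ sequentialMs: s, parallelMs: p })));
```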
4. Never Rely on setTimeout(fn, 0) Ordering
A subtle one. It's tempting to write something like:
```javascript
setTimeout(() => console.log('this should run first'), 0);
setTimeout(() => console.log('this should run second'), 0);
```

In practice this usually works in order — but the moment we add Promises, or the browser is under any load, macrotask ordering between separate setTimeout calls isn't something we want to depend on architecturally. If order matters, make it explicit in our code structure, not in timing assumptions.
The One Thing to Take Away
If I had to compress everything in this article into a single rule:
When the call stack is empty — drain all microtasks first, then pick one macrotask, then repeat.
That's the entire event loop. Every async quirk, every surprising output, every "why isn't my spinner showing" bug — it all traces back to that one rule.
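The "one macrotask, then drain microtasks, repeat" rhythm is easiest to see when a macrotask queues a microtask of its own. A sketch:

```javascript
const order = [];

setTimeout(() => {                 // macrotask 1
  order.push('macrotask 1');
  // Queued during macrotask 1 — drains before macrotask 2 gets a turn.
  Promise.resolve().then(() => order.push('its microtask'));
}, 0);

setTimeout(() => {                 // macrotask 2
  order.push('macrotask 2');
  console.log(order);
}, 0);
// → ['macrotask 1', 'its microtask', 'macrotask 2']
```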
But here's the deeper insight that changes how we think about JavaScript entirely:
The JS Engine (V8) is single-threaded, synchronous, one thing at a time on the call stack. No exceptions.
The Runtime — the Web APIs it provides (like setTimeout and fetch) run outside the JS thread entirely. The browser handles them in its own internal threads (written in C++). So when we call fetch(), the actual network request happens in the browser's networking layer — completely separate from our JS thread.
So it's not that the runtime makes JS parallel — our JS code never runs in parallel with itself. What the runtime does is offload the waiting part (waiting for a timer, waiting for a network response) to the outside world, and only brings the callback back into our JS thread when the call stack is free.
So the more precise conclusion is:
- JS execution — always single-threaded and synchronous
- The waiting/I/O work — offloaded to the runtime, runs outside JS
- The result — feels asynchronous, but our code itself never truly runs in parallel
Worth being precise about the word "parallel" — in interviews, that distinction matters a lot. And in debugging, it's the difference between understanding the bug and just staring at the screen.
What's Next
This is the first article in a series I'm writing as I go deeper into JS engine internals. Next up: how V8 actually compiles our JavaScript — parsing, the AST, Ignition, TurboFan, and why the code we write affects how well V8 can optimize it.
If that sounds interesting, follow along — I'm learning this in public and writing it up as I go.
Find me on Twitter, GitHub, or LinkedIn.
References
These two talks shaped my understanding more than anything else I've read. If you haven't watched them yet, start here:
- What the heck is the event loop anyway? — Philip Roberts (JSConf EU 2014). The talk that started it all for me. He built Loupe, an incredible visual tool to see the event loop in action.
- JavaScript Visualized — Event Loop, Web APIs, (Micro)task Queue — Lydia Hallie. Beautifully animated breakdown of microtasks vs macrotasks. Her visual style of explaining async JavaScript is unmatched.