Algorithms - 15. Heap (Priority Queue)
When to use Heap
Use a heap when you need to efficiently maintain the minimum or maximum element while frequently adding or removing elements. While most commonly used for priority queues, heaps can be adapted for:
- Priority queues (classic use case)
- Task scheduling by priority or deadline
- Dijkstra's shortest path algorithm
- Finding k-th largest/smallest element in a stream
- Heap sort algorithm for in-place sorting
- Median tracking using two heaps (min-heap and max-heap; see the sketch after this list)
- Event simulation where events are processed by time
- Any problem where you repeatedly need the current minimum/maximum after insertions and deletions
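To illustrate the two-heap median pattern mentioned above, here is a minimal sketch. It assumes the MinHeap class defined in the Code section below and uses the negation trick (also shown later) for max-heap behavior; MedianTracker and its method names are hypothetical, chosen for illustration only.
// Streaming median via two heaps (a sketch, not production code)
class MedianTracker {
  constructor() {
    this.low = new MinHeap(); // lower half, stored negated (max-heap behavior)
    this.high = new MinHeap(); // upper half, stored as-is (min-heap)
    this.lowSize = 0;
    this.highSize = 0;
  }
  add(value) {
    // Route the value to the correct half
    if (this.lowSize === 0 || value <= -this.low.peek()) {
      this.low.push(-value);
      this.lowSize++;
    } else {
      this.high.push(value);
      this.highSize++;
    }
    // Rebalance so the halves differ in size by at most 1
    if (this.lowSize > this.highSize + 1) {
      this.high.push(-this.low.pop());
      this.lowSize--;
      this.highSize++;
    } else if (this.highSize > this.lowSize + 1) {
      this.low.push(-this.high.pop());
      this.highSize--;
      this.lowSize++;
    }
  }
  median() {
    if (this.lowSize === this.highSize) {
      return (-this.low.peek() + this.high.peek()) / 2;
    }
    return this.lowSize > this.highSize ? -this.low.peek() : this.high.peek();
  }
}
Each add costs O(log n), while median is O(1), which is exactly the trade-off heaps are built for.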
Example problem
Task Scheduler: Given tasks with priorities [30, 10, 50, 20, 40, 5, 25], efficiently:
- Always process the highest priority task first
- Add new urgent tasks that jump to the front of the queue
- Remove completed tasks from the queue
After building a max-heap:
- First task to process: 50 (highest priority)
- After removing 50 and adding priority 60: next task is 60
- After removing 60: next task is 40
Answer: The heap maintains priority ordering, allows O(log n) insertion of urgent tasks, and O(log n) removal of the highest priority task.
What Heap Actually Does
A heap is a complete binary tree that maintains the heap property: every parent node has a value less than or equal to its children (min-heap) or greater than or equal to its children (max-heap). It provides exactly three core operations:
- PUSH: Add a new element while maintaining heap property
- POP: Remove and return the minimum/maximum element
- PEEK: View the minimum/maximum without removing it (O(1))
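To make the three operations concrete, here is a minimal usage sketch (it assumes the MinHeap class from the Code section below):
const pq = new MinHeap(); // class defined in the Code section below
pq.push(30);
pq.push(10);
pq.push(20);
console.log(pq.peek()); // 10 (smallest value, not removed)
console.log(pq.pop()); // 10 (smallest value, removed)
console.log(pq.peek()); // 20 (the new minimum)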
Think of it like a tournament bracket system:
- Min-heap: The champion (root) is the player with the lowest score (golf tournament)
- Max-heap: The champion (root) is the player with the highest score (basketball tournament)
- Each match winner advances up until reaching the top
How the Operations Work
PUSH Operation - "Add a new element"
When you call push(15), it:
- Places the element at the next available spot (maintaining complete tree)
- Percolates up: Swaps with parent if it violates heap property
- Continues swapping upward until heap property is restored
- Stops upon reaching the root, or when the parent is smaller than or equal to the new element (min-heap)
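A worked trace of push(15), using the example min-heap array from the Array Representation section below:
// push(15) into the 1-indexed min-heap [_, 10, 30, 20, 40, 50, 70, 60]
// 1. Append at index 8:               [_, 10, 30, 20, 40, 50, 70, 60, 15]
// 2. Parent of 8 is index 4 (value 40): 15 < 40 → swap
//                                      [_, 10, 30, 20, 15, 50, 70, 60, 40]
// 3. Parent of 4 is index 2 (value 30): 15 < 30 → swap
//                                      [_, 10, 15, 20, 30, 50, 70, 60, 40]
// 4. Parent of 2 is index 1 (value 10): 15 > 10 → stop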
POP Operation - "Remove the minimum/maximum"
When you call pop(), it:
- Saves the root value (this is what we return)
- Moves the last element to the root position
- Percolates down: Swaps with smaller child (min-heap) if property violated
- Continues swapping downward until heap property is restored
- Returns the saved root value
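And the matching trace for pop() on the same example heap:
// pop() from the min-heap [_, 10, 30, 20, 40, 50, 70, 60]
// 1. Save the root: 10 (the value we will return)
// 2. Move the last element (60) to the root: [_, 60, 30, 20, 40, 50, 70]
// 3. Children of index 1 are 30 and 20; smaller child is 20: 60 > 20 → swap
//                                            [_, 20, 30, 60, 40, 50, 70]
// 4. 60 is now at index 3; its only child is 70 (index 6): 60 < 70 → stop
// 5. Return 10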
Array Representation Magic
The brilliant insight of heaps is representing a tree using an array with simple index arithmetic:
// For 1-indexed array (index 0 unused for simpler math):
parentIndex = Math.floor(childIndex / 2);
leftChildIndex = parentIndex * 2;
rightChildIndex = parentIndex * 2 + 1;

// This creates an implicit tree structure:
// Array: [0, 10, 30, 20, 40, 50, 70, 60]
//         ^ skipped placeholder
//
// Tree visualization:
//                10 (index 1)
//               /            \
//          30 (2)            20 (3)
//          /    \            /    \
//      40 (4)  50 (5)    70 (6)  60 (7)
The beauty of heaps is that maintaining one simple property (parent ≤ children for min-heap) automatically keeps the minimum at the root!
Single most important insight
The heap property only requires local parent-child relationships - and by maintaining just this simple local constraint through percolation, we automatically maintain a global property: the minimum/maximum is always at the root, accessible in O(1) time.
Mental note: The algorithm's genius lies in using a complete binary tree stored in an array. At each operation, we only need to fix violations locally: bubble up when inserting (child might be smaller than parent) or bubble down when removing (new root might be larger than children). These simple local fixes automatically maintain the global minimum/maximum at the root.
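One way to internalize this: the entire heap invariant can be verified by checking only parent-child pairs. A minimal sketch (isMinHeap is a hypothetical helper, not part of the class below):
// Returns true if a 1-indexed array (placeholder at index 0) is a valid min-heap
function isMinHeap(heap) {
  for (let parentIndex = 1; parentIndex * 2 < heap.length; parentIndex++) {
    const leftIndex = parentIndex * 2;
    const rightIndex = leftIndex + 1;
    if (heap[leftIndex] < heap[parentIndex]) return false;
    if (rightIndex < heap.length && heap[rightIndex] < heap[parentIndex]) return false;
  }
  return true;
}
console.log(isMinHeap([0, 10, 30, 20, 40, 50, 70, 60])); // true
console.log(isMinHeap([0, 10, 30, 5, 40, 50, 70, 60])); // false (5 < parent 10)
No global comparison is ever needed; the local checks imply the global minimum at the root.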
Decision Framework
Heap operations involve two key decisions:
During Push (percolate up):
- Compare with parent
- If violates heap property: swap and continue upward
- If satisfies heap property: stop
During Pop (percolate down):
- Compare with both children
- If violates heap property: swap with smaller child (min-heap) and continue downward
- If satisfies heap property: stop
Code
class MinHeap {
  constructor() {
    // Use 1-indexed array for simpler parent/child calculations
    // Index 0 is unused (could store size or act as sentinel)
    this.heap = new Array();
    this.heap.push(0); // Placeholder at index 0
  }

  push(valueToInsert) {
    // Step 1: Add element at the end (maintaining complete tree)
    this.heap.push(valueToInsert);
    // Step 2: Percolate up - bubble the new value up until heap property is satisfied
    this.percolateUp();
  }

  percolateUp() {
    // Start from the last element (just inserted)
    let childIndex = this.heap.length - 1;
    // Keep bubbling up while we're not at root
    while (childIndex > 1) {
      // Calculate parent position
      const parentIndex = Math.floor(childIndex / 2);
      // Get the actual values for comparison
      const childValue = this.heap[childIndex];
      const parentValue = this.heap[parentIndex];
      // Check if heap property is violated
      const childIsSmallerThanParent = childValue < parentValue;
      // DECISION FRAMEWORK: Should we swap with parent?
      if (childIsSmallerThanParent) {
        // Swap child and parent to fix violation
        this.swapElements(childIndex, parentIndex);
        // Move up to parent's position for next iteration
        childIndex = parentIndex;
      } else {
        // Heap property is satisfied - stop here
        break;
      }
    }
  }

  pop() {
    // Handle empty heap
    const isEmpty = this.heap.length === 1;
    if (isEmpty) {
      return -1; // Or throw error
    }
    // Handle single element heap
    const hasOnlyOneElement = this.heap.length === 2;
    if (hasOnlyOneElement) {
      return this.heap.pop();
    }
    // Step 1: Save the minimum value (at root) to return later
    const minimumValue = this.heap[1];
    // Step 2: Move last element to root position
    const lastElement = this.heap.pop();
    this.heap[1] = lastElement;
    // Step 3: Percolate down - bubble root down until heap property is satisfied
    this.percolateDown(1);
    return minimumValue;
  }

  percolateDown(startIndex) {
    let parentIndex = startIndex;
    while (true) {
      // Calculate children positions
      const leftChildIndex = 2 * parentIndex;
      const rightChildIndex = 2 * parentIndex + 1;
      // Check if we've reached a leaf node
      const isLeafNode = leftChildIndex >= this.heap.length;
      if (isLeafNode) {
        break;
      }
      // Get parent value
      const parentValue = this.heap[parentIndex];
      // Get left child value (we know it exists)
      const leftChildValue = this.heap[leftChildIndex];
      // Check if right child exists and get its value
      const rightChildExists = rightChildIndex < this.heap.length;
      const rightChildValue = rightChildExists
        ? this.heap[rightChildIndex]
        : Infinity; // Use Infinity as placeholder for non-existent child
      // Find the smaller child
      const leftChildIsSmaller = leftChildValue <= rightChildValue;
      // DECISION FRAMEWORK: Which child should we compare with?
      const smallerChildIndex = leftChildIsSmaller
        ? leftChildIndex
        : rightChildIndex;
      const smallerChildValue = leftChildIsSmaller
        ? leftChildValue
        : rightChildValue;
      // Check if heap property is violated
      const parentIsLargerThanChild = parentValue > smallerChildValue;
      // DECISION FRAMEWORK: Should we swap with the smaller child?
      if (parentIsLargerThanChild) {
        // Swap parent with smaller child to fix violation
        this.swapElements(parentIndex, smallerChildIndex);
        // Move down to child's position for next iteration
        parentIndex = smallerChildIndex;
      } else {
        // Heap property is satisfied - stop here
        break;
      }
    }
  }

  swapElements(index1, index2) {
    // Swap two elements in the heap array
    const temp = this.heap[index1];
    this.heap[index1] = this.heap[index2];
    this.heap[index2] = temp;
  }

  peek() {
    // Return minimum without removing it
    const hasElements = this.heap.length > 1;
    return hasElements ? this.heap[1] : -1;
  }

  // Build heap from array in O(n) time
  heapify(inputArray) {
    // Initialize heap with dummy element at index 0, then all input elements
    this.heap = [0, ...inputArray];
    // Find the last non-leaf node (parent of the last element)
    const lastElementIndex = this.heap.length - 1;
    const lastNonLeafIndex = Math.floor(lastElementIndex / 2);
    // Percolate down from each non-leaf node, starting from bottom
    for (let nodeIndex = lastNonLeafIndex; nodeIndex >= 1; nodeIndex--) {
      this.percolateDown(nodeIndex);
    }
  }
}
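A quick usage sketch of heapify, building a valid min-heap from the example priorities in a single O(n) pass (demoHeap is an arbitrary name):
// Build a heap from an unsorted array in one pass
const demoHeap = new MinHeap();
demoHeap.heapify([30, 10, 50, 20, 40, 5, 25]);
console.log(demoHeap.peek()); // 5 (current minimum)
console.log(demoHeap.pop()); // 5
console.log(demoHeap.peek()); // 10 (next minimum)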
// === EXAMPLE USAGE: Task Scheduler Problem ===
// We want to process tasks by priority (highest priority first)
// Since MinHeap gives us minimum first, we'll negate priorities
function demonstrateTaskScheduler() {
  const scheduler = new MinHeap();
  // Initial tasks with priorities
  const taskPriorities = [30, 10, 50, 20, 40, 5, 25];
  console.log("=== Task Scheduler Demo ===\n");
  console.log("Adding tasks to scheduler:");
  // Add all tasks (negate to convert min-heap to max-heap behavior)
  for (const priority of taskPriorities) {
    const negatedPriority = -priority; // Trick: negate for max-heap behavior
    scheduler.push(negatedPriority);
    console.log(`  Added task with priority ${priority}`);
  }
  // Check highest priority without removing
  const highestPriority = -scheduler.peek();
  console.log(`\nHighest priority task: ${highestPriority}`);
  // Process the highest priority task
  const firstTaskPriority = -scheduler.pop();
  console.log(`Processing task with priority ${firstTaskPriority}`);
  // Add an urgent task
  const urgentTaskPriority = 60;
  console.log(`\nAdding urgent task with priority ${urgentTaskPriority}`);
  scheduler.push(-urgentTaskPriority);
  // Process all remaining tasks in priority order
  console.log("\nProcessing remaining tasks in priority order:");
  // Caution: peek() returns -1 for an empty heap, which is indistinguishable
  // from a stored value of -1 (i.e., a task with priority 1). That never
  // happens with these priorities, but a production version should track size.
  while (scheduler.peek() !== -1) {
    const taskPriority = -scheduler.pop();
    console.log(`  Processing task with priority ${taskPriority}`);
  }
}
// Run the demonstration
demonstrateTaskScheduler();
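For reference, the processing order the demo prints is fully determined by the heap:
// Expected processing order:
// Highest priority task: 50
// Processing task with priority 50
// (urgent task with priority 60 added)
// Remaining tasks, in order: 60, 40, 30, 25, 20, 10, 5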
Doubt buster
Common doubt: "Why use an array starting at index 1 instead of 0? Isn't this wasting space and going against normal programming conventions?"
This is a reasonable concern - most data structures use 0-indexed arrays to avoid waste. Let's explore why heaps often break this convention.
Why this seems wrong
Starting arrays at index 1 appears inefficient:
- Wastes one array slot (index 0 unused)
- Goes against standard 0-indexing convention
- Seems like lazy programming
Many programmers' first instinct is to "fix" this by using 0-indexing.
The mathematical elegance of 1-indexing
With 1-indexed arrays, the parent-child relationships are beautifully simple:
// 1-indexed (elegant)
parent = Math.floor(i / 2)
leftChild = 2 * i
rightChild = 2 * i + 1
// Example: Node at index 5
parent = floor(5/2) = 2 ✓ Simple division
leftChild = 2*5 = 10 ✓ Simple multiplication
rightChild = 2*5+1 = 11 ✓ Simple multiplication + 1
With 0-indexed arrays, the formulas become more complex:
// 0-indexed (complex)
parent = Math.floor((i - 1) / 2)
leftChild = 2 * i + 1
rightChild = 2 * i + 2
// Example: Node at index 4 (same node as above)
parent = floor((4-1)/2) = 1 // Requires subtraction first
leftChild = 2*4+1 = 9 // Off-by-one adjustment
rightChild = 2*4+2 = 10 // Different off-by-one adjustment
Why the simpler math matters
1. Fewer operations = faster code
// 1-indexed: one operation for parent
parentIndex = currentIndex >> 1; // Bit shift (super fast)
// 0-indexed: multiple operations
parentIndex = (currentIndex - 1) >> 1; // Subtraction + bit shift
2. Less error-prone
The 0-indexed formulas have different adjustments (+1 for left, +2 for right, -1 for parent), making off-by-one errors more likely. The 1-indexed formulas have a consistent pattern.
3. Easier to reason about
With 1-indexing, the relationships form a clear pattern:
- Node n has children at 2n and 2n+1
- Both of those children have their parent at n
- It's immediately obvious that floor(5/2) = 2, and that 2*2 = 4 and 2*2+1 = 5
The binary representation insight
1-indexed heaps have a beautiful property in binary:
Index 1: 0001 (root)
Index 2: 0010 (left child of 1)
Index 3: 0011 (right child of 1)
Index 4: 0100 (left child of 2)
Index 5: 0101 (right child of 2)
Index 6: 0110 (left child of 3)
Index 7: 0111 (right child of 3)
Pattern: Parent is child with rightmost bit removed!
Child 5 (0101) → Parent 2 (010)
Child 7 (0111) → Parent 3 (011)
This pattern doesn't exist with 0-indexing!
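A tiny sketch that makes the pattern visible (an illustrative loop, not part of the heap class); dropping the rightmost bit is exactly a right shift by one:
for (let i = 2; i <= 7; i++) {
  const binary = i.toString(2).padStart(4, "0");
  console.log(`Index ${i} (${binary}) → parent ${i >> 1}`);
}
// Index 2 (0010) → parent 1
// Index 3 (0011) → parent 1
// Index 4 (0100) → parent 2
// Index 5 (0101) → parent 2
// Index 6 (0110) → parent 3
// Index 7 (0111) → parent 3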
Why wasting one element doesn't matter
Memory perspective:
- Modern systems allocate memory in chunks (often 16+ bytes)
- One unused integer (4-8 bytes) is negligible
- Array growth typically doubles capacity anyway
- The cleaner code prevents bugs that would waste far more resources
Practical example:
// 1-indexed: 1001 elements need array of size 1002
// 0-indexed: 1001 elements need array of size 1001
// Difference: 0.1% memory "waste"
// But the 1-indexed version may be marginally faster, because every
// parent/child index calculation needs one fewer arithmetic operation,
// which can add up inside hot loops (e.g., heap sort or Dijkstra)
Real-world validation
In practice, implementations split on this choice, and the trade-off is well known:
- Python's heapq uses 0-indexing but notes the formula complexity
- Java's PriorityQueue uses 0-indexing but has more complex parent/child methods
- Most algorithm textbooks teach 1-indexed heaps for clarity
- Competition programmers often prefer 1-indexed for fewer bugs
When to use each approach
Use 1-indexing when:
- Teaching or learning heap concepts
- Code clarity is paramount
- Performance is critical (simpler math)
- Working with algorithm competitions
Use 0-indexing when:
- Integrating with existing 0-indexed systems
- Memory is extremely constrained (embedded systems)
- Following team/language conventions
The hidden cost of 0-indexing
Consider this real bug that's easy to make with 0-indexing:
// 0-indexed heap - spot the bug!
class Heap {
  constructor() {
    this.heap = [];
  }
  parentIndex(i) {
    return Math.floor(i - 1 / 2); // BUG! Should be (i - 1) / 2
  }
}
// This compiles and runs but gives wrong results!
// With 1-indexing, Math.floor(i/2) is harder to mess up
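To see the operator-precedence bug concretely (i = 5 is an arbitrary example):
const i = 5;
console.log(Math.floor(i - 1 / 2)); // 4 (wrong: parses as i - (1 / 2) = 4.5)
console.log(Math.floor((i - 1) / 2)); // 2 (the intended parent index)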
The verdict
The "wasted" array element in 1-indexed heaps is a deliberate trade-off:
- Trade: One array slot (negligible memory)
- Gain: Simpler formulas, fewer bugs, faster operations, cleaner code
It's not lazy programming - it's an optimization that prioritizes correctness and performance over a trivial memory saving. The mathematical elegance alone justifies the approach, and the performance benefits seal the deal.