recursion.guide
A FIELD GUIDE · NO. 01

Recursion,
slowly.

A patient, visual tour of how functions call themselves — with code in Python, Java, and Go, hand-drawn call trees, quizzes between lessons, and honest notes on performance. No prior experience assumed.

01
CHAPTER ONE

What is recursion, really?

Recursion is when a function calls itself to solve a smaller version of the same problem.

Think of Russian nesting dolls. Each doll holds a smaller doll inside, which holds an even smaller doll, and so on — until you reach the tiniest one that can't be opened. That smallest doll is where the "unwrapping" stops.

In code, a function that calls itself works the same way. Each call tackles a slightly simpler problem, passing the smaller piece "inward." Eventually, we hit a case so simple it can be answered directly — and then everything unwinds back out.

◆ MENTAL MODEL

Every recursive problem has two parts: "How do I handle the tiniest case?" and "How do I turn a big problem into a smaller version of itself?"
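Those two questions map directly onto code. Summing a list, for instance (a minimal sketch, not from the guide's lessons):

```python
def total(nums):
    if not nums:                       # tiniest case: an empty list sums to 0
        return 0
    return nums[0] + total(nums[1:])   # big problem = first item + smaller list

print(total([3, 1, 4, 1, 5]))  # 14
```

The `if not nums` branch answers the first question; the `nums[1:]` call answers the second.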

FIG. 1 · NESTED CALLS
f(4) → f(3) → f(2) → f(1) → base case → return 1
02
CHAPTER TWO

Anatomy of a recursive function

Every recursive function — in every programming language — contains exactly two ingredients. Miss either one and things go badly wrong.

INGREDIENT 01

The base case

A condition so simple the function can return an answer without calling itself. This is the "stop" signal.

Without it: infinite recursion. The program calls itself forever until the call stack runs out of memory and crashes. (You'll see StackOverflowError in Java, RecursionError in Python, or fatal error: stack overflow in Go.)

INGREDIENT 02

The recursive case

The part where the function calls itself with a smaller or simpler input, moving toward the base case.

Without progress: if each call doesn't actually get closer to the base case, you again get infinite recursion. "Smaller" must mean something concrete — smaller number, shorter list, fewer nodes, etc.

THE SIMPLEST POSSIBLE EXAMPLE · COUNTDOWN

Print numbers from n down to 1. Spot the base case and the recursive call; the pattern is identical in every language. Here it is in Python:

def countdown(n):
    if n <= 0:           # ← base case: stop here
        return
    print(n)
    countdown(n - 1)   # ← recursive call with smaller input

countdown(3)  # prints: 3, 2, 1
03
CHAPTER THREE

Your first walkthrough: factorial

Factorial is the gentlest recursion to trace by hand. We write n! (read "n factorial") for the product of all whole numbers from 1 up to n. So 4! = 4 × 3 × 2 × 1 = 24.

def factorial(n):
    if n <= 1:
        return 1                    # base case
    return n * factorial(n - 1)    # recursive case

print(factorial(4))  # 24
NOW LET'S TRACE factorial(4) BY HAND

Each line shows what the computer is thinking at that moment.

> Call factorial(4): 4 > 1, so compute 4 × factorial(3)
> Call factorial(3): 3 > 1, so compute 3 × factorial(2)
> Call factorial(2): 2 > 1, so compute 2 × factorial(1)
> Call factorial(1): base case! Return 1
> factorial(2) resumes: 2 × 1 = 2
> factorial(3) resumes: 3 × 2 = 6
> factorial(4) resumes: 4 × 6 = 24
◆ THE BIG INSIGHT

Notice the two phases: first the function keeps calling itself deeper and deeper (the "winding up" phase), suspending each call mid-calculation. Then, once it hits the base case, results unwind back out, each multiplication finally happening on the way back up.
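You can watch the two phases for yourself by printing an indented line on the way in and on the way out (a throwaway tracing sketch, not part of the guide's factorial):

```python
def factorial(n, depth=0):
    pad = "  " * depth
    print(f"{pad}-> call factorial({n})")              # winding up
    if n <= 1:
        print(f"{pad}<- base case: return 1")
        return 1
    result = n * factorial(n - 1, depth + 1)
    print(f"{pad}<- factorial({n}) returns {result}")  # unwinding
    return result

factorial(4)
```

The arrows march inward, hit the base case, then march back out with the partial products.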

04
CHAPTER FOUR

The call stack, visualized

Every time a function is called, the computer adds a little "note" to a pile called the call stack. The note remembers: which function, which inputs, and where to come back to when done.

Recursion piles these notes up. factorial(4) can't finish until factorial(3) returns, which can't finish until factorial(2) returns, and so on. When the base case finally returns a real value, the stack unwinds — each function finishes its multiplication, pops off, and hands the answer to the one below it.

FIG. 2 · THE CALL STACK GROWS, THEN SHRINKS
Winding up (calls stack up): factorial(4) → factorial(3) → factorial(2) → factorial(1) ← base case hit!
Unwinding (returns pop off): factorial(1) returns 1, then 2 × 1 = 2, then 3 × 2 = 6, then 4 × 6 = 24.
The stack grows during the winding phase, then shrinks as each call returns and pops off.
◆ WHY THIS MATTERS

The call stack has a fixed size limit. Too many nested calls → crash. Python defaults to about 1000 recursion levels. Java's limit depends on thread stack size (typically 512KB–1MB, giving thousands of frames). Go grows the stack dynamically, so it handles deeper recursion but isn't infinite. We'll return to this in Chapter 7.
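You can bump into Python's limit deliberately (a small demonstration sketch; the exact limit printed may vary by environment):

```python
import sys

def down(n):
    if n == 0:
        return "done"
    return down(n - 1)

print(sys.getrecursionlimit())   # typically 1000 in CPython

print(down(500))                 # comfortably within the limit

try:
    down(100_000)                # far past the limit
except RecursionError:
    print("hit Python's recursion limit")
```

Catching RecursionError like this is for demonstration only; in real code the fix is a loop or a bigger-picture redesign.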

05
CHAPTER FIVE

Decoding your Fibonacci function

Here's the function under the microscope. The Python version below reads nearly line-for-line the same in Java and Go:

def f(n):
    if n <= 1:
        return n
    return f(n - 1) + f(n - 2)

print(f(4))  # 3
STEP 01

Identify the base cases

When n <= 1, return n. This means f(0) = 0 and f(1) = 1. No recursion happens; the function returns immediately.

STEP 02

Identify the recursive case

For any n > 1, return f(n-1) + f(n-2). Two recursive calls. Each gets smaller, so both are guaranteed to eventually hit the base case.

STEP 03

Recognize the pattern

Each number is the sum of the two before it: 0, 1, 1, 2, 3, 5, 8, 13, 21... — the Fibonacci sequence.
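You can confirm the pattern by printing the first few values (redefining f here so the snippet stands alone):

```python
def f(n):
    if n <= 1:
        return n               # base cases: f(0) = 0, f(1) = 1
    return f(n - 1) + f(n - 2)

print([f(i) for i in range(9)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21]
```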

FIG. 3 · THE FULL CALL TREE FOR f(4)
f(4) splits into f(3) and f(2). f(3) splits into f(2) and f(1); each f(2) splits into f(1) and f(0). The base cases return immediately (f(1) = 1, f(0) = 0); every other call waits for its two children to finish. f(4) = f(3) + f(2) = 2 + 1 = 3.
◆ ANSWER

f(4) returns 3. Trace back up: the five base cases return 1, 1, 0, 1, 0. Adding them up the tree gives f(2) = 1, f(3) = 2, and finally f(4) = 2 + 1 = 3.

06
CHAPTER SIX · CHECKPOINT

Quiz break

Four questions. No pressure. Work out an answer to each before reading on, then check your reasoning against the relevant chapter.

QUESTION 01

What does factorial(0) return, using the Chapter 3 code?

QUESTION 02

What happens if you remove the base case from a recursive function?

QUESTION 03

Using the Fibonacci function from Chapter 5, what does f(5) return?

QUESTION 04

Why does Fibonacci need TWO base cases (n=0 AND n=1) instead of just one?

§ § §
07
CHAPTER SEVEN · ADVANCED

When recursion gets slow

The naive Fibonacci function we just wrote is painfully slow. Try f(45) — you might wait 10 seconds. Try f(50) and you could wait minutes. Why?

Look at the call tree again. To compute f(4), we ended up computing f(2) twice and f(1) three times. The redundancy explodes as n grows. Computing f(30) makes more than a million calls. Computing f(50) makes over 20 billion.
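One way to see the explosion is to count the calls directly (a quick measurement sketch, not part of the guide's code):

```python
calls = 0

def f(n):
    global calls
    calls += 1                 # count every call, including base cases
    if n <= 1:
        return n
    return f(n - 1) + f(n - 2)

f(20)
print(calls)  # 21891 calls just to compute f(20)
```

Try nudging n upward and watch the count roughly double with each step.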

FIG. 4 · REDUNDANT WORK HIGHLIGHTED
In the call tree for f(5), f(3) is computed twice: once under f(5) itself and once under f(4). Across the full tree, f(2) is computed three times and f(1) five times. This redundancy is exponential: O(2ⁿ).
BIG-O COMPLEXITY FOR COMMON RECURSIVE PATTERNS
Pattern · Time · Space (stack) · Example
Linear (one call) · O(n) · O(n) · factorial, countdown, sum of list
Linear + memo · O(n) · O(n) · memoized Fibonacci
Divide & conquer · O(n log n) · O(log n) · merge sort, quicksort avg.
Binary search · O(log n) · O(log n) · searching a sorted array
Naive double-branching · O(2ⁿ) · O(n) · naive Fibonacci, naive subset-sum
Exhaustive permutations · O(n!) · O(n) · generating all orderings
◆ THE RULE OF THUMB

If a recursive function makes two or more calls on itself in the recursive case, and those calls can revisit the same subproblem, you probably have an exponential runtime problem. The fix is usually memoization — coming up next.

08
CHAPTER EIGHT · ADVANCED

Memoization & dynamic programming

Memoization is a fancy word for a simple idea: cache the answer to each subproblem the first time you compute it, so you never redo work. With one line of caching, Fibonacci goes from minutes to microseconds.

from functools import lru_cache

# Idiomatic Python: one decorator, done.
# lru_cache remembers every (input → output) pair.
@lru_cache(maxsize=None)
def f(n):
    if n <= 1:
        return n
    return f(n - 1) + f(n - 2)

print(f(100))  # instant — 354224848179261915075
BEFORE MEMO

O(2ⁿ) time

f(45) makes about 3.7 billion calls; f(50) about 40 billion. f(60) could run for hours.

AFTER MEMO

O(n) time

f(1000) returns in microseconds. Each n is computed exactly once.
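The decorator hides the cache. Writing it by hand makes the mechanism explicit (an equivalent sketch using a plain dict, not the guide's lru_cache version):

```python
def fib(n, memo=None):
    if memo is None:
        memo = {}
    if n in memo:                  # seen this subproblem before: reuse it
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]

print(fib(100))  # 354224848179261915075
```

Same answer, same O(n) behavior; the dict plays exactly the role lru_cache plays behind the scenes.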

◆ BONUS: THE ITERATIVE VERSION

For Fibonacci specifically, you don't even need recursion. A simple loop uses constant memory and is the fastest option:

# Python — O(n) time, O(1) space
def f(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

Sometimes the best use of recursion is to recognize when not to use it. Learning to translate recursive thinking into iterative code is a core skill.

09
CHAPTER NINE · ADVANCED

Tail recursion across languages

A recursive call is tail recursive if it's the very last thing the function does — nothing happens after it returns. Some languages can optimize this into a loop, freeing up stack space.

NOT TAIL RECURSIVE
def factorial(n):
    if n <= 1: return 1
    return n * factorial(n-1)
    # ↑ the multiplication happens AFTER the recursive call returns
TAIL RECURSIVE
def factorial(n, acc=1):
    if n <= 1: return acc
    return factorial(n-1, n*acc)
    # ↑ the call IS the return. Nothing after.
LANGUAGE SUPPORT — A HARD REALITY
Language · Tail-call optimization? · What to do instead
Python · ❌ No (by design; Guido's decision) · Use a while loop for deep recursion
Java · ❌ No (the JVM doesn't optimize it) · Iterative loops, or trampolining as a workaround
Go · ❌ Not guaranteed · Loops; Go's dynamic stack helps but won't save you at extreme depth
Scala, Kotlin, Scheme, Haskell · ✓ Yes (Scala needs @tailrec, Kotlin tailrec) · Write tail-recursive form freely
◆ TAKEAWAY

In Python, Java, and Go, tail recursion gives you cleaner code but no stack savings. If you need to recurse deeper than ~1000 levels (Python) or many thousands (Java/Go), rewrite as a loop instead. Don't rely on tail-call optimization that isn't there.
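The tail-recursive factorial above translates mechanically into such a loop: the accumulator becomes a local variable and each recursive call becomes an update (a sketch of that translation):

```python
def factorial(n):
    acc = 1                # plays the role of the acc parameter
    while n > 1:           # the base-case test, inverted
        acc = n * acc      # exactly what the recursive call passed along
        n = n - 1
    return acc

print(factorial(5))  # 120
```

This translation works for any tail-recursive function, which is precisely what tail-call-optimizing compilers do for you automatically.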

10
CHAPTER TEN · ADVANCED

Real-world patterns

Fibonacci and factorial are training wheels. In actual code, recursion shines for problems that are naturally hierarchical: file systems, JSON, HTML trees, binary search, sorting, compilers, and puzzle-solving.

FIG. 5 · A BINARY TREE IS RECURSION MADE VISIBLE
The tree: 10 at the root, with children 5 and 15; 5 has children 3 and 7; 15 has children 12 and 20. Each node is either a leaf (base case) or has children (recursive case). Sum the tree: sum(node) = node.value + sum(left) + sum(right).
EXAMPLE: SUM ALL VALUES IN A BINARY TREE
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def tree_sum(node):
    if node is None:
        return 0                    # base case: empty
    return node.val + tree_sum(node.left) + tree_sum(node.right)
EXAMPLE: BINARY SEARCH (DIVIDE & CONQUER)

Each call cuts the search space in half, so finding an item in a million-element sorted array takes only ~20 comparisons. This is O(log n) — wildly efficient.

def binary_search(arr, target, lo=0, hi=None):
    if hi is None: hi = len(arr) - 1
    if lo > hi:                     # base case: not found
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:         # base case: found
        return mid
    if target < arr[mid]:
        return binary_search(arr, target, lo, mid - 1)
    return binary_search(arr, target, mid + 1, hi)
▸ WHERE YOU'LL SEE IT

File systems. Walking every file in every subdirectory is naturally recursive.

▸ WHERE YOU'LL SEE IT

Parsers & compilers. JSON, HTML, and programming languages all have nested grammar.

▸ WHERE YOU'LL SEE IT

Backtracking. Sudoku, maze solvers, N-queens — try, fail, undo, try again.
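Backtracking in miniature: build an ordering one choice at a time, recurse on what's left, then move on to the next choice (a small sketch, assuming nothing beyond the guide; this is also the O(n!) pattern from the Chapter 7 table):

```python
def permutations(items):
    if not items:                          # base case: one empty ordering
        return [[]]
    result = []
    for i, x in enumerate(items):
        rest = items[:i] + items[i + 1:]   # choose x, remove it from the pool
        for tail in permutations(rest):    # recurse on the smaller pool
            result.append([x] + tail)      # prepend the choice we made
    return result

print(len(permutations([1, 2, 3])))  # 6 orderings: 3! = 6
```

Sudoku and N-queens solvers follow the same skeleton, with a validity check before each recursive call and pruning when a choice fails.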

11
CHAPTER ELEVEN

Mistakes & best practices

✗ MISTAKE 01

Forgetting the base case

Always write the base case first, before the recursive call. If you can't state the base case out loud, you don't yet understand the problem.

✗ MISTAKE 02

Recursive call doesn't shrink the input

Make sure every recursive call is on a strictly smaller problem. Passing the same value, or one that's just as complex, means you'll never terminate.

✗ MISTAKE 03

Recomputing the same subproblem

If your call tree has repeats (like naive Fibonacci), you need memoization or an iterative rewrite. Don't accept exponential runtime when a cache makes it linear.

✗ MISTAKE 04

Using recursion on deep structures in Python

Python's default recursion limit is 1000. Walking a list of 10,000 elements recursively crashes. Use a loop, or sys.setrecursionlimit() carefully if you truly need it.

◆ BEST PRACTICE CHECKLIST

▸ Write the base case first, and state it out loud before you code.
▸ Verify that every recursive call strictly shrinks the input.
▸ Memoize (or rewrite iteratively) when the call tree revisits subproblems.
▸ Know your language's stack limits; prefer a loop past ~1000 levels in Python.
12
CHAPTER TWELVE · FINAL

Final quiz

Five questions covering the whole guide. Feel free to scroll back if you want to check your thinking before answering.

QUESTION 01

What is the Big-O time complexity of the naive recursive Fibonacci?

QUESTION 02

Which of these is a tail-recursive function?

QUESTION 03

Why is binary search O(log n)?

QUESTION 04

In Python, Java, and Go, can you rely on tail-call optimization?

QUESTION 05

When should you prefer a loop over recursion?

END OF GUIDE

You made it.

You now have the mental models, the vocabulary, and the language-agnostic patterns to read and write recursive code confidently. Go break some problems into smaller ones.

◆ ◆ ◆