Time Complexity Cheat Sheet: From O(1) to O(n!)
Time complexity is the language of algorithm efficiency. In coding interviews, you are expected to identify the complexity of your solution, justify it, and know when to optimize. This cheat sheet gives you a visual hierarchy of common complexities, concrete examples, and rules of thumb for analyzing your own code.
The Complexity Hierarchy
| Complexity | Name | Example |
|---|---|---|
| O(1) | Constant | Array index, hash lookup, arithmetic |
| O(log n) | Logarithmic | Binary search, balanced BST operations |
| O(n) | Linear | Single loop, two pointers, sliding window |
| O(n log n) | Linearithmic | Merge sort, heap sort, efficient comparison sorts |
| O(n²) | Quadratic | Nested loops, bubble sort, insertion sort |
| O(n³) | Cubic | Three nested loops, Floyd-Warshall |
| O(2^n) | Exponential | Backtracking, subsets, recursive Fibonacci |
| O(n!) | Factorial | All permutations, brute-force TSP |
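To make the exponential row concrete, here is a minimal sketch (the function name is illustrative) that enumerates all subsets of an array. Each element can be included or excluded, so the result doubles with every added element — exactly O(2^n) growth:

```javascript
// Enumerate all subsets (the power set) of an array.
// Each element is either included or excluded: 2^n subsets total.
function subsets(arr) {
  let result = [[]]; // start with the empty subset
  for (const x of arr) {
    // Each new element doubles the number of subsets so far.
    result = result.concat(result.map(s => [...s, x]));
  }
  return result;
}
// subsets([1, 2, 3]) has 2^3 = 8 subsets
```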
Analyzing Nested Loops
The most common source of complexity is nested loops. For each loop, count how many iterations it performs in terms of n. Multiply the iteration counts when loops are nested.
Two independent loops: O(n) + O(n) = O(n)
```javascript
for (let i = 0; i < n; i++) { /* O(n) */ }
for (let j = 0; j < n; j++) { /* O(n) */ }
// Total: O(n) + O(n) = O(n)
```
Two nested loops: O(n) × O(n) = O(n²)
```javascript
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) {
    // O(1) work
  }
}
// Total: O(n²)
```
Inner loop depends on outer: n(n + 1)/2 iterations ≈ O(n²)
```javascript
for (let i = 0; i < n; i++) {
  for (let j = i; j < n; j++) {
    // O(1) work — inner runs (n-i) times
  }
}
// Total: n + (n-1) + ... + 1 = n(n+1)/2 = O(n²)
```
Analyzing Recursive Calls
For recursion, use the recurrence relation. The master theorem handles divide-and-conquer recurrences of the form T(n) = aT(n/b) + f(n).
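The three master-theorem cases can be sketched as a small helper (a hedged sketch, not from the source — it assumes the driving work is f(n) = Θ(n^d) and compares d to log_b(a)):

```javascript
// Master theorem for T(n) = a·T(n/b) + Θ(n^d):
//   d > log_b(a) → Θ(n^d)          (work at the root dominates)
//   d = log_b(a) → Θ(n^d · log n)  (work is balanced across levels)
//   d < log_b(a) → Θ(n^log_b(a))   (work at the leaves dominates)
function masterTheorem(a, b, d) {
  const critical = Math.log(a) / Math.log(b); // log_b(a)
  const tol = 1e-9; // tolerance for floating-point comparison
  if (d > critical + tol) return `Θ(n^${d})`;
  if (Math.abs(d - critical) <= tol) return `Θ(n^${d} log n)`;
  return `Θ(n^log_${b}(${a}))`;
}
// Merge sort: a = 2, b = 2, d = 1 → Θ(n^1 log n)
```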
Single recursive call: O(n)
T(n) = T(n-1) + O(1) → O(n). One recursive call per level, n levels.
```javascript
function f(n) {
  if (n <= 0) return;
  f(n - 1); // one call, n levels
}
```
Two recursive calls: O(2^n)
T(n) = 2T(n-1) + O(1) → O(2^n). Binary tree of calls.
```javascript
function fib(n) {
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2); // 2 branches per level
}
```
Divide and conquer: O(n log n)
T(n) = 2T(n/2) + O(n) → O(n log n). Merge sort, most balanced divide-and-conquer.
```javascript
function mergeSort(arr, lo, hi) {
  if (lo >= hi) return;
  const mid = (lo + hi) >> 1;
  mergeSort(arr, lo, mid);      // T(n/2)
  mergeSort(arr, mid + 1, hi);  // T(n/2)
  merge(arr, lo, mid, hi);      // O(n)
}
```
Quick Rules of Thumb
- Halving the input each step: O(log n). Binary search, balanced BST.
- One pass over the data: O(n). Single loop, two pointers, sliding window.
- Comparing every pair: O(n²). Nested loops over same array.
- Trying all subsets: O(2^n). Each element: include or exclude.
- Trying all permutations: O(n!). Generate every ordering.
- Sort + linear scan: O(n log n). Often optimal, since O(n log n) is the lower bound for comparison-based sorting.
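The halving rule can be made concrete with the classic example, binary search. A minimal sketch, assuming a sorted array of numbers:

```javascript
// Binary search: the search range halves on every iteration,
// so a sorted array of n elements takes O(log n) comparisons.
function binarySearch(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) lo = mid + 1; // discard left half
    else hi = mid - 1;                   // discard right half
  }
  return -1; // not found
}
```

Doubling the input adds only one extra iteration — the signature of O(log n).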
Space Complexity Quick Reference
Space complexity follows similar rules. Recursion uses O(h) stack space where h is the depth. BFS uses O(w) for the queue where w is the maximum width. A DP table of size n×m uses O(n×m) space.
- O(1): A few variables, no extra structures
- O(log n): Balanced tree recursion, binary search recursion
- O(n): One array, DFS on a list, hash map of n elements
- O(n²): 2D DP table, adjacency matrix
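As one illustration of the n×m DP table, here is a hedged sketch of edit distance (one example of a 2D table; the name is illustrative, not from the source):

```javascript
// Edit distance with a full (n+1) × (m+1) DP table:
// O(n·m) time and O(n·m) space.
function editDistance(a, b) {
  const n = a.length, m = b.length;
  // dp[i][j] = edits needed to turn a[0..i) into b[0..j)
  const dp = Array.from({ length: n + 1 }, () => new Array(m + 1).fill(0));
  for (let i = 0; i <= n; i++) dp[i][0] = i; // delete everything
  for (let j = 0; j <= m; j++) dp[0][j] = j; // insert everything
  for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= m; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,       // delete
        dp[i][j - 1] + 1,       // insert
        dp[i - 1][j - 1] + cost // substitute or match
      );
    }
  }
  return dp[n][m];
}
```

Since each row depends only on the previous one, keeping two rows cuts space to O(m) — a common table-to-rolling-array optimization.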
For more pattern-specific complexity analysis, explore our pattern cheat sheet and interactive pattern walkthroughs.
Test your pattern recognition
Our quiz includes complexity questions. Can you identify the right pattern and its Big-O?