7.6 — Quick Revision: Time & Space Complexity
Big-O cheat sheet
| Complexity | Name | Example |
|---|---|---|
| O(1) | Constant | Hash lookup, array access |
| O(log n) | Logarithmic | Binary search |
| O(n) | Linear | Single loop, linear scan |
| O(n log n) | Linearithmic | Merge sort, heap sort |
| O(n²) | Quadratic | Nested loops, bubble sort |
| O(2ⁿ) | Exponential | Recursive Fibonacci, subsets |
| O(n!) | Factorial | Permutations |
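A minimal sketch contrasting three of the lookup costs from the table: a linear scan is O(n), binary search on sorted data is O(log n) (here via the standard-library `bisect` module), and a hash lookup is O(1) on average. The array contents and target are illustrative only.

```python
from bisect import bisect_left

def linear_search(arr, target):
    """O(n): may scan every element before finding the target."""
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1

def binary_search(arr, target):
    """O(log n): halves the search range each step; arr must be sorted."""
    i = bisect_left(arr, target)
    return i if i < len(arr) and arr[i] == target else -1

arr = list(range(0, 100, 2))                 # sorted even numbers 0..98
lookup = {x: i for i, x in enumerate(arr)}   # dict: O(1) average lookup

assert linear_search(arr, 42) == binary_search(arr, 42) == lookup[42]
```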
Rules
- Drop constants: O(5n) → O(n)
- Drop lower-order terms: O(n² + n) → O(n²)
- Sequential steps add: O(n) + O(m) = O(n + m)
- Nested loops multiply: O(n) × O(m) = O(n × m)
- Recursion space = O(maximum call depth)
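The sequential and nested rules can be sketched with two toy functions (the inputs are arbitrary):

```python
def sequential(a, b):
    """Two independent passes: O(n) + O(m) = O(n + m)."""
    total = sum(a)    # O(n)
    total += sum(b)   # O(m)
    return total

def nested(a, b):
    """Inner loop runs once per outer element: O(n) x O(m) = O(nm)."""
    pairs = 0
    for _ in a:       # O(n) iterations
        for _ in b:   # O(m) iterations each
            pairs += 1
    return pairs      # exactly n * m
```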
Input size guide
n ≤ 10 → O(n!) OK
n ≤ 20 → O(2ⁿ) OK
n ≤ 500 → O(n³) OK
n ≤ 5000 → O(n²) OK
n ≤ 100K → O(n log n) OK
n ≤ 10M → O(n) OK
n ≤ 10¹⁸ → O(log n) or O(1)
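The guide above can be encoded as a small hypothetical helper. The name `slowest_feasible` and the thresholds are the rules of thumb from the table, assuming roughly 10⁷-10⁸ simple operations per second; they are heuristics, not hard limits.

```python
def slowest_feasible(n: int) -> str:
    """Return the slowest complexity class usually acceptable for input size n."""
    if n <= 10:
        return "O(n!)"
    if n <= 20:
        return "O(2^n)"
    if n <= 500:
        return "O(n^3)"
    if n <= 5000:
        return "O(n^2)"
    if n <= 100_000:
        return "O(n log n)"
    if n <= 10_000_000:
        return "O(n)"
    return "O(log n) or O(1)"
```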
Space complexity patterns
| Pattern | Space |
|---|---|
| Fixed variables | O(1) |
| Array of n elements | O(n) |
| 2D array n×m | O(nm) |
| Recursion depth d | O(d) |
| Hash map with k entries | O(k) |
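The recursion-depth row is the one most often missed. A sketch with two summation routines: the recursive version holds one stack frame per element (O(n) extra space), while the iterative version uses a single accumulator (O(1) extra space), even though both run in O(n) time.

```python
def sum_recursive(arr, i=0):
    """O(n) extra space: recursion depth equals len(arr)."""
    if i == len(arr):
        return 0
    return arr[i] + sum_recursive(arr, i + 1)

def sum_iterative(arr):
    """O(1) extra space: just one accumulator variable."""
    total = 0
    for x in arr:
        total += x
    return total
```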
Common pitfalls
- Forgetting recursive call stack space
- Treating hash map ops as always O(1) (can degrade to O(n))
- Not accounting for string immutability cost (repeated concatenation in a loop can be O(n²))
- Confusing amortized O(1) with worst-case O(1)
- Mixing up O(n log n) and O(n²) when choosing algorithms
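The amortized-vs-worst-case pitfall can be made concrete with a toy doubling array (a simplified model, not any real library's internals). A single append that triggers a resize copies all existing elements, so the worst case is O(n), but the total number of copies over n appends stays below 2n, which is what makes append amortized O(1).

```python
class DynamicArray:
    """Toy doubling array that only counts element copies during resizes."""
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0   # total elements moved across all resizes

    def append(self, x):
        if self.size == self.capacity:
            self.capacity *= 2
            self.copies += self.size   # simulate copying into the new buffer
        self.size += 1

n = 1000
arr = DynamicArray()
for i in range(n):
    arr.append(i)
# Resizes copied 1 + 2 + 4 + ... + 512 = 1023 elements: under 2n total,
# so about 2 copies per append on average, i.e. amortized O(1).
assert arr.copies < 2 * n
```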
Self-check drill
SC-001
- Q: What is Big-O?
- A: Upper bound on algorithm's growth rate
SC-002
- Q: O(1) example?
- A: Array access by index, hash lookup
SC-003
- Q: O(log n) example?
- A: Binary search
SC-004
- Q: O(n²) means what?
- A: Runtime grows quadratically with input
SC-005
- Q: Why drop constants?
- A: Big-O measures growth rate, not exact time
SC-006
- Q: Merge sort complexity?
- A: O(n log n) time, O(n) space
SC-007
- Q: Recursion space?
- A: O(maximum call stack depth)
SC-008
- Q: How to optimize O(n²)?
- A: Hash maps, sorting, two pointers
SC-009
- Q: What is amortized analysis?
- A: Average cost over many operations
SC-010
- Q: Master theorem for T(n)=2T(n/2)+O(n)?
- A: O(n log n)
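The merge sort and master theorem answers above (SC-006 and SC-010) can be checked against a minimal implementation: the recurrence T(n) = 2T(n/2) + O(n) comes from two half-size recursive calls plus a linear merge, giving O(n log n) time, and the merge buffers account for the O(n) extra space.

```python
def merge_sort(arr):
    """T(n) = 2T(n/2) + O(n) -> O(n log n) time; O(n) extra space for merging."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # T(n/2)
    right = merge_sort(arr[mid:])   # T(n/2)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # O(n) merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```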