Episode 7 — DSA with JavaScript / 7.6 — Time and Space Complexity

7.6 — Interview Questions: Time & Space Complexity


Beginner

Q1. What is Big-O notation?

Answer: Big-O describes the upper bound of an algorithm's growth rate as input size increases. It tells us the worst-case scenario.

Common complexities (fastest to slowest): O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)


Q2. What is O(1)? Give examples.

Answer: Constant time — execution doesn't depend on input size.

Examples: array access by index, hash map lookup, push/pop on stack, arithmetic operations.

arr[5];           // O(1)
map.get("key");   // O(1) average
stack.pop();      // O(1)

Q3. What is the difference between time and space complexity?

Answer:

  • Time complexity: How runtime grows with input size
  • Space complexity: How memory usage grows with input size
// O(n) time, O(1) space
function sum(arr) {
    let s = 0;
    for (const x of arr) s += x;
    return s;
}

// O(n) time, O(n) space
function doubled(arr) {
    return arr.map(x => x * 2);  // new array
}

Q4. What is the time complexity of binary search?

Answer: O(log n) — each step halves the search space.

For n = 1,000,000: only ~20 comparisons needed (log₂(10⁶) ≈ 20).
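The halving can be sketched as a standard iterative binary search (the function name `binarySearch` is illustrative):

```javascript
// Binary search on a sorted array — O(log n) time, O(1) space.
// Returns the index of target, or -1 if it is absent.
function binarySearch(arr, target) {
    let lo = 0, hi = arr.length - 1;
    while (lo <= hi) {
        const mid = (lo + hi) >> 1;        // midpoint; each pass halves [lo, hi]
        if (arr[mid] === target) return mid;
        if (arr[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}
```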


Q5. Why do we drop constants in Big-O?

Answer: Big-O measures growth rate, not exact time. O(2n) and O(100n) both grow linearly with n, so they're both O(n). The constant factor depends on hardware and implementation, not the algorithm.


Intermediate

Q6. What is the time complexity of merge sort? Explain why.

Answer: O(n log n).

  • The array is split in half at each level → log n levels
  • At each level, all n elements are merged → O(n) work per level
  • Total: O(n × log n)
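The split-then-merge structure above can be sketched as:

```javascript
// Merge sort — O(n log n) time, O(n) extra space.
function mergeSort(arr) {
    if (arr.length <= 1) return arr;               // base case
    const mid = arr.length >> 1;
    const left = mergeSort(arr.slice(0, mid));     // splitting gives log n levels
    const right = mergeSort(arr.slice(mid));
    // Merge the two sorted halves: O(n) work per level.
    const out = [];
    let i = 0, j = 0;
    while (i < left.length && j < right.length) {
        out.push(left[i] <= right[j] ? left[i++] : right[j++]);
    }
    return out.concat(left.slice(i), right.slice(j));
}
```

Using `<=` in the merge keeps equal elements in their original order, which is what makes merge sort stable.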

Q7. Compare bubble sort, merge sort, and quick sort complexities.

Answer:

Algorithm     Best        Average     Worst       Space     Stable?
Bubble sort   O(n)        O(n²)       O(n²)       O(1)      Yes
Merge sort    O(n log n)  O(n log n)  O(n log n)  O(n)      Yes
Quick sort    O(n log n)  O(n log n)  O(n²)       O(log n)  No

Q8. What is amortized O(1)?

Answer: An operation that is usually O(1) but occasionally O(n). When averaged over n operations, each operation costs O(1).

Example: appending to a dynamic array (JavaScript's Array.prototype.push, C++'s vector.push_back()) — most appends are O(1), but when the capacity is exceeded the backing storage is reallocated and all elements are copied (O(n)). Doubling the capacity ensures this happens rarely enough that the average cost per append is O(1).
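The doubling strategy can be sketched with a minimal (hypothetical) dynamic-array class; real JavaScript arrays grow automatically, so this is only to make the copying visible:

```javascript
// Minimal dynamic-array sketch: capacity doubles when full, so n pushes
// do O(n) total copying — amortized O(1) per push.
class DynArray {
    constructor() { this.buf = new Array(1); this.len = 0; }
    push(x) {
        if (this.len === this.buf.length) {              // full: the rare O(n) case
            const bigger = new Array(this.buf.length * 2);
            for (let i = 0; i < this.len; i++) bigger[i] = this.buf[i];
            this.buf = bigger;
        }
        this.buf[this.len++] = x;                        // the usual O(1) case
    }
    get(i) { return this.buf[i]; }
}
```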


Q9. What is the space complexity of recursion?

Answer: O(depth of recursion) due to the call stack. Each recursive call adds a frame.

function factorial(n) {
    if (n <= 1) return 1;
    return n * factorial(n - 1);
}
// Space: O(n) — n frames on the call stack

Tail recursion can reduce this to O(1) when the engine performs tail-call optimization (TCO) — note that among JavaScript engines, only JavaScriptCore (Safari) implements it.
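A tail-recursive factorial carries the running product in an accumulator, so the recursive call is the last thing the function does (the name `factorialTail` is illustrative):

```javascript
// Tail-recursive factorial: the recursive call is the final action,
// so an engine with tail-call optimization can reuse a single stack frame.
function factorialTail(n, acc = 1) {
    if (n <= 1) return acc;
    return factorialTail(n - 1, n * acc);  // nothing left to do after the call
}
```

Without TCO this still uses O(n) stack frames, but the same shape converts trivially to a loop.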


Q10. How do you analyze nested loops with different bounds?

Answer: Multiply the bounds. If inner loop depends on outer, sum the series.

for (let i = 0; i < n; i++) {
    for (let j = 0; j < m; j++) {  }
}
// O(n × m) — not O(n²) unless n === m

for (let i = 0; i < n; i++) {
    for (let j = 0; j < i; j++) {  }
}
// 0+1+2+...+(n-1) = n(n-1)/2 = O(n²)

Advanced

Q11. Explain the Master Theorem.

Answer: For recurrences T(n) = aT(n/b) + O(nᵈ):

Case  Condition       Result
1     d < log_b(a)    O(n^(log_b(a)))
2     d = log_b(a)    O(nᵈ log n)
3     d > log_b(a)    O(nᵈ)

Binary search: a=1, b=2, d=0 → log₂(1) = 0 = d → Case 2 → O(log n)
Merge sort: a=2, b=2, d=1 → log₂(2) = 1 = d → Case 2 → O(n log n)


Q12. Why is naive string concatenation in a loop O(n²)?

Answer: Strings are immutable. Each += creates a new string, copying all previous characters:

Iteration 1: copy 1 char
Iteration 2: copy 2 chars
...
Iteration n: copy n chars
Total: 1+2+...+n = n(n+1)/2 = O(n²)

Fix: collect the pieces in an array and call join() once at the end (Java: StringBuilder; C++: ostringstream).
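The O(n) pattern can be sketched as follows (`repeatJoin` is an illustrative name):

```javascript
// Build a string of n copies of ch in O(n) total:
// pushes are amortized O(1), and join() makes one O(n) pass.
function repeatJoin(ch, n) {
    const parts = [];
    for (let i = 0; i < n; i++) parts.push(ch);
    return parts.join("");
}
```

By contrast, `s += ch` in the same loop re-copies the whole string on every iteration, giving the O(n²) total derived above.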


Q13. What is the complexity of the Sieve of Eratosthenes?

Answer: O(n log log n) — nearly linear.

The inner loop runs n/2 + n/3 + n/5 + n/7 + ... ≈ n × Σ(1/p) for primes p ≤ n. By Mertens' theorem, this sum grows as log log n.
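The sieve itself is short; the inner loop below is exactly the n/p marking work summed in the analysis above:

```javascript
// Sieve of Eratosthenes — O(n log log n) time, O(n) space.
// Returns all primes ≤ n.
function sieve(n) {
    const isPrime = new Array(n + 1).fill(true);
    isPrime[0] = isPrime[1] = false;
    for (let p = 2; p * p <= n; p++) {
        if (!isPrime[p]) continue;
        // Mark multiples of p, starting at p² (smaller ones are already marked).
        for (let m = p * p; m <= n; m += p) isPrime[m] = false;
    }
    const primes = [];
    for (let i = 2; i <= n; i++) if (isPrime[i]) primes.push(i);
    return primes;
}
```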


Q14. How do you prove an algorithm is optimal?

Answer: Show a lower bound — the minimum operations any algorithm must perform.

Examples:

  • Comparison-based sorting: Ω(n log n) — decision tree argument
  • Finding an element in unsorted array: Ω(n) — must check every element
  • If your algorithm matches the lower bound, it's optimal

Quick-fire table

#    Question                          Answer
1    O(1) example?                     Array access, hash lookup
2    O(log n) example?                 Binary search
3    O(n) example?                     Linear search, single loop
4    O(n log n) example?               Merge sort
5    O(n²) example?                    Nested loops, bubble sort
6    Drop constants?                   Yes — O(3n) = O(n)
7    Drop lower terms?                 Yes — O(n²+n) = O(n²)
8    Recursion space?                  O(call stack depth)
9    Amortized O(1)?                   Usually O(1), rarely O(n), averages O(1)
10   Master theorem for merge sort?    T(n)=2T(n/2)+O(n) → O(n log n)

Rapid self-check cards

SC-001

  • Q: What does Big-O measure?
  • A: Worst-case growth rate of time/space

SC-002

  • Q: O(1) means what?
  • A: Constant time — doesn't depend on input size

SC-003

  • Q: O(log n) example?
  • A: Binary search — halves input each step

SC-004

  • Q: O(n²) example?
  • A: Nested loops over the same array

SC-005

  • Q: Why drop constants?
  • A: Big-O measures growth rate, not exact time

SC-006

  • Q: Time complexity of merge sort?
  • A: O(n log n) — log n levels, n work each

SC-007

  • Q: Space complexity of recursion?
  • A: O(depth) due to call stack frames

SC-008

  • Q: What is amortized analysis?
  • A: Average cost over a sequence of operations

SC-009

  • Q: How to reduce O(n²) to O(n)?
  • A: Use hash maps or sorting + two pointers

SC-010

  • Q: What is tail recursion?
  • A: Recursive call is the last operation — can be optimized to O(1) space
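The hash-map technique from SC-009 can be sketched with the classic two-sum problem (`twoSum` is an illustrative name):

```javascript
// Find indices of two numbers summing to target in one pass:
// O(n) time and O(n) space, instead of checking all O(n²) pairs.
function twoSum(nums, target) {
    const seen = new Map();                              // value -> index
    for (let i = 0; i < nums.length; i++) {
        const need = target - nums[i];
        if (seen.has(need)) return [seen.get(need), i];  // complement seen earlier
        seen.set(nums[i], i);
    }
    return null;
}
```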
