
Big O Notation & Complexity Analysis

Understand Big O notation from first principles — time and space complexity, common complexity classes, and how to analyse your own algorithms.

Learnixo · April 16, 2026 · 6 min read
Tags: DSA · Big O · Algorithms · Complexity · Interview Prep · Beginner
Share:š•

Big O is the language interviewers use to talk about algorithm efficiency. You don't need a maths degree — just a clear mental model of the common complexity classes and how to read code to determine which one applies.


What Big O Measures

Big O describes how an algorithm's runtime (or memory usage) grows as the input size n grows. By convention it describes the worst case, and it ignores constant factors.

O(1)     — constant   — doesn't grow with input
O(log n) — logarithmic — grows very slowly (binary search)
O(n)     — linear     — grows directly with input
O(n log n) — linearithmic — most sorting algorithms
O(n²)    — quadratic  — nested loops
O(2ⁿ)    — exponential — recursion that branches twice
O(n!)    — factorial  — generating all permutations

Key insight: We drop constants and lower-order terms. O(3n + 50) becomes O(n). O(n² + n) becomes O(n²). We care about the dominant term as n → āˆž.
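To get a feel for how differently these classes scale, here's a quick sketch (the `steps` function is purely illustrative — real algorithms have different constants):

```typescript
// Rough step counts for each complexity class at input size n.
// Illustrative only — constants vary between real algorithms.
function steps(n: number) {
  return {
    constant: 1,
    logarithmic: Math.ceil(Math.log2(n)),
    linear: n,
    linearithmic: n * Math.ceil(Math.log2(n)),
    quadratic: n * n,
  };
}

console.log(steps(1_000));
// at n = 1,000: logarithmic ā‰ˆ 10 steps, quadratic = 1,000,000 steps
```

At n = 1,000 the quadratic algorithm already does 100,000Ɨ more work than the logarithmic one — which is why the dominant term is all that matters.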


The Common Complexity Classes

O(1) — Constant Time

Same speed regardless of input size:

TYPESCRIPT
function getFirst(arr: number[]): number {
  return arr[0];   // always one operation
}

function isEven(n: number): boolean {
  return n % 2 === 0;
}

// Hash map lookups are O(1) on average
const map = new Map([['alice', 1]]);
map.get('alice');  // O(1)

O(log n) — Logarithmic

Halves the problem space each step — can handle billions of items efficiently:

TYPESCRIPT
function binarySearch(arr: number[], target: number): number {
  let lo = 0, hi = arr.length - 1;

  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target)  lo = mid + 1;
    else                     hi = mid - 1;
  }

  return -1;
}
// For 1 billion items: only ~30 iterations needed

O(n) — Linear

One pass through the input:

TYPESCRIPT
function sum(arr: number[]): number {
  let total = 0;
  for (const n of arr) total += n;  // visits each element once
  return total;
}

function findMax(arr: number[]): number {
  return arr.reduce((max, n) => n > max ? n : max, -Infinity);
}

O(n log n) — Linearithmic

The best you can do for comparison-based sorting:

TYPESCRIPT
// Merge sort — divides (log n levels) and merges (n work per level)
function mergeSort(arr: number[]): number[] {
  if (arr.length <= 1) return arr;

  const mid   = Math.floor(arr.length / 2);
  const left  = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  return merge(left, right);
}

// Merge two sorted arrays into one sorted array — O(n)
function merge(a: number[], b: number[]): number[] {
  const out: number[] = [];
  let i = 0, j = 0;
  while (i < a.length && j < b.length) {
    out.push(a[i] <= b[j] ? a[i++] : b[j++]);
  }
  return out.concat(a.slice(i), b.slice(j));
}

JavaScript's built-in Array.sort() runs in O(n log n) — V8 implements it with TimSort.
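One gotcha worth knowing: Array.prototype.sort compares elements as strings by default, so numeric arrays need an explicit comparator:

```typescript
// Default sort converts elements to strings, so "10" < "2" lexicographically
const lexicographic = [10, 2, 1].sort();              // [1, 10, 2]
const numeric       = [10, 2, 1].sort((a, b) => a - b); // [1, 2, 10]

console.log(lexicographic, numeric);
```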

O(n²) — Quadratic

Nested loops where each iterates over the full input:

TYPESCRIPT
// Check for duplicate pairs — O(n²)
function hasDuplicate(arr: number[]): boolean {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}

// Better: O(n) with a hash set
function hasDuplicateFast(arr: number[]): boolean {
  const seen = new Set<number>();
  for (const n of arr) {
    if (seen.has(n)) return true;
    seen.add(n);
  }
  return false;
}

How to Analyse Code

Consecutive operations — add complexities

TYPESCRIPT
function example(arr: number[]): void {
  // O(n)
  for (const x of arr) console.log(x);

  // O(n)
  for (const x of arr) console.log(x * 2);
}
// Total: O(n) + O(n) = O(2n) = O(n)

Nested loops — multiply complexities

TYPESCRIPT
function pairs(arr: number[]): void {
  for (const a of arr) {             // O(n)
    for (const b of arr) {           // O(n) for each outer iteration
      console.log(a, b);
    }
  }
}
// Total: O(n Ɨ n) = O(n²)

Recursion — write out the recurrence

TYPESCRIPT
// T(n) = 2 * T(n/2) + O(n) → O(n log n) by Master Theorem
function mergeSort(arr: number[]): number[] { /* ... */ }

// T(n) = T(n-1) + T(n-2) + O(1) ≤ 2T(n-1) + O(1) → O(2ⁿ) — exponential!
function badFib(n: number): number {
  if (n <= 1) return n;
  return badFib(n - 1) + badFib(n - 2);  // two branches per call
}
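The standard fix for exponential branching recursion (not covered in the snippet above) is memoization — cache each subproblem so it's computed only once, bringing badFib down to O(n):

```typescript
// Memoized Fibonacci — each value of n is computed once, so O(n) time
// (and O(n) space for the cache plus the call stack)
function fib(n: number, memo = new Map<number, number>()): number {
  if (n <= 1) return n;
  if (memo.has(n)) return memo.get(n)!;
  const result = fib(n - 1, memo) + fib(n - 2, memo);
  memo.set(n, result);
  return result;
}
// fib(40) returns instantly; badFib(40) makes exponentially many calls
```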

Space Complexity

Space complexity measures extra memory used (not counting the input).

TYPESCRIPT
// O(1) space — only a counter variable
function countEvens(arr: number[]): number {
  let count = 0;
  for (const n of arr) if (n % 2 === 0) count++;
  return count;
}

// O(n) space — stores a copy of the input
function doubleAll(arr: number[]): number[] {
  return arr.map(n => n * 2);  // new array of same size
}

// O(n) space — call stack depth proportional to input
function factorial(n: number): number {
  if (n <= 1) return 1;
  return n * factorial(n - 1);  // n frames on the call stack
}
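Because the call stack counts as space, rewriting a recursive function iteratively can drop its space cost. An iterative version of the factorial above uses O(1) extra space:

```typescript
// Iterative factorial — O(n) time but O(1) extra space (no call stack growth)
function factorialIter(n: number): number {
  let result = 1;
  for (let i = 2; i <= n; i++) result *= i;
  return result;
}
```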

Real-World Complexity Examples

| Operation | Data Structure | Time |
|---|---|---|
| Array access by index | Array | O(1) |
| Array search (unsorted) | Array | O(n) |
| Array search (sorted) | Array | O(log n) |
| Array insert/delete at end | Array | O(1) amortized |
| Array insert/delete at start | Array | O(n) |
| HashMap get/set/delete | Map/Object | O(1) average |
| Set has/add/delete | Set | O(1) average |
| Sort | Array.sort() | O(n log n) |
| Binary search tree lookup | BST | O(log n) average |
| Stack push/pop | Stack | O(1) |
| Queue enqueue/dequeue | Queue | O(1) |


The Golden Rule for Interviews

When you see your solution uses O(n²), ask: "Can I use a hash map to get to O(n)?"

Most O(n²) solutions can be reduced to O(n) by pre-storing data in a hash map and doing O(1) lookups instead of nested scanning.

TYPESCRIPT
// O(n²) — nested scan
function twoSum(nums: number[], target: number): number[] {
  for (let i = 0; i < nums.length; i++)
    for (let j = i + 1; j < nums.length; j++)
      if (nums[i] + nums[j] === target) return [i, j];
  return [];
}

// O(n) — hash map
function twoSumOptimal(nums: number[], target: number): number[] {
  const seen = new Map<number, number>();  // value → index
  for (let i = 0; i < nums.length; i++) {
    const complement = target - nums[i];
    if (seen.has(complement)) return [seen.get(complement)!, i];
    seen.set(nums[i], i);
  }
  return [];
}

Quick Reference

O(1)        Constant      Array index, hash map lookup
O(log n)    Logarithmic   Binary search, BST operations
O(n)        Linear        Single loop, linear search
O(n log n)  Linearithmic  Merge sort, heap sort
O(n²)       Quadratic     Nested loops, bubble sort
O(2ⁿ)       Exponential   Recursive subsets, naive Fibonacci
O(n!)       Factorial     Generating all permutations

Analysis rules:
  Drop constants:        O(3n) → O(n)
  Drop smaller terms:    O(n² + n) → O(n²)
  Sequential:            O(n) + O(n) = O(n)
  Nested:                O(n) Ɨ O(n) = O(n²)

Space complexity counts extra memory, not input size.
Call stack counts as space — recursive calls use O(depth) space.
