Algorithm Analysis and Complexity: A Comprehensive Reference

Asymptotic Notations

1. Big-O Notation (Upper Bound)

Definition: Big-O gives an upper bound on the growth rate of a function. It is commonly used to express worst-case complexity, although the notation itself only bounds growth and does not specify which case is being analyzed.

  • Formula: f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that 0 ≤ f(n) ≤ c · g(n) for all n ≥ n₀
  • Meaning: Function f(n) does not grow faster than g(n).
  • Example: 3n² + 5n + 2 = O(n²)

2. Big-Omega Notation (Lower Bound)

Definition: Big-Omega gives a lower bound on the growth rate. It is commonly used to express best-case complexity, with the same caveat as Big-O: the bound itself is independent of the case.

  • Formula: f(n) = Ω(g(n)) if there exist constants c > 0 and n₀ such that 0 ≤ c · g(n) ≤ f(n) for all n ≥ n₀
  • Meaning: Function f(n) grows at least as fast as g(n).
  • Example: 3n² + 5n + 2 = Ω(n²)

3. Big-Theta Notation (Tight Bound)

Definition: Theta gives the exact growth rate (both upper and lower bounds).

  • Formula: f(n) = Θ(g(n)) if there exist constants c₁, c₂ > 0 and n₀ such that c₁ · g(n) ≤ f(n) ≤ c₂ · g(n) for all n ≥ n₀
  • Meaning: Function grows at the same rate as g(n).
  • Example: 3n² + 5n + 2 = Θ(n²)

4. Little-o Notation (Strict Upper Bound)

Definition: Represents growth strictly slower than g(n).

  • Formula: f(n) = o(g(n)) if lim(n→∞) f(n)/g(n) = 0
  • Example: n = o(n²)

5. Little-omega Notation (Strict Lower Bound)

Definition: Represents growth strictly faster than g(n).

  • Formula: f(n) = ω(g(n)) if lim(n→∞) f(n)/g(n) = ∞
  • Example: n² = ω(n)
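
The claims above can be sanity-checked numerically. As a sketch (not a proof), the ratio f(n)/g(n) for f(n) = 3n² + 5n + 2 and g(n) = n² should settle near the leading constant 3, which is consistent with f(n) = Θ(n²):

```python
def f(n):
    # The example polynomial from the notations above.
    return 3 * n**2 + 5 * n + 2

# The ratio f(n)/n^2 approaches 3 as n grows, so constants such as
# c1 = 3 and c2 = 4 witness the Theta(n^2) bound for large n.
for n in [10, 100, 1000, 10000]:
    print(n, f(n) / n**2)
```

A vanishing ratio would instead indicate little-o; a ratio tending to infinity would indicate little-omega.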

Dynamic Programming vs. Greedy Method

Dynamic Programming (DP)

DP solves problems by breaking them into overlapping subproblems and storing results to avoid re-computation.

  • Memoization (Top-Down): Solve recursively and store results in a table.
  • Tabulation (Bottom-Up): Solve smallest subproblems first and fill a table iteratively.
  • Examples: 0/1 Knapsack, Fibonacci, Floyd–Warshall, Matrix Chain Multiplication.
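
Both DP styles can be illustrated with Fibonacci, one of the examples above. This is a minimal sketch: memoization caches recursive calls, while tabulation fills the table from the bottom up.

```python
from functools import lru_cache

# Top-down (memoization): recurse, but cache every result so each
# subproblem is solved only once.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up (tabulation): solve the smallest subproblems first and
# fill the table iteratively.
def fib_tab(n):
    if n < 2:
        return n
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```

Both run in O(n) time; without the cache, the naive recursion would take exponential time because the subproblems overlap.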

Greedy Method

Makes the locally optimal choice at each step, in the hope that these local choices lead to a global optimum. This is guaranteed only for problems with the greedy-choice property and optimal substructure.

  • Characteristics: Never reconsiders a choice, needs no table of subproblem results, and is typically faster and simpler than DP.
  • Examples: Kruskal’s algorithm, Prim’s algorithm, Dijkstra’s algorithm, Fractional Knapsack.
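
Fractional Knapsack, listed above, is a clean illustration of the greedy method. A minimal sketch, taking items in decreasing value-to-weight order:

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; fractions of items allowed."""
    total = 0.0
    # Greedy choice: always take the item with the best value/weight
    # ratio first; for this problem, that is provably optimal.
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # take as much as fits
        total += value * (take / weight)    # possibly a fraction
        capacity -= take
    return total
```

Note that the same greedy rule fails for 0/1 Knapsack, which is why that variant needs DP or branch and bound instead.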

Complexity Analysis

Time Complexity

Measures the time an algorithm takes as a function of input size n.

  • Expressed using asymptotic notations (O, Θ, Ω).
  • Counts basic operations rather than seconds.
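
"Counting basic operations rather than seconds" can be made concrete. A sketch instrumenting linear search, whose comparison count grows linearly with n, i.e. O(n):

```python
def linear_search_ops(a, target):
    # Return (index, number of comparisons). The comparison count, not
    # wall-clock time, is what asymptotic analysis measures.
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons
```

An unsuccessful search over n elements performs exactly n comparisons, matching the O(n) bound.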

Space Complexity

Measures the total memory required, including input space and auxiliary space.

Backtracking and Branch and Bound

Backtracking

Builds solutions step-by-step and prunes branches that violate constraints. Used for decision problems like N-Queens or Sudoku.
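
A minimal N-Queens sketch of this idea: place queens row by row and abandon (prune) any partial placement that attacks an earlier queen.

```python
def n_queens(n):
    """Count placements of n non-attacking queens on an n x n board."""
    solutions = 0

    def place(row, cols, diag1, diag2):
        nonlocal solutions
        if row == n:                 # all rows filled: one full solution
            solutions += 1
            return
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue             # prune: violates a constraint
            # extend the partial solution and recurse into the next row
            place(row + 1, cols | {col}, diag1 | {row - col}, diag2 | {row + col})

    place(0, set(), set(), set())
    return solutions
```

The pruning is what distinguishes backtracking from brute force: entire subtrees of the search space are skipped as soon as a constraint fails.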

Branch and Bound

Uses a bounding function to discard branches that cannot produce a better solution than the current best. Used for optimization problems like the Traveling Salesman Problem.
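
A sketch of branch and bound for 0/1 Knapsack (a smaller example than TSP, but the same mechanism): the bounding function is the optimistic fractional-knapsack value of the remaining items, and any branch whose bound cannot beat the current best is discarded.

```python
def knapsack_bb(items, capacity):
    """items: list of (value, weight) pairs; 0/1 choices only."""
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0

    def bound(i, value, room):
        # Optimistic upper bound: fill the remaining room fractionally.
        for v, w in items[i:]:
            if w <= room:
                value, room = value + v, room - w
            else:
                return value + v * room / w
        return value

    def search(i, value, room):
        nonlocal best
        best = max(best, value)
        if i == len(items) or bound(i, value, room) <= best:
            return                               # prune this branch
        v, w = items[i]
        if w <= room:
            search(i + 1, value + v, room - w)   # branch: take item i
        search(i + 1, value, room)               # branch: skip item i

    search(0, 0, capacity)
    return best
```

The worst case is still exponential, but in practice the bound eliminates most of the search tree.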

Complexity Classes

  • Class P: Problems solvable in polynomial time (e.g., Sorting, Shortest Path).
  • Class NP: Problems whose solutions can be verified in polynomial time.
  • NP-Hard: Problems at least as hard as the hardest problems in NP.
  • NP-Complete: Problems that are both in NP and NP-Hard (e.g., SAT, Clique, Vertex Cover).

Vertex Cover and Reducibility

Vertex Cover is NP-Complete. We prove this by reducing the Clique problem to Vertex Cover: a clique of size k in a graph G on n vertices corresponds to a vertex cover of size n − k in the complement graph. Because no polynomial-time exact algorithm is known, a 2-Approximation Algorithm is often used instead; it guarantees a cover no more than twice the optimal size.
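
The classic 2-approximation is simple enough to sketch: repeatedly pick an uncovered edge and add both of its endpoints. Since any optimal cover must contain at least one endpoint of each picked edge, the result is at most twice optimal.

```python
def approx_vertex_cover(edges):
    """edges: iterable of (u, v) pairs; returns a vertex cover set."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            # Uncovered edge: add BOTH endpoints. The picked edges are
            # pairwise disjoint, which is what yields the factor of 2.
            cover.update((u, v))
    return cover
```

On a path a–b–c–d this returns {a, b, c, d} (size 4) while the optimum is {b, c} (size 2), exactly at the factor-2 guarantee.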

Sorting Algorithms

Merge Sort

A divide-and-conquer algorithm: split the array in half, sort each half recursively, and merge the sorted halves. Time complexity is Θ(n log n) in every case, at the cost of O(n) auxiliary space.
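
A minimal sketch of the divide, conquer, and merge steps:

```python
def merge_sort(a):
    # Divide: split in half; conquer: sort each half recursively.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Combine: merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```

The recurrence T(n) = 2T(n/2) + O(n) for this structure solves to Θ(n log n).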

Quick Sort

Uses a partition strategy: pick a pivot, place smaller elements before it and larger elements after it, then recurse on each side. Best and average case: O(n log n); worst case (consistently unbalanced partitions, e.g. sorted input with a poor pivot choice): O(n²).
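
A sketch using a random pivot, which makes the O(n²) worst case unlikely on any fixed input (this list-building version trades the usual in-place partition for clarity):

```python
import random

def quick_sort(a):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(a) <= 1:
        return a
    # Partition around a random pivot; random choice keeps the expected
    # recursion depth at O(log n), giving the O(n log n) average case.
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```

Production implementations partition in place to use O(log n) extra space, but the recursive structure is the same.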