Analyzing Time Efficiency of Algorithms: Recursive, Non-Recursive, and Sorting Techniques

General Plan for Analyzing the Time Efficiency of Recursive Algorithms

To analyze the time efficiency of a recursive algorithm, follow these steps:

  1. Decide on a parameter (or parameters) indicating an input’s size.
  2. Identify the algorithm’s basic operation.
  3. Check whether the number of times the basic operation is executed can vary for different inputs of the same size. If it can, the worst-case, average-case, and best-case efficiencies must be investigated separately.
  4. Set up a recurrence relation, with an appropriate initial condition, for the number of times the basic operation is executed.
  5. Solve the recurrence or, at least, ascertain the order of growth of its solution.

Example:

Consider the factorial function:

ALGORITHM Factorial(n)
//Computes n! recursively
if n = 0
    return 1
else return Factorial(n-1) * n

The recurrence relation for the number of multiplications (basic operation) is:

M(n) = M(n-1) + 1 = [M(n-2) + 1] + 1 = M(n-2) + 2 = [M(n-3) + 1] + 2 = M(n-3) + 3 = ... = M(n-i) + i = ... = M(n-n) + n = n, with the initial condition M(0) = 0 (no multiplications are needed for 0!).

Therefore, the time complexity of the factorial function is O(n).
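As an illustration (not part of the source), the recurrence can be checked directly by having a Python version of the function count its own multiplications:

```python
def factorial(n):
    """Recursive factorial that also counts multiplications (the basic operation).

    Returns (n!, M(n)), where M(n) = M(n-1) + 1 and M(0) = 0.
    """
    if n == 0:
        return 1, 0                  # initial condition: no multiplications
    value, mults = factorial(n - 1)
    return value * n, mults + 1      # one extra multiplication per level

print(factorial(5))  # (120, 5): five multiplications for n = 5, so M(n) = n
```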

Steps of Designing an Algorithm

Designing an algorithm involves a systematic process:

  1. Understanding the problem: Clearly define the problem and its requirements.
  2. Decide on computational device: Determine the resources available (e.g., memory, processing power).
  3. Exact vs. approximate solving: Decide whether an exact solution is required or if an approximation is acceptable.
  4. Algorithm design technique: Choose an appropriate design technique (e.g., divide and conquer, dynamic programming).
  5. Design the algorithm: Develop a step-by-step procedure to solve the problem.
  6. Prove correctness: Demonstrate that the algorithm produces the desired output for all valid inputs.
  7. Analyze the algorithm: Determine the time and space complexity of the algorithm.
  8. Code the algorithm: Implement the algorithm in a programming language.

General Plan for Analyzing the Time Efficiency of Non-recursive Algorithms

To analyze the time efficiency of a non-recursive algorithm, follow these steps:

  1. Decide on a parameter (or parameters) indicating an input’s size.
  2. Identify the algorithm’s basic operation. (As a rule, it is located in the innermost loop.)
  3. Check whether the number of times the basic operation is executed depends only on the size of an input. If it also depends on some additional property, the worst-case, average-case, and, if necessary, best-case efficiencies have to be investigated separately.
  4. Set up a sum expressing the number of times the algorithm’s basic operation is executed.
  5. Using standard formulas and rules of sum manipulation, either find a closed form formula for the count or, at the very least, establish its order of growth.

Example:

Consider the algorithm to find the maximum value in an array:

ALGORITHM MaxElement(A[0..n-1])
//Determines the value of the largest element in a given array
maxval ← A[0]
for i ← 1 to n-1 do
    if A[i] > maxval
        maxval ← A[i]
return maxval

The basic operation is the comparison A[i] > maxval. It is executed exactly n-1 times, so the time complexity of this algorithm is O(n).
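A minimal Python sketch of the same algorithm, instrumented to count the basic operation (illustrative, not the source's code):

```python
def max_element(a):
    """Return (largest element, number of comparisons) for a non-empty list."""
    maxval = a[0]
    comparisons = 0
    for i in range(1, len(a)):
        comparisons += 1             # basic operation: a[i] > maxval
        if a[i] > maxval:
            maxval = a[i]
    return maxval, comparisons

print(max_element([3, 1, 4, 1, 5, 9, 2]))  # (9, 6): n - 1 comparisons for n = 7
```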

Algorithm: Bubble Sort

The bubble sort algorithm sorts an array by repeatedly stepping through the list, comparing adjacent elements and swapping them if they are in the wrong order.

ALGORITHM BubbleSort(A[0..n-1])
//Sorts a given array by bubble sort
//Input: An array A[0..n-1] of orderable elements
//Output: Array A[0..n-1] sorted in nondecreasing order
for i ← 0 to n-2 do
    for j ← 0 to n-2-i do
        if A[j+1] < A[j]
            swap A[j] and A[j+1]

Time Efficiency:

The time complexity of bubble sort is O(n^2) in the worst and average cases. In the best case, when the array is already sorted, an improved version that stops after a pass with no swaps runs in O(n).

Worst Case:

C(n) = n(n-1)/2 ∈ Θ(n^2)

Best Case:

C(n) = n-1 ∈ Θ(n)
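Both counts can be observed with a short Python version of the improved (early-exit) bubble sort; this is an illustrative sketch, not the source's code:

```python
def bubble_sort(a):
    """Bubble sort with early exit; returns (sorted list, comparison count)."""
    a = list(a)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1         # basic operation: A[j+1] < A[j]
            if a[j + 1] < a[j]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:              # best case: one clean pass, n-1 comparisons
            break
    return a, comparisons

print(bubble_sort([5, 4, 3, 2, 1]))  # worst case: 10 = n(n-1)/2 comparisons
print(bubble_sort([1, 2, 3, 4, 5]))  # best case: 4 = n-1 comparisons
```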

Algorithm: Merge Sort

Merge sort is a divide-and-conquer algorithm that recursively divides the input array into two halves, sorts each half, and then merges the sorted halves.

ALGORITHM Merge(B[0..p-1], C[0..q-1], A[0..p+q-1])
//Merges two sorted arrays into one sorted array
//Input: Arrays B[0..p-1] and C[0..q-1] both sorted
//Output: Sorted array A[0..p+q-1] of the elements of B and C
i ← 0; j ← 0; k ← 0
while i < p and j < q do
    if B[i] ≤ C[j]
        A[k] ← B[i]; i ← i+1
    else
        A[k] ← C[j]; j ← j+1
    k ← k+1
if i = p
    copy C[j..q-1] to A[k..p+q-1]
else copy B[i..p-1] to A[k..p+q-1]

Time Complexity:

The time complexity of merge sort is O(n log n) in all cases (worst, average, and best).

C(n) = n log₂ n - n + 1 ∈ Θ(n log n)
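The divide-and-merge structure can be sketched in Python as follows (illustrative, using lists rather than in-place arrays):

```python
def merge(b, c):
    """Merge two sorted lists into one sorted list."""
    result, i, j = [], 0, 0
    while i < len(b) and j < len(c):
        if b[i] <= c[j]:
            result.append(b[i]); i += 1
        else:
            result.append(c[j]); j += 1
    result.extend(b[i:])             # copy whichever half has leftovers
    result.extend(c[j:])
    return result

def merge_sort(a):
    """Top-down merge sort: split in half, sort each half, merge."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([8, 3, 2, 9, 7, 1, 5, 4]))  # [1, 2, 3, 4, 5, 7, 8, 9]
```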

Algorithm: Insertion Sort

Insertion sort is a simple sorting algorithm that builds the final sorted array one item at a time. It iterates through the input array, taking each element and inserting it into its correct position in the sorted portion of the array.

ALGORITHM InsertionSort(A[0..n-1])
//Sorts a given array by insertion sort
//Input: An array A[0..n-1] of n orderable elements
//Output: Array A[0..n-1] sorted in nondecreasing order
for i ← 1 to n-1 do
    v ← A[i]
    j ← i-1
    while j ≥ 0 and A[j] > v do
        A[j+1] ← A[j]
        j ← j-1
    A[j+1] ← v

Time Complexity:

The time complexity of insertion sort depends on the nature of the input.

Worst Case:

Cworst(n) = n(n-1)/2 ∈ Θ(n^2)

Best Case:

Cbest(n) = n-1 ∈ Θ(n)

Average Case:

Cavg(n) ≈ n^2/4 ∈ Θ(n^2)
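The pseudocode above translates almost line for line into Python (an illustrative sketch, not the source's code):

```python
def insertion_sort(a):
    """Insertion sort: grow a sorted prefix, inserting each element in place."""
    a = list(a)
    for i in range(1, len(a)):
        v = a[i]
        j = i - 1
        while j >= 0 and a[j] > v:   # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = v                 # insert v into its correct position
    return a

print(insertion_sort([89, 45, 68, 90, 29, 34, 17]))
```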

Algorithm: Quick Sort

Quick sort is a divide-and-conquer algorithm that partitions the input array around a pivot element and recursively sorts the subarrays.

ALGORITHM HoarePartition(A[l..r])
//Partitions a subarray by Hoare's algorithm, using the first element as pivot
//Input: Subarray of array A[0..n-1], defined by its left and right indices l and r (l < r)
//Output: Partition of A[l..r], with the split position returned as this function's value
p ← A[l]
i ← l; j ← r+1
repeat
    repeat i ← i+1 until A[i] ≥ p
    repeat j ← j-1 until A[j] ≤ p
    swap(A[i], A[j])
until i ≥ j
swap(A[i], A[j])  //undo the last swap when i ≥ j
swap(A[l], A[j])
return j

Note:

  • Increment i while A[i] < p; stop as soon as A[i] ≥ p.
  • Decrement j while A[j] > p; stop as soon as A[j] ≤ p.
  • If i < j, swap A[i] and A[j] and resume both scans.
  • Once i ≥ j, the scans have crossed: undo the last swap and exchange the pivot A[l] with A[j].
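A Python sketch of the partition-and-recurse scheme (illustrative; the scan bounds are handled with explicit guards rather than repeat-until loops):

```python
def hoare_partition(a, l, r):
    """Partition a[l..r] around pivot a[l]; return the split position."""
    p = a[l]
    i, j = l, r + 1
    while True:
        i += 1
        while i < r and a[i] < p:    # advance i past elements smaller than p
            i += 1
        j -= 1
        while a[j] > p:              # retreat j past elements larger than p
            j -= 1
        if i >= j:                   # scans crossed: stop swapping
            break
        a[i], a[j] = a[j], a[i]
    a[l], a[j] = a[j], a[l]          # put the pivot at the split position
    return j

def quicksort(a, l=0, r=None):
    """Sort a[l..r] in place by recursive partitioning."""
    if r is None:
        r = len(a) - 1
    if l < r:
        s = hoare_partition(a, l, r)
        quicksort(a, l, s - 1)       # sort elements before the split
        quicksort(a, s + 1, r)       # sort elements after the split
    return a

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```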

Algorithm: Heap Sort

Heap sort is a sorting algorithm that uses a binary heap data structure. It builds a max-heap from the input array and then repeatedly extracts the maximum element from the heap, placing it at the end of the sorted array.

Properties of a Heap:

  • Shape Property: A heap is a complete binary tree, meaning all levels are filled except possibly the last level, which is filled from left to right.
  • Parental Dominance: In a max-heap, the key of each node is greater than or equal to the keys of its children. In a min-heap, the key of each node is less than or equal to the keys of its children.

Key Properties of a Heap:

  • For a given n, there exists exactly one essentially complete binary tree with n nodes. Its height is ⌊log₂ n⌋.
  • The root of a heap contains the largest element (in a max-heap) or the smallest element (in a min-heap).
  • A node of a heap considered with all its descendants is also a heap.
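The shape property lets a heap live in a plain array (children of index i at 2i+1 and 2i+2), and parental dominance is restored by sifting down. A Python sketch of heap sort built on these properties (illustrative, not the source's code):

```python
def sift_down(a, i, n):
    """Restore parental dominance for the subtree rooted at index i (heap size n)."""
    while 2 * i + 1 < n:
        child = 2 * i + 1
        if child + 1 < n and a[child + 1] > a[child]:
            child += 1               # pick the larger of the two children
        if a[i] >= a[child]:
            break                    # dominance already holds
        a[i], a[child] = a[child], a[i]
        i = child

def heap_sort(a):
    """Build a max-heap bottom-up, then repeatedly extract the maximum."""
    a = list(a)
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # heapify all non-leaf nodes
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]       # move current max to its final place
        sift_down(a, 0, end)              # re-heapify the shrunken heap
    return a

print(heap_sort([2, 9, 7, 6, 5, 8]))  # [2, 5, 6, 7, 8, 9]
```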

Algorithm: Horspool's Algorithm

The Horspool algorithm is a string-searching algorithm that uses a shift table to efficiently skip over portions of the text that cannot contain the pattern.

ALGORITHM ShiftTable(P[0..m-1])
//Fills the shift table used by Horspool's algorithm
//Input: Pattern P[0..m-1] and an alphabet of possible characters
//Output: Table[0..size-1] indexed by the alphabet's characters
for i ← 0 to size-1 do Table[i] ← m
for j ← 0 to m-2 do Table[P[j]] ← m-1-j
return Table
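A Python sketch of the shift table and the matching loop (illustrative; a dictionary stands in for the alphabet-indexed table, with unseen characters defaulting to a shift of m):

```python
def shift_table(pattern):
    """Horspool shift table: default shift m; otherwise distance from the
    character's last occurrence (excluding the final position) to the end."""
    m = len(pattern)
    table = {}
    for j in range(m - 1):
        table[pattern[j]] = m - 1 - j
    return table

def horspool(pattern, text):
    """Return the index of the first match of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    table = shift_table(pattern)
    i = m - 1                        # index of the window's last character
    while i < n:
        k = 0
        while k < m and pattern[m - 1 - k] == text[i - k]:
            k += 1                   # compare right to left
        if k == m:
            return i - m + 1         # full match found
        i += table.get(text[i], m)   # shift by the text character under the window's end
    return -1

print(horspool("BARBER", "JIM_SAW_ME_IN_A_BARBERSHOP"))  # 16
```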

Complexity Classes

P Problems

P problems are problems that can be solved quickly (in "polynomial time") by a computer. This means there is an algorithm that can find a solution efficiently as the size of the problem grows.

Example: Sorting a list of numbers.

NP Problems

NP problems are problems where, if you are given a solution, you can verify it quickly (in "polynomial time"). However, it’s not necessarily quick to find the solution in the first place.

Example: The Sudoku puzzle.

NP-Complete Problems

NP-complete problems are the toughest problems in the NP category. If you can solve one NP-complete problem quickly, you could solve all NP problems quickly. However, no one knows if a fast solution for NP-complete problems exists.

Example: The Traveling Salesman Problem (TSP).

NP-Hard Problems

NP-hard problems are problems that are at least as hard as NP-complete problems, but they don’t necessarily have to be in NP (meaning we don’t even have to be able to verify a solution quickly). These problems are very challenging and can include problems that might not have a clear "yes" or "no" answer.

Example: The Halting Problem.

Algorithm Design Techniques

Backtracking

Method: Try different choices step-by-step; undo if you hit a dead end.

Use: Solving problems like puzzles and constraint satisfaction.
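The choose/recurse/undo pattern can be sketched with the classic n-queens puzzle (an illustrative example, not from the source):

```python
def n_queens(n):
    """Backtracking: place queens row by row, undoing each dead-end choice."""
    solutions = []
    cols = []                        # cols[r] = column of the queen in row r

    def safe(r, c):
        # No shared column and no shared diagonal with any placed queen.
        return all(c != cc and abs(r - rr) != abs(c - cc)
                   for rr, cc in enumerate(cols))

    def place(r):
        if r == n:
            solutions.append(list(cols))
            return
        for c in range(n):
            if safe(r, c):
                cols.append(c)       # make a choice
                place(r + 1)         # explore its consequences
                cols.pop()           # undo it (backtrack)

    place(0)
    return solutions

print(len(n_queens(4)))  # 2: the two solutions of the 4-queens puzzle
```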

Branch and Bound

Method: Explore branches of possibilities and prune unpromising ones using estimates.

Use: Optimization problems like finding the best route or solution.
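Branch and bound can be sketched with the 0/1 knapsack problem, pruning any branch whose optimistic (fractional) bound cannot beat the best solution found so far (an illustrative example, not from the source):

```python
def knapsack_bb(weights, values, capacity):
    """0/1 knapsack by branch and bound with a fractional upper bound."""
    items = sorted(range(len(weights)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    best = 0

    def bound(k, weight, value):
        # Optimistic estimate: greedily fill the remaining capacity,
        # allowing a fraction of the first item that does not fit.
        for i in items[k:]:
            if weight + weights[i] <= capacity:
                weight += weights[i]
                value += values[i]
            else:
                return value + (capacity - weight) * values[i] / weights[i]
        return value

    def branch(k, weight, value):
        nonlocal best
        best = max(best, value)      # record the value of this partial solution
        if k == len(items) or bound(k, weight, value) <= best:
            return                   # prune: this branch cannot improve on best
        i = items[k]
        if weight + weights[i] <= capacity:
            branch(k + 1, weight + weights[i], value + values[i])  # take item i
        branch(k + 1, weight, value)                               # skip item i

    branch(0, 0, 0)
    return best

print(knapsack_bb([2, 3, 4, 5], [3, 4, 5, 6], 5))  # 7: items of weight 2 and 3
```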