Core Concepts in Data Structures and Algorithms
Queue Implementation with Linked Lists
A queue is implemented using a linked list by maintaining two pointers:
Key Pointers
- Front: Points to the front node of the queue.
- Rear: Points to the last node.
Operations
- Enqueue (Insertion): Create a new node, link it at the end, and update the rear pointer.
- Dequeue (Deletion): Remove the front node and update the front pointer.
Advantages
- Dynamic size, no overflow unless memory is full.
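The two operations above can be sketched as a minimal linked-list queue (class and method names here are illustrative, not from the text):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class Queue:
    def __init__(self):
        self.front = None  # points to the first node
        self.rear = None   # points to the last node

    def enqueue(self, data):
        node = Node(data)
        if self.rear is None:        # empty queue
            self.front = self.rear = node
        else:
            self.rear.next = node    # link the new node at the end
            self.rear = node         # update the rear pointer

    def dequeue(self):
        if self.front is None:
            raise IndexError("dequeue from empty queue")
        data = self.front.data
        self.front = self.front.next  # update the front pointer
        if self.front is None:        # queue became empty
            self.rear = None
        return data
```

Both operations run in O(1) time because no traversal is needed.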
General Tree to Binary Tree Conversion
Conversion Steps
- Left-Child Right-Sibling Representation:
- For each node:
- Make the leftmost child the left child in the binary tree.
- Make the next sibling the right child.
This maintains parent-child and sibling relationships using binary tree structure.
AVL Tree and Balance Factor Explained
An AVL tree is a self-balancing Binary Search Tree (BST). For every node, the difference in heights of left and right subtrees (balance factor) must be:
- -1, 0, or 1.
Balance Factor = Height(left subtree) – Height(right subtree)
Rotations (LL, RR, LR, RL) are used to maintain balance after insertion/deletion.
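The balance factor formula above can be computed directly; this sketch uses the common convention that an empty subtree has height -1 (the node layout is an assumption, not from the text):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    # Convention: an empty subtree has height -1, a leaf has height 0.
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    # Balance Factor = Height(left subtree) - Height(right subtree)
    return height(node.left) - height(node.right)
```

A balance factor outside {-1, 0, 1} signals that a rotation is needed at that node.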
Graph Representations with Examples
- Adjacency Matrix: A 2D array where cell (i, j) is 1 if an edge exists.
- Adjacency List: An array of lists; each list contains neighbors of a vertex.
- Edge List: A list of all edges represented as pairs of vertices.
Example Graph Representation
Graph: A-B, A-C
- Matrix: A=[0,1,1], B=[1,0,0], C=[1,0,0]
- List: A→B→C, B→A, C→A
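The three representations of the A-B, A-C graph can be built as follows (the vertex ordering A, B, C used for matrix indexing is an assumption):

```python
vertices = ["A", "B", "C"]
edges = [("A", "B"), ("A", "C")]   # edge list: pairs of vertices

idx = {v: i for i, v in enumerate(vertices)}
n = len(vertices)

# Adjacency matrix: cell (i, j) is 1 if an edge exists.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[idx[u]][idx[v]] = 1
    matrix[idx[v]][idx[u]] = 1     # undirected: mirror the edge

# Adjacency list: each vertex maps to its list of neighbors.
adj = {v: [] for v in vertices}
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)
```

The matrix uses O(V²) space regardless of edge count; the list uses O(V + E), which is cheaper for sparse graphs.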
BFS Traversal Pseudocode for BST
BFS Algorithm (Pseudocode)
if root is NULL:
    return
create an empty queue Q
enqueue root to Q
while Q is not empty:
    node = dequeue Q
    print node.data
    if node.left is not NULL:
        enqueue node.left
    if node.right is not NULL:
        enqueue node.right
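The pseudocode above translates directly into runnable Python, using a deque as the queue (the node class is an assumed minimal layout):

```python
from collections import deque

class Node:
    def __init__(self, data, left=None, right=None):
        self.data, self.left, self.right = data, left, right

def bfs(root):
    if root is None:
        return []
    order = []
    q = deque([root])            # create an empty queue, enqueue root
    while q:
        node = q.popleft()       # dequeue
        order.append(node.data)  # visit the node
        if node.left is not None:
            q.append(node.left)
        if node.right is not None:
            q.append(node.right)
    return order
```

Returning a list instead of printing makes the traversal easy to test and reuse.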
Hashing and Hash Function Methods
Hashing: Converts keys to indices in a hash table using a hash function.
Hash Function Methods
- Division Method: h(k) = k % m
- Multiplication Method: h(k) = floor(m * (k*A mod 1)) (0 < A < 1)
- Mid-Square Method: Square the key and extract middle bits.
- Folding Method: Split the key, sum the parts.
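The division and multiplication methods above can be sketched as small functions (the table size m = 10 and the constant A ≈ 0.618, Knuth's suggested value, are assumptions for illustration):

```python
import math

def h_division(k, m=10):
    # Division method: h(k) = k % m
    return k % m

def h_multiplication(k, m=10, A=0.6180339887):
    # Multiplication method: h(k) = floor(m * (k*A mod 1)), 0 < A < 1
    return math.floor(m * ((k * A) % 1))
```

Both map any integer key into the index range 0 to m-1.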
General Tree to Binary Tree Conversion Example
Given General Tree
A
├── B
├── C
└── D
Converted Binary Tree
    A
   /
  B
   \
    C
     \
      D
Explanation
B is the left child; C is the right child of B; D is the right child of C.
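The conversion described above can be sketched recursively; the general tree is given here as a dict mapping each node to its list of children (an assumed input format):

```python
class BNode:
    def __init__(self, data):
        self.data = data
        self.left = None   # will hold the leftmost child
        self.right = None  # will hold the next sibling

def convert(root, children):
    node = BNode(root)
    prev = None
    for child in children.get(root, []):
        b = convert(child, children)
        if prev is None:
            node.left = b    # leftmost child becomes the left child
        else:
            prev.right = b   # next sibling becomes the right child
        prev = b
    return node
```

Following left links descends to children; following right links walks across siblings.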
Tree Traversal Methods and Examples
- Inorder (LNR): Left, Node, Right
- Preorder (NLR): Node, Left, Right
- Postorder (LRN): Left, Right, Node
- Level Order (BFS): Level by level using a queue
Example Tree
  A
 / \
B   C
- Inorder: B A C
- Preorder: A B C
- Postorder: B C A
- Level Order: A B C
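The three recursive traversals can be sketched compactly on the example tree (level order is covered by the BFS pseudocode earlier; the node class is an assumed minimal layout):

```python
class Node:
    def __init__(self, data, left=None, right=None):
        self.data, self.left, self.right = data, left, right

def inorder(n):    # Left, Node, Right
    return inorder(n.left) + [n.data] + inorder(n.right) if n else []

def preorder(n):   # Node, Left, Right
    return [n.data] + preorder(n.left) + preorder(n.right) if n else []

def postorder(n):  # Left, Right, Node
    return postorder(n.left) + postorder(n.right) + [n.data] if n else []
```

On a BST, inorder traversal visits the keys in sorted order.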
Binary Tree Definition and Example
A binary tree is a tree where each node has at most two children.
Example Binary Tree
   10
  /  \
 5    15
- Root: 10
- Left Child: 5
- Right Child: 15
Distinguishing NULL and VOID
- NULL: A macro representing a null pointer (0, or nullptr in C++).
- void: A keyword indicating no value (used as a function return type or for generic pointers).
Applications of Graph Data Structures
- Social Networks
- Google Maps (Shortest Path)
- Network Routing Algorithms
- Dependency Graphs (Build Systems)
- Web Crawlers
- AI (Game Trees, State Graphs)
Rehashing vs. Extendible Hashing
- Rehashing: When the load factor exceeds a limit, a new hash table (larger size) is created and all elements are reinserted using a new hash function.
- Extendible Hashing: Uses a directory of pointers to buckets; directory size doubles only when needed, minimizing rehashing. Ideal for databases.
AVL Tree Abstract Data Type (ADT)
AVL Tree ADT includes:
- Structure: BST with a balance condition.
- Balance Factor: -1, 0, +1
- Operations:
- Insertion (with rotations if needed)
- Deletion (with rebalancing)
- Searching
- Rotations: LL, RR, LR, RL
- Advantages: Faster search due to balance.
- Time Complexity:
- Search/Insert/Delete: O(log n)
Binary Search Algorithm with Example
Binary Search Algorithm
Binary Search is an efficient algorithm to find an element in a sorted array. It works by repeatedly dividing the search interval in half.
Algorithm (Iterative Approach)
BinarySearch(arr, key):
    low = 0
    high = length(arr) - 1
    while low <= high:
        mid = floor((low + high) / 2)
        if arr[mid] == key:
            return mid
        else if arr[mid] < key:
            low = mid + 1
        else:
            high = mid - 1
    return -1   // key not found
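The pseudocode above translates directly into runnable Python; note that mid must use floor (integer) division:

```python
def binary_search(arr, key):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2   # floor division keeps mid an integer
        if arr[mid] == key:
            return mid
        elif arr[mid] < key:
            low = mid + 1         # search the right half
        else:
            high = mid - 1        # search the left half
    return -1                     # key not found
```

The array must already be sorted for the halving argument to hold.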
Example
Let arr = [10, 20, 30, 40, 50, 60, 70], and we want to search for 50.
Step-by-step Execution
- Step 1: low = 0, high = 6 → mid = (0 + 6) / 2 = 3 → arr[3] = 40. Since 50 > 40, search the right half.
- Step 2: low = 4, high = 6 → mid = (4 + 6) / 2 = 5 → arr[5] = 60. Since 50 < 60, search the left half.
- Step 3: low = 4, high = 4 → mid = (4 + 4) / 2 = 4 → arr[4] = 50.
Match found at index 4.
Time Complexity
- Best case: O(1) (element at mid)
- Worst and Average case: O(log n)
Asymptotic Notations for Algorithm Analysis
Asymptotic notations are mathematical tools to describe the running time or space complexity of an algorithm in terms of input size n, particularly as n becomes large. The main types are:
1. Big O Notation (O) – Upper Bound
- Describes the worst-case time complexity.
- Represents the maximum time taken by the algorithm for any input.
- It provides an upper limit.
Example
If an algorithm takes at most 5n + 2 steps, it’s O(n).
Notation
If f(n) = O(g(n)), then there exist constants c and n₀ such that:
f(n) ≤ c * g(n) for all n ≥ n₀
2. Omega Notation (Ω) – Lower Bound
- Describes the best-case time complexity.
- Represents the minimum time taken.
- It provides a lower limit.
Example
If an algorithm takes at least 3n steps in the best case, it’s Ω(n).
Notation
If f(n) = Ω(g(n)), then there exist constants c and n₀ such that:
f(n) ≥ c * g(n) for all n ≥ n₀
3. Theta Notation (Θ) – Tight Bound
- Describes a tight bound on the running time.
- Applies when the upper and lower bounds have the same growth rate.
- Gives the exact order of growth of the algorithm.
Example
If an algorithm always takes 4n + 3 steps, then it’s Θ(n).
Notation
If f(n) = Θ(g(n)), then there exist constants c₁, c₂ and n₀ such that:
c₁ * g(n) ≤ f(n) ≤ c₂ * g(n) for all n ≥ n₀
Insertion Sort and Selection Sort Complexity
1. Insertion Sort
Algorithm Idea
Builds the final sorted array one element at a time by comparing and inserting each element into its correct position in the already sorted part of the array.
Example
Input: [5, 3, 4, 1, 2]
Step-by-step
- Start from index 1: Compare 3 with 5 → insert before → [3, 5, 4, 1, 2]
- Next, 4 with 5 → insert before 5 → [3, 4, 5, 1, 2]
- Next, 1 → move 5, 4, 3 → insert at beginning → [1, 3, 4, 5, 2]
- Next, 2 → insert between 1 and 3 → [1, 2, 3, 4, 5]
Time Complexity
- Best Case (Already Sorted): O(n)
- Average Case: O(n²)
- Worst Case (Reverse Sorted): O(n²)
Space Complexity
- O(1) — In-place sorting, no extra memory needed.
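The insertion procedure described above can be sketched as follows, shifting larger elements right to open a slot for each key:

```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:   # shift larger elements right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                 # insert key into its position
    return arr
```

On an already sorted input the inner while loop never runs, giving the O(n) best case.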
2. Selection Sort
Algorithm Idea
Repeatedly finds the minimum element from the unsorted part and places it at the beginning.
Example
Input: [5, 3, 4, 1, 2]
Step-by-step
- Find min (1) → swap with 5 → [1, 3, 4, 5, 2]
- Find min (2) in remaining → swap with 3 → [1, 2, 4, 5, 3]
- Next min (3) → swap with 4 → [1, 2, 3, 5, 4]
- Next min (4) → swap with 5 → [1, 2, 3, 4, 5]
Time Complexity
- Best, Average, and Worst Case: O(n²) — always performs the same number of comparisons.
Space Complexity
- O(1) — In-place sorting, no extra memory used.
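The selection procedure described above can be sketched as follows, growing the sorted prefix one swap at a time:

```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):        # find minimum in unsorted part
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]  # place it at the front
    return arr
```

The nested loops always run in full, which is why all cases are O(n²), but the algorithm does at most n - 1 swaps.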