Sorting Algorithms
Selection Sort
Intuition
- The selection sort algorithm sorts an array by repeatedly finding the minimum element from the unsorted part and putting it at the beginning.
- The largest element will end up at the last index of the array.
Approach
- Select the starting index of the unsorted part using an outer loop with i from 0 to n-2; after those passes the last element is already in its correct place.
- Find the smallest element in the range from i to n-1 using an inner loop.
- Swap this smallest element with the element at index i.
- Repeat the process for the next starting index.
Algorithm
class Solution {
    public int[] selectionSort(int[] nums) {
        // Move the boundary of the unsorted part one step to the right each iteration
        for (int i = 0; i < nums.length - 1; i++) {
            int min = i;
            // Find the index of the smallest element in nums[i..n-1]
            for (int j = i + 1; j < nums.length; j++) {
                if (nums[j] < nums[min]) min = j;
            }
            // Swap the smallest element into position i
            int temp = nums[i];
            nums[i] = nums[min];
            nums[min] = temp;
        }
        return nums;
    }
}
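Usage (a minimal sketch; the demo class name and sample values are arbitrary, and it assumes the Solution class above is compiled alongside):
class SelectionSortDemo {
    public static void main(String[] args) {
        int[] nums = {64, 25, 12, 22, 11};
        int[] sorted = new Solution().selectionSort(nums);
        // Prints: [11, 12, 22, 25, 64]
        System.out.println(java.util.Arrays.toString(sorted));
    }
}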
Time & Space Complexity
- Time Complexity: O(N²), where N is the length of the input array. The outer loop runs through each element, and the inner loop finds the smallest element in the unsorted portion of the array.
- Space Complexity: O(1), as it is an in-place sorting algorithm and does not require additional storage proportional to the input size.
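To see where O(N²) comes from, note that the inner loop performs (N-1) + (N-2) + ... + 1 = N(N-1)/2 comparisons in total, regardless of the input order. A small sketch that counts them (the counter and sample values are added only for illustration):
class SelectionSortComparisonCount {
    public static void main(String[] args) {
        int[] nums = {5, 3, 8, 1, 9, 2};
        int n = nums.length;
        long comparisons = 0;
        for (int i = 0; i < n - 1; i++) {
            int min = i;
            for (int j = i + 1; j < n; j++) {
                comparisons++;   // one comparison per inner-loop step
                if (nums[j] < nums[min]) min = j;
            }
            int temp = nums[i]; nums[i] = nums[min]; nums[min] = temp;
        }
        // For n = 6 this prints 15, which equals n * (n - 1) / 2
        System.out.println(comparisons);
    }
}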
Bubble Sort
Intuition
- The bubble sort algorithm sorts an array by repeatedly swapping adjacent elements if they are in the wrong order.
- The largest elements "bubble" to the end of the array with each pass.
Approach
- Run an outer loop with i from n-1 down to 0.
- Run an inner loop with j from 0 to i-1.
- If arr[j] > arr[j+1], swap them.
- Continue until the array is sorted.
Note: After each pass, the array becomes sorted from index i up to the last index, so the last index of the unsorted range decreases by 1 after each iteration. This decrement is managed by the outer loop, where the last index of the range is represented by the variable i. The inner loop (variable j) pushes the maximum element of the range [0...i] to the last index of that range (i.e., index i); the trace sketch after the code below illustrates this.
Algorithm
class Solution {
    public int[] bubbleSort(int[] nums) {
        int n = nums.length;
        for (int i = n - 1; i >= 0; i--) {
            boolean isSwapped = false;
            // Push the largest element of nums[0..i] to index i
            for (int j = 0; j <= i - 1; j++) {
                if (nums[j] > nums[j + 1]) {
                    int temp = nums[j];
                    nums[j] = nums[j + 1];
                    nums[j + 1] = temp;
                    isSwapped = true;
                }
            }
            // If no swaps happened, the array is already sorted
            if (!isSwapped) break;
        }
        return nums;
    }
}
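To see the note above in action, here is a small trace sketch (the print statement and sample values are added only for illustration). It shows the largest element of the remaining range settling at index i after each pass, and the early exit once a pass makes no swaps:
class BubbleSortTrace {
    public static void main(String[] args) {
        int[] nums = {5, 1, 4, 2, 8};
        int n = nums.length;
        for (int i = n - 1; i >= 0; i--) {
            boolean isSwapped = false;
            for (int j = 0; j <= i - 1; j++) {
                if (nums[j] > nums[j + 1]) {
                    int temp = nums[j]; nums[j] = nums[j + 1]; nums[j + 1] = temp;
                    isSwapped = true;
                }
            }
            // Prints [1, 4, 2, 5, 8], then [1, 2, 4, 5, 8], then stops after a swap-free pass
            System.out.println("after pass for i = " + i + ": " + java.util.Arrays.toString(nums));
            if (!isSwapped) break;   // no swaps: the whole array is already sorted
        }
    }
}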
Time & Space Complexity
- Time Complexity: O(N²) for the worst and average cases and O(N) for the best case (an already sorted array, thanks to the early-exit check). Here, N is the size of the array.
- Space Complexity: O(1), because Bubble Sort is an in-place sorting algorithm, meaning it only requires a constant amount of extra space for its operations, regardless of the size of the input array.
Insertion Sort
Intuition
- Insertion sort builds a sorted array one element at a time by repeatedly picking the next element and inserting it into its correct position within the already sorted part of the array.
Approach
- In each iteration, select an element from the unsorted part of the array using an outer loop.
- Place this element in its correct position within the sorted part of the array.
- Use an inner loop to shift the larger elements of the sorted part one position to the right until the selected element can be placed in its correct position.
- Continue this process until the entire array is sorted.
Algorithm
class Solution {
    public int[] insertionSort(int[] nums) {
        int n = nums.length;
        // Grow the sorted prefix one element at a time
        for (int i = 1; i < n; i++) {
            int key = nums[i];
            int j = i - 1;
            // Shift elements greater than key one position to the right
            while (j >= 0 && nums[j] > key) {
                nums[j + 1] = nums[j];
                j = j - 1;
            }
            // Place key into its correct position in the sorted prefix
            nums[j + 1] = key;
        }
        return nums;
    }
}
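Usage (a minimal sketch; the demo class name and sample values are arbitrary, and it assumes the Solution class above is compiled alongside):
class InsertionSortDemo {
    public static void main(String[] args) {
        int[] nums = {12, 11, 13, 5, 6};
        int[] sorted = new Solution().insertionSort(nums);
        // Prints: [5, 6, 11, 12, 13]
        System.out.println(java.util.Arrays.toString(sorted));
    }
}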
Time & Space Complexity
- Time Complexity: O(N²) for the worst and average cases, where N is the size of the array. The outer loop runs N times, and for each pass the inner loop runs up to N times as well, resulting in approximately N × N operations, hence O(N²). The best case occurs when the array is already sorted: the inner loop doesn't run at all, leading to a time complexity of O(N).
- Space Complexity: O(1), because Insertion Sort is an in-place sorting algorithm, meaning it sorts the array by modifying the original array without using additional data structures that grow with the size of the input.
Merge Sort
Intuition
- Merge Sort is a powerful sorting algorithm that follows the divide-and-conquer approach. The array is divided into two halves until each sub-array contains only one element, and each pair of smaller sorted arrays is then merged into a larger sorted array.
- The algorithm consists of two main functions:
  - merge(): merges the two halves of the array, assuming both parts are already sorted.
  - mergeSort(): divides the array into two parts, low to mid and mid+1 to high, where low is the leftmost index, high is the rightmost index, and mid is the middle index of the range.
- By repeating these steps recursively, Merge Sort efficiently sorts the entire array.
Approach
To implement Merge Sort, we will create two functions: mergeSort() and merge().
mergeSort(arr[], low, high)
- Divide the Array: Split the given array into two halves by splitting the range. For any range from low to high, the splits will be low to mid and mid+1 to high, where mid = (low + high) / 2. This process continues until the range size is 1.
- Recursive Division: In mergeSort(), divide the array around the middle index by making recursive calls: mergeSort(arr, low, mid) for the left half and mergeSort(arr, mid+1, high) for the right half. Here, low is the leftmost index, high is the rightmost index, and mid is the middle index of the array.
- Base Case: To complete the recursive function, define the base case. The recursion ends when the array has only one element left, meaning low and high are the same, pointing to a single element. If low >= high, the function returns.
merge(arr[], low, mid, high)
- Use a temporary array to store the elements of the two sorted halves after merging. The range of the left half is from low to mid and the range of the right half is from mid+1 to high.
- Use two pointers, left starting from low and right starting from mid+1. Using a while loop (while left <= mid && right <= high), compare the elements from each half and insert the smaller one into the temporary array. After the loop, copy any leftover elements from either half into the temporary array.
- Transfer the elements from the temporary array back to the original array in the range low to high.
- This approach ensures that the array is efficiently sorted using the divide-and-conquer strategy of Merge Sort.
Algorithm
import java.util.*;

class Solution {
    // Merge the two sorted halves nums[low..mid] and nums[mid+1..high]
    public void merge(int[] nums, int low, int mid, int high) {
        List<Integer> li = new ArrayList<>();
        int left = low;       // pointer into the left half
        int right = mid + 1;  // pointer into the right half
        // Pick the smaller front element from either half
        while (left <= mid && right <= high) {
            if (nums[left] <= nums[right]) {
                li.add(nums[left]);
                left++;
            } else {
                li.add(nums[right]);
                right++;
            }
        }
        // Copy any leftover elements from the left half
        while (left <= mid) {
            li.add(nums[left]);
            left++;
        }
        // Copy any leftover elements from the right half
        while (right <= high) {
            li.add(nums[right]);
            right++;
        }
        // Copy the merged result back into nums[low..high]
        for (int i = 0; i < li.size(); i++) {
            nums[low + i] = li.get(i);
        }
    }

    public void mergeSortHelper(int[] nums, int low, int high) {
        if (low >= high) return;     // base case: a single element is already sorted
        int mid = (low + high) / 2;
        mergeSortHelper(nums, low, mid);
        mergeSortHelper(nums, mid + 1, high);
        merge(nums, low, mid, high);
    }

    public int[] mergeSort(int[] nums) {
        mergeSortHelper(nums, 0, nums.length - 1);
        return nums;
    }
}
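Usage (a minimal sketch; the demo class name and sample values are arbitrary, and it assumes the Solution class above is compiled alongside):
class MergeSortDemo {
    public static void main(String[] args) {
        int[] nums = {38, 27, 43, 3, 9, 82, 10};
        int[] sorted = new Solution().mergeSort(nums);
        // Prints: [3, 9, 10, 27, 38, 43, 82]
        System.out.println(java.util.Arrays.toString(sorted));
    }
}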
Time & Space Complexity
- Time Complexity: O(N log N). The array is repeatedly divided into halves, giving about log N levels of recursion, and merging all the sub-arrays at each level takes O(N) work, so the overall time complexity is O(N log N).
- Space Complexity: O(N), because a temporary array is used to store the elements in sorted order during each merge.
Quick Sort
Intuition
Quick Sort is a divide-and-conquer algorithm like Merge Sort. However, unlike Merge Sort, Quick Sort does not use an extra array for sorting (though it does use auxiliary recursion stack space). This makes Quick Sort slightly better than Merge Sort from a space perspective.
This algorithm follows two simple steps repeatedly:
- Pick a pivot and place it in its correct position in the sorted array.
- Move smaller elements (i.e., smaller than the pivot) to the left of the pivot and larger ones to the right.
To summarize: The main goal is to place the pivot at its final position in each recursion call, where it should be in the final sorted array.
Approach
To implement Quick Sort, we will create two functions: quickSort() and partition().
quickSort(arr[], low, high)
- Initial Setup: The low pointer points to the first index, and the high pointer points to the last index of the array.
- Partitioning: Use the partition() function to get the index where the pivot should be placed after sorting. This index, called the partition index, separates the left and right unsorted subarrays.
- Recursive Calls: After placing the pivot at the partition index, recursively call quickSort() for the left and right subarrays. The range of the left subarray will be [low to partition index - 1] and the range of the right subarray will be [partition index + 1 to high].
- Base Case: The recursion continues until the range contains at most one element (low >= high).
partition(arr[], low, high)
- Select a pivot (e.g., arr[low]).
- Use two pointers, i starting at low and j starting at high. Move i forward to find an element greater than the pivot, and move j backward to find an element smaller than the pivot, keeping i <= high - 1 and j >= low + 1.
- If i < j, swap arr[i] and arr[j].
- Continue until the pointers cross (j < i).
- Swap the pivot (arr[low]) with arr[j] and return j as the partition index.
This approach ensures that Quick Sort efficiently sorts the array using the divide-and-conquer strategy.
Algorithm
class Solution {
    // Place the pivot (nums[low]) at its correct position and return that index
    public int findPivotIndex(int[] nums, int low, int high) {
        int pivot = nums[low];
        int i = low;
        int j = high;
        while (i < j) {
            // Move i right until an element greater than the pivot is found
            while (i <= high - 1 && nums[i] <= pivot) i++;
            // Move j left until an element smaller than the pivot is found
            while (j >= low + 1 && nums[j] >= pivot) j--;
            // Swap the out-of-place pair if the pointers have not crossed
            if (i < j) {
                int temp = nums[i];
                nums[i] = nums[j];
                nums[j] = temp;
            }
        }
        // Put the pivot at its final position (index j)
        int temp = nums[low];
        nums[low] = nums[j];
        nums[j] = temp;
        return j;
    }

    public void quickSortHelper(int[] nums, int low, int high) {
        if (low >= high) return;   // base case: zero or one element
        int pivotIndex = findPivotIndex(nums, low, high);
        quickSortHelper(nums, low, pivotIndex - 1);
        quickSortHelper(nums, pivotIndex + 1, high);
    }

    public int[] quickSort(int[] nums) {
        quickSortHelper(nums, 0, nums.length - 1);
        return nums;
    }
}
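Usage (a minimal sketch; the demo class name and sample values are arbitrary, and it assumes the Solution class above is compiled alongside):
class QuickSortDemo {
    public static void main(String[] args) {
        int[] nums = {4, 6, 2, 5, 7, 9, 1, 3};
        int[] sorted = new Solution().quickSort(nums);
        // Prints: [1, 2, 3, 4, 5, 6, 7, 9]
        System.out.println(java.util.Arrays.toString(sorted));
    }
}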
Time & Space Complexity
- Time Complexity: O(N log N) for the best and average cases, where N is the size of the array. On average, the partition splits the range roughly in half, giving about log N levels of recursion, and the partition() work across each level takes O(N), so the overall cost is N * log N. The following recurrence relation can be written for Quick Sort:
F(n) = F(k) + F(n-1-k) + O(n)
Here, k is the number of elements smaller than or equal to the pivot, n-1-k is the number of elements greater than the pivot, and O(n) is the cost of the partition step.
There can be 2 cases:
Worst Case: This case occurs when the pivot is always the greatest or smallest element of the range, which happens, for example, when the array is already sorted in increasing or decreasing order and the first element is chosen as the pivot. One side of every partition is then empty.
Recurrence:
F(n) = F(0) + F(n-1) + O(n)
Worst Case Time Complexity: O(N²)
Best Case: This case occurs when the pivot is the middle element or near the middle element of the range, so each partition splits it into two nearly equal halves.
Recurrence:
F(n) = 2F(n/2) + O(n)
Time Complexity for the best and average case: O(N log N)
- Space Complexity: O(1) extra space plus the auxiliary recursion stack, which is O(log N) on average and O(N) in the worst case, where N is the size of the array.
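To make the worst case concrete: on an already sorted array, the first-element pivot never moves, so every call to findPivotIndex() returns low and only the right side shrinks, by a single element per call. A small sketch illustrating this (the demo class, loop, and print statement are added only for illustration, and it reuses findPivotIndex() from the Solution class above):
class QuickSortWorstCaseDemo {
    public static void main(String[] args) {
        int[] nums = {1, 2, 3, 4, 5};   // already sorted: worst case for a first-element pivot
        Solution solution = new Solution();
        int low = 0, high = nums.length - 1;
        while (low < high) {
            int pivotIndex = solution.findPivotIndex(nums, low, high);
            // Prints partition index 0, then 1, then 2, then 3: every split is maximally unbalanced
            System.out.println("partition index = " + pivotIndex);
            low = pivotIndex + 1;        // the left subarray is always empty here
        }
    }
}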