Sorting, Space, and Time Complexity
public class InsertionSort {
    public static void main(String[] args) {
        int[] arr = {5, 2, 8, 1, 9, 7, 4, 3, 6};
        System.out.print("Before Sort: ");
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
        // call insertion sort method
        insertionSort(arr);
        System.out.println();
        System.out.print("After Sort: ");
        // print sorted array
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
    }

    public static void insertionSort(int[] arr) {
        int n = arr.length;
        for (int i = 1; i < n; i++) {
            int key = arr[i];
            int j = i - 1;
            /* Move elements of arr[0..i-1] that are greater than key
               one position ahead of their current position */
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j = j - 1;
            }
            arr[j + 1] = key;
        }
    }
}
InsertionSort.main(null);
Big O Analysis: O(n^2)
- The worst-case time complexity of insertion sort is O(n^2), where n is the number of elements in the array. As the size of the input array grows, the running time grows quadratically.
- In the worst case, when the input array is sorted in reverse order, the element at index i must be compared with and shifted past all i elements before it to reach its correct position. Summed over the array, this gives 1 + 2 + ... + (n-1) = n(n-1)/2 comparisons and shifts, which is O(n^2).
- In the best case, when the input array is already sorted, the algorithm only needs to make n-1 comparisons and no shifts, giving a time complexity of O(n). (The instrumented sketch after this list checks both counts on a 100-element array.)
- In practice, insertion sort is often used for small arrays or as a subroutine in other sorting algorithms, such as quicksort or merge sort, where it can sort small subarrays efficiently.
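As a rough check on these counts, here is a small instrumented variant of insertion sort (the class name InsertionSortCount and the method sortAndCount are illustrative, not part of the lesson code) that tallies comparisons for a reverse-sorted and an already-sorted array of 100 elements; the totals should land at n(n-1)/2 = 4950 and n-1 = 99 respectively.
public class InsertionSortCount {
    // insertion sort that also counts how many comparisons it performs
    public static long sortAndCount(int[] arr) {
        long comparisons = 0;
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            while (j >= 0) {
                comparisons++;           // one comparison of arr[j] against key
                if (arr[j] > key) {
                    arr[j + 1] = arr[j]; // shift the larger element right
                    j--;
                } else {
                    break;
                }
            }
            arr[j + 1] = key;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 100;
        int[] reversed = new int[n];
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) {
            reversed[i] = n - i; // worst case: strictly decreasing
            sorted[i] = i;       // best case: already sorted
        }
        // worst case should be n(n-1)/2 = 4950; best case should be n-1 = 99
        System.out.println("Reverse-sorted comparisons: " + sortAndCount(reversed));
        System.out.println("Already-sorted comparisons: " + sortAndCount(sorted));
    }
}
InsertionSortCount.main(null);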
Merge Sort
public class MergeSort {
    public static void main(String[] args) {
        int[] arr = {5, 2, 8, 1, 9, 7, 4, 3, 6};
        System.out.print("Before Sort: ");
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
        // call merge sort method
        mergeSort(arr, 0, arr.length - 1);
        System.out.println();
        System.out.print("After Sort: ");
        // print sorted array
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
    }

    public static void mergeSort(int[] arr, int left, int right) {
        if (left < right) {
            int mid = (left + right) / 2;
            // sort left and right halves recursively
            mergeSort(arr, left, mid);
            mergeSort(arr, mid + 1, right);
            // merge the sorted halves
            merge(arr, left, mid, right);
        }
    }

    public static void merge(int[] arr, int left, int mid, int right) {
        int n1 = mid - left + 1;
        int n2 = right - mid;
        // create temporary arrays
        int[] L = new int[n1];
        int[] R = new int[n2];
        // copy data to temporary arrays
        for (int i = 0; i < n1; i++) {
            L[i] = arr[left + i];
        }
        for (int j = 0; j < n2; j++) {
            R[j] = arr[mid + 1 + j];
        }
        // merge the temporary arrays back into arr
        int i = 0, j = 0, k = left;
        while (i < n1 && j < n2) {
            if (L[i] <= R[j]) {
                arr[k] = L[i];
                i++;
            } else {
                arr[k] = R[j];
                j++;
            }
            k++;
        }
        // copy any remaining elements from L and R
        while (i < n1) {
            arr[k] = L[i];
            i++;
            k++;
        }
        while (j < n2) {
            arr[k] = R[j];
            j++;
            k++;
        }
    }
}
MergeSort.main(null);
Big O Analysis: O(n*log(n))
- The worst-case time complexity of merge sort is O(n*log(n)), where n is the number of elements in the array. As the size of the input array increases, the running time grows roughly in proportion to n times the logarithm of n.
- Merge sort works by recursively dividing the input array into halves until each subarray contains only one element. It then merges the sorted subarrays back together, repeatedly comparing the front element of each subarray and moving the smaller one into the merged result, until the entire array is sorted.
- The key to merge sort's efficiency is that the merge step combines two sorted subarrays into a single sorted array in O(n) time, where n is the total number of elements in both subarrays. The recursive halving produces O(log(n)) levels of recursion, and each level performs O(n) work in total, so the overall time complexity is O(n*log(n)). (The small counting sketch after this list tallies the merge work for an array of 1024 elements.)
- In practice, merge sort is a highly efficient algorithm for sorting large arrays or lists, and is widely used in industry and academia for its stability, predictability, and scalability.
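To make the O(n) merge work per level concrete, here is a minimal counting sketch (MergeWorkCount is an illustrative name, not part of the lesson code) that records how many elements are written back during merges. For n = 1024, a power of two, the recursion tree has exactly log2(n) = 10 merge levels, so the count should come out to n*log2(n) = 10240.
public class MergeWorkCount {
    static long copies = 0; // total elements written back during merges

    static void mergeSort(int[] a, int lo, int hi) {
        if (lo < hi) {
            int mid = (lo + hi) / 2;
            mergeSort(a, lo, mid);
            mergeSort(a, mid + 1, hi);
            merge(a, lo, mid, hi);
        }
    }

    static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo + 1];
        int i = lo, j = mid + 1, k = 0;
        while (i <= mid && j <= hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i <= mid) tmp[k++] = a[i++];
        while (j <= hi) tmp[k++] = a[j++];
        for (int t = 0; t < tmp.length; t++) {
            a[lo + t] = tmp[t];
            copies++; // each merge writes every element of its range exactly once
        }
    }

    public static void main(String[] args) {
        int n = 1024; // power of two, so the recursion tree has exactly 10 merge levels
        int[] a = new int[n];
        java.util.Random r = new java.util.Random(42);
        for (int i = 0; i < n; i++) a[i] = r.nextInt(n);
        mergeSort(a, 0, n - 1);
        // expect copies == n * log2(n) = 10240, matching the O(n*log(n)) bound
        System.out.println("Elements copied during merges: " + copies + " (n*log2(n) = " + (n * 10) + ")");
    }
}
MergeWorkCount.main(null);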
Bubble Sort
public class BubbleSort {
    public static void main(String[] args) {
        int[] arr = {5, 2, 8, 1, 9, 7, 4, 3, 6};
        System.out.print("Before Sort: ");
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
        // call bubble sort method
        bubbleSort(arr);
        System.out.println();
        System.out.print("After Sort: ");
        // print sorted array
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
    }

    public static void bubbleSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    // swap arr[j] and arr[j+1]
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                }
            }
        }
    }
}
BubbleSort.main(null);
Big O Analysis: O(n^2)
- The worst-case time complexity of bubble sort is O(n^2), where n is the number of elements in the array being sorted.
- In the worst case, when the input array is in reverse order, bubble sort performs n-1 passes, and pass i makes n-i-1 comparisons, each of which triggers a swap. The total is therefore (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons and swaps; dropping the lower-order terms and the constant coefficient leaves O(n^2).
- Bubble sort is not very efficient for sorting large arrays or lists, as its running time grows quadratically with the size of the input. However, it is simple and easy to understand, and can be useful for small datasets or for educational purposes; in practice, more efficient sorting algorithms like merge sort, quicksort, or heapsort are used for larger datasets. A common refinement, sketched after this list, stops as soon as a pass makes no swaps, which gives an O(n) best case on already-sorted input.
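The sketch below shows that early-exit variant (BubbleSortEarlyExit is an illustrative name, not part of the lesson code): a "swapped" flag is cleared at the start of each pass, and if a full pass makes no swaps the array is already sorted and the remaining passes are skipped. The average and worst cases remain O(n^2).
public class BubbleSortEarlyExit {
    // bubble sort with a "swapped" flag: stop once a pass makes no swaps
    public static void bubbleSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    // swap arr[j] and arr[j+1]
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }
            if (!swapped) {
                break; // no swaps in this pass, so the array is sorted
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {1, 2, 3, 4, 5, 6, 7, 8, 9}; // already sorted: one pass, then exit
        bubbleSort(arr);
        for (int x : arr) {
            System.out.print(x + " ");
        }
        System.out.println();
    }
}
BubbleSortEarlyExit.main(null);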
Selection Sort
public class SelectionSort {
    public static void main(String[] args) {
        int[] arr = {5, 2, 8, 1, 9, 7, 4, 3, 6};
        System.out.print("Before Sort: ");
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
        // call selection sort method
        selectionSort(arr);
        System.out.println();
        System.out.print("After Sort: ");
        // print sorted array
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
    }

    public static void selectionSort(int[] arr) {
        int n = arr.length;
        // traverse the array
        for (int i = 0; i < n - 1; i++) {
            // find the minimum element in the unsorted part of the array
            int minIdx = i;
            for (int j = i + 1; j < n; j++) {
                if (arr[j] < arr[minIdx]) {
                    minIdx = j;
                }
            }
            // swap the minimum element with the first element in the unsorted part of the array
            int temp = arr[minIdx];
            arr[minIdx] = arr[i];
            arr[i] = temp;
        }
    }
}
SelectionSort.main(null);
Big O Analysis: O(n^2)
- The time complexity of selection sort is O(n^2), where n is the number of elements in the array being sorted.
- Unlike bubble sort, selection sort does the same amount of comparison work regardless of the input order: it always performs n-1 passes, and pass i scans the remaining unsorted portion with n-i-1 comparisons to find the minimum, followed by a single swap. The total is (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons, which is O(n^2) in the best, average, and worst cases, with at most n-1 swaps overall. (The comparison counter sketched after this list prints the same count for sorted and reverse-sorted input.)
- Selection sort is not very efficient for sorting large arrays or lists, as its running time grows quadratically with the size of the input. However, it is simple and easy to understand, and can be useful for small datasets or for educational purposes; in practice, more efficient sorting algorithms like merge sort, quicksort, or heapsort are used for larger datasets.
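The sketch below (SelectionSortCount and sortAndCount are illustrative names, not part of the lesson code) counts comparisons and shows that selection sort performs exactly n(n-1)/2 = 4950 of them on a 100-element array whether the input is already sorted or in reverse order.
public class SelectionSortCount {
    // selection sort that counts comparisons; the count does not depend on input order
    public static long sortAndCount(int[] arr) {
        long comparisons = 0;
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            int minIdx = i;
            for (int j = i + 1; j < n; j++) {
                comparisons++; // one comparison of arr[j] against the current minimum
                if (arr[j] < arr[minIdx]) {
                    minIdx = j;
                }
            }
            // one swap per pass, regardless of how many comparisons it took
            int temp = arr[minIdx];
            arr[minIdx] = arr[i];
            arr[i] = temp;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 100;
        int[] sorted = new int[n];
        int[] reversed = new int[n];
        for (int i = 0; i < n; i++) {
            sorted[i] = i;
            reversed[i] = n - i;
        }
        // both should print n(n-1)/2 = 4950 comparisons
        System.out.println("Sorted input comparisons:   " + sortAndCount(sorted));
        System.out.println("Reversed input comparisons: " + sortAndCount(reversed));
    }
}
SelectionSortCount.main(null);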
Hashmaps
- this hashmap code generates an array of 5000 random integers between 0 and 4999, and then inserts each integer into a HashMap as both key and value
- it measures the time it takes to look up the value 40 in the hashmap, and the time it takes to quick-sort the array and then binary search it for 40 (note that the sort runs inside the timed region)
- the output of the program displays the time each search took in nanoseconds
import java.util.HashMap;
import java.util.Random;

public class Hash {
    public static void main(String[] args) {
        // Create a new hashmap and list
        HashMap<Integer, Integer> hashmap = new HashMap<>();
        int[] list = generateRandomList(5000);
        // Fill the hashmap with integers from the list as keys and values.
        // For each element in the array, the "put" method of the HashMap is called with the
        // element as both the key and the value, adding that key-value pair to the HashMap.
        for (int i = 0; i < list.length; i++) {
            hashmap.put(list[i], list[i]);
        }
        // Test the hashmap lookup and the sort-plus-binary-search with a value of 40.
        // The former works on unsorted data with (expected) constant time complexity;
        // the latter requires sorted data and has logarithmic time complexity once sorted.
        int value = 40;
        long lookUpTime = measureLookUpTime(hashmap, value);
        System.out.println("Time to search for value 40 in hashmap: " + lookUpTime + " nanoseconds");
        long binarySearchTime = measureBinarySearchTime(list, value);
        System.out.println("Time to sort and binary search for value 40: " + binarySearchTime + " nanoseconds");
    }

    // Helper method to generate a random list of given size
    private static int[] generateRandomList(int size) {
        int[] list = new int[size];
        Random random = new Random();
        for (int i = 0; i < size; i++) {
            list[i] = random.nextInt(size);
        }
        return list;
    }

    // Helper method to measure the time it takes to look up a value in the hashmap
    private static long measureLookUpTime(HashMap<Integer, Integer> hashmap, int value) {
        long start = System.nanoTime();
        hashmap.containsKey(value);
        long end = System.nanoTime();
        return (end - start);
    }

    // Helper method to measure the time it takes to sort the list and then binary search it.
    // Note that the measured time includes the quick sort, not just the binary search itself.
    private static long measureBinarySearchTime(int[] list, int value) {
        long start = System.nanoTime();
        // Sort the list; after quickSort completes, the list is in ascending order
        quickSort(list, 0, list.length - 1);
        // Perform binary search
        int low = 0;
        int high = list.length - 1;
        while (low <= high) {
            int mid = (low + high) / 2;
            if (list[mid] == value) {
                break;
            } else if (list[mid] < value) {
                low = mid + 1;
            } else {
                high = mid - 1;
            }
        }
        long end = System.nanoTime();
        return (end - start);
    }

    // Helper method to perform quick sort on the list
    private static void quickSort(int[] list, int low, int high) {
        if (low < high) {
            // partition divides the sub-array into two sub-arrays around a pivot
            int partIndex = partition(list, low, high);
            quickSort(list, low, partIndex - 1);
            quickSort(list, partIndex + 1, high);
        }
    }

    // Helper method to partition the list for quick sort.
    // The pivot is the last element of the sub-array indicated by "high".
    private static int partition(int[] list, int low, int high) {
        int pivot = list[high];
        int i = low - 1;
        // iterate through the sub-array from "low" to "high - 1" with index "j"
        for (int j = low; j <= high - 1; j++) {
            if (list[j] < pivot) {
                // move elements smaller than the pivot into the left partition
                i++;
                swap(list, i, j);
            }
        }
        // place the pivot between the two partitions; it is now in its final sorted position
        swap(list, i + 1, high);
        return i + 1;
    }

    // Helper method to swap two elements in the list
    private static void swap(int[] list, int i, int j) {
        int temp = list[i];
        list[i] = list[j];
        list[j] = temp;
    }
}
Hash.main(null);
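One caveat about the benchmark above: the quick sort runs inside the timed region, and each search is timed exactly once, so timer resolution and JIT warm-up can dominate the numbers. A minimal alternative sketch, assuming the same 5000-element random array (the class name FairSearchTiming and the repetition count are illustrative choices), sorts once outside the timed region with java.util.Arrays.sort and averages many calls to HashMap.containsKey and Arrays.binarySearch:
import java.util.Arrays;
import java.util.HashMap;
import java.util.Random;

public class FairSearchTiming {
    public static void main(String[] args) {
        int size = 5000;
        int value = 40;
        int reps = 100000; // average over many lookups to smooth out timer noise
        Random random = new Random();

        int[] list = new int[size];
        HashMap<Integer, Integer> hashmap = new HashMap<>();
        for (int i = 0; i < size; i++) {
            list[i] = random.nextInt(size);
            hashmap.put(list[i], list[i]);
        }

        // sort once, outside the timed region, so only the lookups are measured
        Arrays.sort(list);

        boolean found = false; // accumulate results so the loops are not trivially dead code
        long start = System.nanoTime();
        for (int r = 0; r < reps; r++) {
            found |= hashmap.containsKey(value);
        }
        long hashTime = (System.nanoTime() - start) / reps;

        long checksum = 0;
        start = System.nanoTime();
        for (int r = 0; r < reps; r++) {
            checksum += Arrays.binarySearch(list, value);
        }
        long binaryTime = (System.nanoTime() - start) / reps;

        System.out.println("Average hashmap lookup: " + hashTime + " ns (found: " + found + ")");
        System.out.println("Average binary search:  " + binaryTime + " ns (checksum: " + checksum + ")");
    }
}
FairSearchTiming.main(null);
The point of the comparison is the growth rate, expected O(1) for the hashmap versus O(log n) for binary search, rather than the absolute nanosecond values, which depend heavily on hardware and JVM warm-up.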