Sorting means arranging items in ascending or descending order by some key.
Sorting algorithms can also be classified in terms of stability. Stability means that if there are multiple records with the same key value in the input data, the relative positions of these records do not change after sorting.
A sorting algorithm is said to be adaptive if it takes advantage of existing order in the input.
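To make stability concrete, here is a minimal sketch (the class name and sample values are illustrative) that sorts strings by their first character only, using Java's `Arrays.sort` for objects, which is stable:

```java
import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    public static void main(String[] args) {
        // Records with duplicate keys; the letter marks the original position
        String[] records = { "4a", "4b", "3c", "4d" };
        // Sort by the numeric key only (the first character)
        Arrays.sort(records, Comparator.comparing(s -> s.charAt(0)));
        // A stable sort keeps 4a, 4b, 4d in their original relative order
        System.out.println(Arrays.toString(records)); // [3c, 4a, 4b, 4d]
    }
}
```

An unstable sort would be free to emit the three "4" records in any order; the stable sort must keep `4a`, `4b`, `4d` as they appeared in the input.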
Selection sort is the easiest sorting method to understand. It repeatedly selects the smallest element in the unsorted part of the list and moves it to the front.
```java
public class SelectionSort {
    public static void selectionSort(int[] arr) {
        int n = arr.length;
        // Outer loop: for each position in the array
        for (int i = 0; i < n - 1; i++) {
            int minIndex = i; // Assume the current index has the smallest value
            // Inner loop: find the actual minimum in the rest of the array
            for (int j = i + 1; j < n; j++) {
                if (arr[j] < arr[minIndex]) {
                    minIndex = j;
                }
            }
            // Swap only if needed (minIndex has changed)
            if (minIndex != i) {
                int temp = arr[i];
                arr[i] = arr[minIndex];
                arr[minIndex] = temp;
            }
        }
    }
}
```
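A short driver shows the algorithm in action (the class name and sample data below are illustrative; the sort logic is the same as above, condensed):

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Same algorithm as above, condensed for the demo
    static void selectionSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[j] < arr[minIndex]) minIndex = j;
            }
            int temp = arr[i];
            arr[i] = arr[minIndex];
            arr[minIndex] = temp;
        }
    }

    public static void main(String[] args) {
        int[] data = { 29, 10, 14, 37, 13 };
        selectionSort(data);
        System.out.println(Arrays.toString(data)); // [10, 13, 14, 29, 37]
    }
}
```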
| Case | Time Complexity |
|---|---|
| Best | O(n²) |
| Average | O(n²) |
| Worst | O(n²) |
❌ Not stable (by default)
Selection sort swaps non-adjacent elements, so the relative order of equal elements might change.
Example:
arr = { (4, 'a'), (4, 'b'), (3, 'c') }
Sorting based on the first element:
{ (3, 'c'), (4, 'b'), (4, 'a') } // unstable: (4, 'a') and (4, 'b') swapped
| Property | Value |
|---|---|
| Time Complexity | O(n²) |
| Space Complexity | O(1) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | Very small arrays, or when swaps/writes are expensive (it performs at most n − 1 swaps) |
Insertion Sort works the same way you sort playing cards in your hand:
It builds the final sorted array one element at a time, inserting elements into their correct position in the sorted part of the array.
```java
public class InsertionSort {
    public static void insertionSort(int[] arr) {
        int n = arr.length;
        // Start from the second element
        for (int i = 1; i < n; i++) {
            int key = arr[i]; // current element to insert
            int j = i - 1;
            // Move elements of arr[0..i-1] that are greater than key
            // to one position ahead of their current position
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j]; // shift element to the right
                j--;
            }
            arr[j + 1] = key; // insert the key in correct position
        }
    }
}
```
| Case | Comparisons / Time |
|---|---|
| Best (sorted) | O(n) |
| Average | O(n²) |
| Worst (reverse) | O(n²) |
Space Complexity: O(1), using only a few extra variables (key, j).
✅ Stable
Because it does not swap equal elements; it inserts them after existing equal elements, preserving their relative order.
✅ Adaptive
It runs faster when the input is already or nearly sorted: best case O(n).
| Property | Value |
|---|---|
| Time Complexity | Best: O(n), Avg: O(n²), Worst: O(n²) |
| Space Complexity | O(1) |
| Stable | ✅ Yes |
| In-place | ✅ Yes |
| Adaptive | ✅ Yes |
| Best Use Case | Small or nearly sorted datasets |
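The adaptivity is easy to observe by counting shifts. A minimal sketch (the shift counter is an illustration added for measurement, not part of the standard algorithm):

```java
public class InsertionAdaptivityDemo {
    // Insertion sort instrumented to count element shifts
    static int insertionSortCountingShifts(int[] arr) {
        int shifts = 0;
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j]; // shift right
                j--;
                shifts++;
            }
            arr[j + 1] = key;
        }
        return shifts;
    }

    public static void main(String[] args) {
        int[] sorted   = { 1, 2, 3, 4, 5 };
        int[] reversed = { 5, 4, 3, 2, 1 };
        System.out.println(insertionSortCountingShifts(sorted));   // 0 shifts: the O(n) best case
        System.out.println(insertionSortCountingShifts(reversed)); // 10 shifts: the O(n²) worst case
    }
}
```

On sorted input the inner while loop never runs, so the whole sort is a single O(n) scan; on reversed input every element is shifted past all earlier ones.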
Bubble Sort repeatedly compares adjacent elements and swaps them if they are in the wrong order, effectively "bubbling" the largest (or smallest) element to its correct position in each pass.
It works like how bubbles rise to the top in water — hence the name.
Given array: [4, 1, 5, 2]

- Pass 1: [1, 4, 5, 2] → [1, 4, 5, 2] → [1, 4, 2, 5] ✅ Largest element 5 is now at the end.
- Pass 2: [1, 2, 4, 5] ✅ Next largest element 4 is now in correct position.

```java
public class BubbleSort {
    public static void bubbleSort(int[] arr) {
        int n = arr.length;
        boolean swapped;
        // Outer loop for all passes
        for (int i = 0; i < n - 1; i++) {
            swapped = false;
            // Inner loop for pairwise comparison
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    // Swap if in wrong order
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }
            // Optimization: if no swaps in this pass, array is sorted
            if (!swapped) break;
        }
    }
}
```
| Case | Comparisons / Time |
|---|---|
| Best (sorted) | O(n) – if optimized with swap check |
| Average | O(n²) |
| Worst (reverse) | O(n²) |
Space Complexity: O(1) → in-place sorting
✅ Stable
It only swaps if needed, and only adjacent elements, so equal elements retain their relative order.
✅ Adaptive (if optimized with a swapped flag)
If no swaps are made in a pass, it stops early → O(n) best case
| Property | Value |
|---|---|
| Time Complexity | Best: O(n), Avg/Worst: O(n²) |
| Space Complexity | O(1) |
| Stable | ✅ Yes |
| In-place | ✅ Yes |
| Adaptive | ✅ Yes (with early stop optimization) |
| Best Use Case | Educational use, or small arrays that are nearly sorted |
Shell Sort is an optimization over Insertion Sort.
It improves efficiency by allowing the exchange of far-apart elements.
Named after its inventor Donald Shell, the idea is to first sort elements that are far apart, then progressively reduce the gap until it reaches 1, at which point it becomes a plain insertion sort.
Given array: [8, 5, 3, 7, 6, 2, 4]
Assume gap sequence: gap = n/2 → n/4 → ... → 1
Let’s say:

- gap = 3: compare and sort elements that are 3 indices apart → [4, 5, 2, 7, 6, 3, 8]
- gap = 1: now do a normal insertion sort → finished.

You can think of it as gapped insertion sort.
```java
public class ShellSort {
    public static void shellSort(int[] arr) {
        int n = arr.length;
        // Start with a big gap, then reduce it
        for (int gap = n / 2; gap > 0; gap /= 2) {
            // Do a gapped insertion sort for this gap size
            for (int i = gap; i < n; i++) {
                int temp = arr[i];
                int j;
                // Shift earlier gap-sorted elements up until the correct location is found
                for (j = i; j >= gap && arr[j - gap] > temp; j -= gap) {
                    arr[j] = arr[j - gap];
                }
                arr[j] = temp;
            }
        }
    }
}
```
| Gap Sequence | Worst Case Time Complexity |
|---|---|
| Shell's original | O(n²) |
| Hibbard | O(n^(3/2)) |
| Sedgewick | O(n^(4/3)) |
| Tokuda | ~O(n^1.25) |
| Optimized (Ciura) | Between O(n) and O(n²) |
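The gap sequence can simply be swapped out. A sketch using Ciura's empirically derived gaps (the constants below are the published Ciura sequence; real implementations often extend it by a factor of about 2.25 for larger arrays):

```java
import java.util.Arrays;

public class CiuraShellSort {
    // Ciura's empirically derived gap sequence
    static final int[] CIURA_GAPS = { 701, 301, 132, 57, 23, 10, 4, 1 };

    static void shellSort(int[] arr) {
        for (int gap : CIURA_GAPS) {
            if (gap >= arr.length) continue; // skip gaps larger than the array
            // Gapped insertion sort, same as the n/2 version above
            for (int i = gap; i < arr.length; i++) {
                int temp = arr[i];
                int j = i;
                for (; j >= gap && arr[j - gap] > temp; j -= gap) {
                    arr[j] = arr[j - gap];
                }
                arr[j] = temp;
            }
        }
    }

    public static void main(String[] args) {
        int[] data = { 8, 5, 3, 7, 6, 2, 4 };
        shellSort(data);
        System.out.println(Arrays.toString(data)); // [2, 3, 4, 5, 6, 7, 8]
    }
}
```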
❌ Not Stable
Because elements are moved across large gaps, the relative order of equal elements may not be preserved.
❌ Not Adaptive
Shell Sort doesn’t take advantage of partially sorted arrays like insertion sort does; even sorted arrays may be fully processed.
| Property | Value |
|---|---|
| Time Complexity | Best: O(n log n), Avg: ~O(n^1.25), Worst: O(n²) |
| Space Complexity | O(1) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | Medium-size arrays where fast in-place sort is needed |
Merge Sort is a classic Divide and Conquer algorithm.
Core Idea: divide the array into halves, recursively sort each half, then merge the two sorted halves back together.
Given: [6, 3, 8, 5, 2, 7, 4, 1]
- Split [6, 3, 8, 5, 2, 7, 4, 1] into [6, 3, 8, 5] and [2, 7, 4, 1]
- [6] and [3] → [3, 6]; [8] and [5] → [5, 8]
- Merge [3, 6] and [5, 8] → [3, 5, 6, 8]
- The right half sorts the same way → [1, 2, 4, 7]
- Merge [3, 5, 6, 8] and [1, 2, 4, 7] → fully sorted array

```java
public class MergeSort {
    public static void mergeSort(int[] arr, int left, int right) {
        if (left < right) {
            int mid = left + (right - left) / 2;
            // Sort first and second halves
            mergeSort(arr, left, mid);
            mergeSort(arr, mid + 1, right);
            // Merge the sorted halves
            merge(arr, left, mid, right);
        }
    }

    public static void merge(int[] arr, int left, int mid, int right) {
        // Sizes of the two subarrays
        int n1 = mid - left + 1;
        int n2 = right - mid;
        // Temporary arrays
        int[] L = new int[n1];
        int[] R = new int[n2];
        // Copy data into temporary arrays
        for (int i = 0; i < n1; i++)
            L[i] = arr[left + i];
        for (int j = 0; j < n2; j++)
            R[j] = arr[mid + 1 + j];
        // Merge the two arrays
        int i = 0, j = 0, k = left;
        while (i < n1 && j < n2) {
            if (L[i] <= R[j]) {
                arr[k++] = L[i++];
            } else {
                arr[k++] = R[j++];
            }
        }
        // Copy remaining elements
        while (i < n1) {
            arr[k++] = L[i++];
        }
        while (j < n2) {
            arr[k++] = R[j++];
        }
    }
}
```
| Case | Time Complexity |
|---|---|
| Best | O(n log n) |
| Average | O(n log n) |
| Worst | O(n log n) |
The array is divided across log n levels, with O(n) work to merge at each level → total: O(n log n).
✅ Stable
Equal elements maintain their original order during the merge (when L[i] <= R[j], we pick L[i] first).
❌ Not Adaptive
Doesn’t take advantage of partial order — it always does the full divide-merge process.
| Property | Value |
|---|---|
| Time Complexity | Best/Average/Worst: O(n log n) |
| Space Complexity | O(n) |
| Stable | ✅ Yes |
| In-place | ❌ No (uses auxiliary space for merging) |
| Adaptive | ❌ No |
| Best Use Case | Large datasets where guaranteed O(n log n) is needed |
Quick Sort is a Divide and Conquer algorithm just like Merge Sort — but it sorts in-place and is usually faster in practice.
Core Idea: choose a pivot, partition the array so that smaller elements end up left of the pivot and larger elements right of it, then recursively sort both sides.
Sorting happens during the partitioning step — no merge phase required.
With balanced partitions, the recursion depth is about log n.
Given: [9, 3, 7, 5, 6, 4, 8, 2]
Assume pivot = 4
- Partition around the pivot → [3, 2, 4, 5, 6, 7, 8, 9] (pivot in correct position)
- Left part [3, 2] → sorted to [2, 3]
- Right part [5, 6, 7, 8, 9] → sorted recursively
- Result: [2, 3, 4, 5, 6, 7, 8, 9]

```java
public class QuickSort {
    public static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            // pi is partition index
            int pi = partition(arr, low, high);
            // Recursively sort elements before and after partition
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Lomuto partition scheme (pivot = last element)
    public static int partition(int[] arr, int low, int high) {
        int pivot = arr[high]; // pivot
        int i = low - 1; // index of smaller element
        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                i++;
                // swap arr[i] and arr[j]
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }
        // swap arr[i+1] and arr[high] (pivot)
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;
        return i + 1;
    }
}
```
| Case | Time Complexity |
|---|---|
| Best | O(n log n) |
| Average | O(n log n) |
| Worst | O(n²) (when partition is highly unbalanced, e.g., already sorted array with bad pivot) |
With randomized pivot or median-of-three, worst case is very rare.
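One common way to implement the randomized pivot is to pick a random index and swap it to the end before running the ordinary Lomuto partition. A sketch (class name and sample data are illustrative):

```java
import java.util.Arrays;
import java.util.Random;

public class RandomizedQuickSort {
    static final Random RNG = new Random();

    static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            // Pick a random index in [low, high] and move it to the end,
            // then reuse the ordinary Lomuto partition
            int r = low + RNG.nextInt(high - low + 1);
            int t = arr[r]; arr[r] = arr[high]; arr[high] = t;

            int pi = partition(arr, low, high);
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Lomuto partition, identical to the version above
    static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                i++;
                int temp = arr[i]; arr[i] = arr[j]; arr[j] = temp;
            }
        }
        int temp = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = temp;
        return i + 1;
    }

    public static void main(String[] args) {
        int[] data = { 5, 4, 3, 2, 1 }; // sorted/reversed input is no longer a bad case
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 5]
    }
}
```

With a random pivot, no fixed input can reliably trigger the O(n²) worst case.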
❌ Not Stable
Equal elements can be reordered during partition swaps.
❌ Not Adaptive
Always partitions the array — doesn't optimize for nearly sorted data.
| Property | Value |
|---|---|
| Time Complexity | Best/Average: O(n log n), Worst: O(n²) |
| Space Complexity | O(log n) (for recursion) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | Large datasets where performance and memory matter |
In practice it underlies library sorts (e.g., C++'s std::sort uses a Quick Sort based introsort).

Heap Sort is a comparison-based, in-place sorting algorithm that uses a binary heap data structure.
Core Idea: build a max heap from the array, then repeatedly swap the root (the maximum) with the last element of the heap and re-heapify the reduced heap.
A Max Heap ensures the largest element is always at the root.
Given: [4, 10, 3, 5, 1]

- Build max heap → [10, 5, 3, 4, 1]
- Swap root with last element → [1, 5, 3, 4, 10]
- Heapify the reduced heap → [5, 4, 3, 1, 10]
- Swap again → [1, 4, 3, 5, 10]
- ... repeat until sorted.
```java
public class HeapSort {
    public static void heapSort(int[] arr) {
        int n = arr.length;
        // Step 1: Build max heap (heapify entire array)
        for (int i = n / 2 - 1; i >= 0; i--) {
            heapify(arr, n, i);
        }
        // Step 2: Extract elements one by one from heap
        for (int i = n - 1; i > 0; i--) {
            // Move current root to end
            int temp = arr[0];
            arr[0] = arr[i];
            arr[i] = temp;
            // Heapify reduced heap
            heapify(arr, i, 0);
        }
    }

    // To heapify a subtree rooted at index i, size = n
    public static void heapify(int[] arr, int n, int i) {
        int largest = i; // Initialize largest as root
        int left = 2 * i + 1; // left child
        int right = 2 * i + 2; // right child
        // If left child is larger than root
        if (left < n && arr[left] > arr[largest])
            largest = left;
        // If right child is larger than largest so far
        if (right < n && arr[right] > arr[largest])
            largest = right;
        // If largest is not root
        if (largest != i) {
            int swap = arr[i];
            arr[i] = arr[largest];
            arr[largest] = swap;
            // Recursively heapify the affected subtree
            heapify(arr, n, largest);
        }
    }
}
```
| Case | Time Complexity |
|---|---|
| Best | O(n log n) |
| Average | O(n log n) |
| Worst | O(n log n) |
So heap sort always runs in O(n log n) time — very predictable!
❌ Not Stable
Swapping elements across the heap may reorder equal values. To make it stable, you'd need to modify the heap to store original indices — increasing complexity.
❌ Not Adaptive
Doesn't benefit from partially sorted input — it always does full heap operations.
| Property | Value |
|---|---|
| Time Complexity | Best/Average/Worst: O(n log n) |
| Space Complexity | O(1) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | When in-place and worst-case guaranteed performance is needed |
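For comparison, the same extract-from-heap idea can be expressed with Java's built-in java.util.PriorityQueue (a binary min heap), though this version is not in-place and uses O(n) extra space:

```java
import java.util.Arrays;
import java.util.PriorityQueue;

public class PriorityQueueSortDemo {
    static void heapSortWithPQ(int[] arr) {
        // PriorityQueue is a min heap, so repeated poll() yields ascending order
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int x : arr) heap.add(x);    // O(log n) per insert
        for (int i = 0; i < arr.length; i++) {
            arr[i] = heap.poll();         // O(log n) per extraction
        }
    }

    public static void main(String[] args) {
        int[] data = { 4, 10, 3, 5, 1 };
        heapSortWithPQ(data);
        System.out.println(Arrays.toString(data)); // [1, 3, 4, 5, 10]
    }
}
```

The manual heapify version above is what you want when the O(1) space guarantee matters; the PriorityQueue version trades that away for brevity.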
Radix Sort is a non-comparative sorting algorithm that sorts integers (or strings) by processing individual digits one at a time, either least significant digit first (LSD) or most significant digit first (MSD).
It leverages a stable sorting algorithm (typically Counting Sort) at each digit level.
Since the sorting at each digit is stable, the relative order of previous digits is preserved
Given: [170, 45, 75, 90, 802, 24, 2, 66]
- Sort by 1s digit → [170, 90, 802, 2, 24, 45, 75, 66]
- Sort by 10s digit → [802, 2, 24, 45, 66, 170, 75, 90]
- Sort by 100s digit → [2, 24, 45, 66, 75, 90, 170, 802] ✅ Sorted!

```java
public class RadixSort {
    // Main radix sort function
    public static void radixSort(int[] arr) {
        int max = getMax(arr);
        // Apply counting sort for every digit (exp = 1, 10, 100, ...)
        for (int exp = 1; max / exp > 0; exp *= 10) {
            countingSort(arr, exp);
        }
    }

    // Helper function to get maximum value
    private static int getMax(int[] arr) {
        int max = arr[0];
        for (int num : arr) {
            if (num > max) max = num;
        }
        return max;
    }

    // Counting sort adapted to work on a specific digit
    private static void countingSort(int[] arr, int exp) {
        int n = arr.length;
        int[] output = new int[n]; // output array
        int[] count = new int[10]; // digits 0–9
        // Count occurrences of digits
        for (int i = 0; i < n; i++) {
            int digit = (arr[i] / exp) % 10;
            count[digit]++;
        }
        // Convert count to cumulative count
        for (int i = 1; i < 10; i++) {
            count[i] += count[i - 1];
        }
        // Build output array (stable sort: right-to-left)
        for (int i = n - 1; i >= 0; i--) {
            int digit = (arr[i] / exp) % 10;
            output[count[digit] - 1] = arr[i];
            count[digit]--;
        }
        // Copy output back to arr
        System.arraycopy(output, 0, arr, 0, n);
    }
}
```
Let:

- n = number of elements
- d = number of digits in the largest number
- k = base of the digits (usually 10)

| Case | Time Complexity |
|---|---|
| Best | O(n × d) |
| Average | O(n × d) |
| Worst | O(n × d) |
→ Linear for fixed d (e.g., 32-bit integers)
Space Complexity: O(n + k) for the count[] and output[] arrays.
✅ Stable
Counting sort (used internally) is stable, preserving the order of equal elements.
❌ Not Adaptive
Always processes all digits regardless of the input's sortedness.
| Property | Value |
|---|---|
| Time Complexity | O(n × d), linear for fixed-width integers |
| Space Complexity | O(n + k) |
| Stable | ✅ Yes |
| In-place | ❌ No (uses extra arrays) |
| Adaptive | ❌ No |
| Best Use Case | Large arrays of fixed-width integers (e.g., 32-bit) |
Tim Sort is a hybrid stable sorting algorithm, combining the strengths of Insertion Sort (fast on small or nearly sorted runs) and Merge Sort (guaranteed O(n log n)).
It was invented by Tim Peters for Python and is also used in:
- Python's list.sort() and sorted()
- Java's Arrays.sort() for Object[]
It takes advantage of existing order in data (adaptive) and combines it with efficient merge techniques.
Java doesn’t expose its TimSort implementation in a simple way (it's in java.util.TimSort, deep in the internals), but here's a simplified version to demonstrate how Tim Sort works:
```java
class TimSort {
    static final int RUN = 32;

    public static void insertionSort(int[] arr, int left, int right) {
        for (int i = left + 1; i <= right; i++) {
            int temp = arr[i];
            int j = i - 1;
            while (j >= left && arr[j] > temp) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = temp;
        }
    }

    public static void merge(int[] arr, int l, int m, int r) {
        int len1 = m - l + 1, len2 = r - m;
        int[] left = new int[len1];
        int[] right = new int[len2];
        System.arraycopy(arr, l, left, 0, len1);
        System.arraycopy(arr, m + 1, right, 0, len2);
        int i = 0, j = 0, k = l;
        while (i < len1 && j < len2) {
            arr[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < len1) arr[k++] = left[i++];
        while (j < len2) arr[k++] = right[j++];
    }

    public static void timSort(int[] arr) {
        int n = arr.length;
        // Step 1: Sort small runs using insertion sort
        for (int i = 0; i < n; i += RUN) {
            insertionSort(arr, i, Math.min((i + RUN - 1), (n - 1)));
        }
        // Step 2: Merge runs using merge sort logic
        for (int size = RUN; size < n; size *= 2) {
            for (int left = 0; left < n; left += 2 * size) {
                int mid = left + size - 1;
                int right = Math.min((left + 2 * size - 1), (n - 1));
                if (mid < right) {
                    merge(arr, left, mid, right);
                }
            }
        }
    }
}
```
| Case | Time Complexity |
|---|---|
| Best | O(n) |
| Average | O(n log n) |
| Worst | O(n log n) |
✅ Stable
It maintains the relative order of equal elements, which is essential for sorting objects with secondary keys.
✅ Adaptive
Leverages existing sorted segments ("runs") in the input to reduce work.
| Property | Value |
|---|---|
| Time Complexity | Best: O(n), Avg/Worst: O(n log n) |
| Space Complexity | O(n) |
| Stable | ✅ Yes |
| In-place | ❌ No (uses extra space for merging) |
| Adaptive | ✅ Yes |
| Best Use Case | Real-world sorting of objects with possible partial order |
Tim Sort is used in Python's list.sort() and sorted(), and in Java's Arrays.sort() for Object[] (Java sorts primitive int[] with Dual-Pivot Quick Sort instead).

| Algorithm | Best Case | Average Case | Worst Case | Space | Stable | In-Place | Adaptive | Real-World Use Case |
|---|---|---|---|---|---|---|---|---|
| Selection | O(n²) | O(n²) | O(n²) | O(1) | ❌ | ✅ | ❌ | Educational, very small arrays |
| Insertion | O(n) | O(n²) | O(n²) | O(1) | ✅ | ✅ | ✅ | Small or nearly sorted arrays |
| Bubble | O(n) | O(n²) | O(n²) | O(1) | ✅ | ✅ | ✅ (with flag) | Educational, nearly sorted arrays |
| Shell | O(n log n) or better | O(n^1.25–n^1.5) | O(n²) | O(1) | ❌ | ✅ | ❌ | Medium-sized arrays, space-constrained cases |
| Merge | O(n log n) | O(n log n) | O(n log n) | O(n) | ✅ | ❌ | ❌ | Linked lists, stable sort, large data |
| Quick | O(n log n) | O(n log n) | O(n²) | O(log n) | ❌ | ✅ | ❌ | Fastest in practice, general-purpose |
| Heap | O(n log n) | O(n log n) | O(n log n) | O(1) | ❌ | ✅ | ❌ | Real-time systems (predictable time) |
| Radix | O(nk) | O(nk) | O(nk) | O(n + k) | ✅ | ❌ | ❌ | Integer sorting, fixed-width keys |
| Tim Sort | O(n) | O(n log n) | O(n log n) | O(n) | ✅ | ❌ | ✅ | Python/Java built-in sorting |
Where:

- n = number of elements
- k = number of digits (in radix sort)

When to Use Which:

- General-purpose, fastest on average → Quick Sort (if stability not needed)
- Stability required → Merge Sort or Tim Sort
- Small or nearly sorted input → Insertion Sort or Tim Sort (adaptive algorithms)
- In-place with predictable worst case → Heap Sort, or Quick Sort (small stack)
- Fixed-width integers → Radix Sort
- Learning the basics → Bubble, Selection, Insertion
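In everyday Java code, the built-ins already cover most of these cases: Arrays.sort on an int[] uses Dual-Pivot Quick Sort, while Arrays.sort on an Object[] uses Tim Sort. A short demo (the sample values are illustrative):

```java
import java.util.Arrays;

public class BuiltInSortDemo {
    public static void main(String[] args) {
        // Primitives: Dual-Pivot Quick Sort (stability is irrelevant for primitives)
        int[] nums = { 9, 3, 7, 5, 6, 4, 8, 2 };
        Arrays.sort(nums);
        System.out.println(Arrays.toString(nums)); // [2, 3, 4, 5, 6, 7, 8, 9]

        // Objects: Tim Sort (stable, adaptive)
        String[] words = { "pear", "apple", "fig" };
        Arrays.sort(words);
        System.out.println(Arrays.toString(words)); // [apple, fig, pear]
    }
}
```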