12. Sorting

Jerry · August 1, 2025

12.1 Sorting

Sorting means arranging items in ascending or descending order by size.
Sorting algorithms can also be classified in terms of stability. A sorting algorithm is stable if, when the input contains multiple records with the same key value, the relative order of those records is unchanged after sorting.
A sorting algorithm is said to be adaptive if it takes advantage of existing order in the input.

12.2 Selection sort

Principles

Selection sort is the easiest sorting method to understand: it repeatedly selects the smallest element in the unsorted portion of the list and moves it to the front.

Java Code

public class SelectionSort {
    public static void selectionSort(int[] arr) {
        int n = arr.length;

        // Outer loop: for each position in the array
        for (int i = 0; i < n - 1; i++) {
            int minIndex = i; // Assume the current index has the smallest value

            // Inner loop: find the actual minimum in the rest of the array
            for (int j = i + 1; j < n; j++) {
                if (arr[j] < arr[minIndex]) {
                    minIndex = j;
                }
            }

            // Swap only if needed (minIndex has changed)
            if (minIndex != i) {
                int temp = arr[i];
                arr[i] = arr[minIndex];
                arr[minIndex] = temp;
            }
        }
    }
}

Time Complexity

| Case | Time Complexity |
| --- | --- |
| Best | O(n²) |
| Average | O(n²) |
| Worst | O(n²) |

Space Complexity

  • O(1) – In-place sorting algorithm.
  • Only a few extra variables for index tracking.

Stability

Not stable by default.

Selection sort swaps non-adjacent elements, so the relative order of equal elements can change.

Example:

arr = { (4, 'a'), (4, 'b'), (3, 'c') }

Sorting based on the first element:

  • 3 comes first, so it is swapped with the first 4.
  • Now the array becomes:
{ (3, 'c'), (4, 'b'), (4, 'a') }  // unstable: (4, 'a') and (4, 'b') swapped
  • To make it stable, avoid swapping, and instead shift elements (like insertion sort).
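The swap above can be reproduced in code. The following is a minimal sketch (the class and method names are illustrative, not from the original text), sorting string records whose leading digit is the key and whose letter tags the original order:

```java
import java.util.Arrays;

public class SelectionInstabilityDemo {
    // Selection sort on entries like "4a", keyed only by the leading digit
    public static void selectionSortByKey(String[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j].charAt(0) < a[min].charAt(0)) min = j;
            }
            if (min != i) { String t = a[i]; a[i] = a[min]; a[min] = t; }
        }
    }

    public static void main(String[] args) {
        String[] records = {"4a", "4b", "3c"};
        selectionSortByKey(records);
        // The long-distance swap moved "4a" behind "4b"
        System.out.println(Arrays.toString(records)); // [3c, 4b, 4a]
    }
}
```

The single swap of "3c" with "4a" is exactly what destroys the order of the two equal keys.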

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | O(n²) |
| Space Complexity | O(1) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | Small datasets, or when swaps are expensive (it performs at most n−1 swaps) |

12.3 Insertion sort

Principles

Insertion Sort works the same way you sort playing cards in your hand:

  1. Start with the second element, compare it to the first.
  2. Insert it into the correct position in the already sorted left portion.
  3. Repeat this for all remaining elements.

It builds the final sorted array one element at a time, inserting elements into their correct position in the sorted part of the array.

Java code

public class InsertionSort {
    public static void insertionSort(int[] arr) {
        int n = arr.length;

        // Start from the second element
        for (int i = 1; i < n; i++) {
            int key = arr[i]; // current element to insert
            int j = i - 1;

            // Move elements of arr[0..i-1] that are greater than key
            // to one position ahead of their current position
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j]; // shift element to the right
                j--;
            }

            arr[j + 1] = key; // insert the key in correct position
        }
    }
}

Time Complexity

| Case | Comparisons / Time |
| --- | --- |
| Best (sorted) | O(n) |
| Average | O(n²) |
| Worst (reverse) | O(n²) |
  • Best case occurs when the array is already sorted → Only 1 comparison per element.
  • Worst case occurs when array is sorted in descending order → Maximum shifts per element.

Space Complexity

  • O(1) → It sorts in-place using only a few extra variables (like key, j).

Stability

✅ Stable

Because it does not swap equal elements; it inserts them after existing equal elements, preserving their relative order.

Adaptive

✅ Yes

It runs faster when the input is already or nearly sorted.
Best case: O(n)

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | Best: O(n), Avg: O(n²), Worst: O(n²) |
| Space Complexity | O(1) |
| Stable | ✅ Yes |
| In-place | ✅ Yes |
| Adaptive | ✅ Yes |
| Best Use Case | Small or nearly sorted datasets |

12.4 Bubble sort

Principles

Bubble Sort repeatedly compares adjacent elements and swaps them if they are in the wrong order, effectively "bubbling" the largest (or smallest) element to its correct position in each pass.
It works like how bubbles rise to the top in water — hence the name.

How It Works

Given array: [4, 1, 5, 2]

  1. First Pass:
    • Compare 4 and 1 → swap → [1, 4, 5, 2]
    • Compare 4 and 5 → no swap → [1, 4, 5, 2]
    • Compare 5 and 2 → swap → [1, 4, 2, 5] ✅ Largest element 5 is now at the end.
  2. Second Pass:
    • Compare 1 and 4 → no swap
    • Compare 4 and 2 → swap → [1, 2, 4, 5] ✅ Next largest element 4 is now in correct position.
  3. Third Pass:
    • Compare 1 and 2 → no swap
      ✅ No swaps → array is already sorted → we stop early if optimized with a swap flag.

Java code

public class BubbleSort {
    public static void bubbleSort(int[] arr) {
        int n = arr.length;
        boolean swapped;

        // Outer loop for all passes
        for (int i = 0; i < n - 1; i++) {
            swapped = false;

            // Inner loop for pairwise comparison
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    // Swap if in wrong order
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }

            // Optimization: if no swaps in this pass, array is sorted
            if (!swapped) break;
        }
    }
}

Time Complexity

| Case | Comparisons / Time |
| --- | --- |
| Best (sorted) | O(n) – if optimized with swap check |
| Average | O(n²) |
| Worst (reverse) | O(n²) |
  • Performs up to n−1 passes
  • Each pass compares adjacent elements → about n(n−1)/2 comparisons

Space Complexity

O(1) → In-place sorting

Stability

✅ Stable

It only swaps if needed, and only adjacent elements, so equal elements retain their relative order.

Adaptive

✅ Yes (if optimized with a swapped flag)
If no swaps are made in a pass, it stops early → O(n) best case
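The early-stop behavior can be made visible by counting passes. This is a small sketch (class and method names are illustrative): on already-sorted input a single pass suffices, while reverse-sorted input needs the full n−1 passes.

```java
public class BubblePassCount {
    // Bubble sort with the swapped flag; returns how many passes were performed
    public static int sortCountingPasses(int[] a) {
        int passes = 0;
        for (int i = 0; i < a.length - 1; i++) {
            boolean swapped = false;
            passes++;
            for (int j = 0; j < a.length - i - 1; j++) {
                if (a[j] > a[j + 1]) {
                    int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                    swapped = true;
                }
            }
            if (!swapped) break; // adaptive: stop once a pass makes no swaps
        }
        return passes;
    }

    public static void main(String[] args) {
        System.out.println(sortCountingPasses(new int[]{1, 2, 3, 4, 5})); // 1 pass → O(n)
        System.out.println(sortCountingPasses(new int[]{5, 4, 3, 2, 1})); // 4 passes → O(n²)
    }
}
```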

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | Best: O(n), Avg/Worst: O(n²) |
| Space Complexity | O(1) |
| Stable | ✅ Yes |
| In-place | ✅ Yes |
| Adaptive | ✅ Yes (with early stop optimization) |
| Best Use Case | Educational use, or small arrays that are nearly sorted |

12.5 Shell sort

Principles

Shell Sort is an optimization over Insertion Sort.
It improves efficiency by allowing the exchange of far-apart elements.

Named after its inventor Donald Shell, the idea is to:

  • Compare and sort elements that are far apart first,
  • Gradually reduce the gap until it becomes 1 (like insertion sort at the end).

Why It Works

  • In insertion sort, elements are moved only one step at a time.
  • In shell sort, we start with a big gap and reduce it:
    • This lets smaller elements move toward the beginning faster.
    • Final pass with gap = 1 finishes the sorting with minimal shifts.

How It Works

Given array: [8, 5, 3, 7, 6, 2, 4]
Assume gap sequence: gap = n/2 → n/4 → ... → 1

With n = 7, the first gap is n/2 = 3:

  1. gap = 3: Do a gapped insertion sort on each subsequence of elements 3 indices apart
    • Indices {0, 3, 6}: [8, 7, 4] → [4, 7, 8]; indices {1, 4}: [5, 6] (already in order); indices {2, 5}: [3, 2] → [2, 3]
    • Result: [4, 5, 2, 7, 6, 3, 8]
  2. gap = 1: Now do a normal insertion sort → finish sorting.

You can think of it as gapped insertion sort.
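A single gapped pass can be sketched as a standalone helper (class and method names are illustrative) to see what one gap level does to the example array before the final gap = 1 pass finishes the job:

```java
import java.util.Arrays;

public class GappedPassDemo {
    // One insertion-sort pass over elements that are `gap` indices apart
    public static void gappedPass(int[] a, int gap) {
        for (int i = gap; i < a.length; i++) {
            int temp = a[i];
            int j = i;
            while (j >= gap && a[j - gap] > temp) {
                a[j] = a[j - gap]; // shift within the gapped subsequence
                j -= gap;
            }
            a[j] = temp;
        }
    }

    public static void main(String[] args) {
        int[] a = {8, 5, 3, 7, 6, 2, 4};
        gappedPass(a, 3);
        System.out.println(Arrays.toString(a)); // [4, 5, 2, 7, 6, 3, 8]
        gappedPass(a, 1); // gap = 1 is plain insertion sort
        System.out.println(Arrays.toString(a)); // [2, 3, 4, 5, 6, 7, 8]
    }
}
```

Note how the gap = 3 pass already pulls small elements like 4 toward the front, which is why the final insertion-sort pass has little left to do.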

Java Code

public class ShellSort {
    public static void shellSort(int[] arr) {
        int n = arr.length;

        // Start with a big gap, then reduce it
        for (int gap = n / 2; gap > 0; gap /= 2) {
            // Do a gapped insertion sort for this gap size
            for (int i = gap; i < n; i++) {
                int temp = arr[i];
                int j;

                // Shift earlier gap-sorted elements up until the correct location is found
                for (j = i; j >= gap && arr[j - gap] > temp; j -= gap) {
                    arr[j] = arr[j - gap];
                }

                arr[j] = temp;
            }
        }
    }
}

Time Complexity

| Gap Sequence | Worst-Case Time Complexity |
| --- | --- |
| Shell's original | O(n²) |
| Hibbard | O(n^(3/2)) |
| Sedgewick | O(n^(4/3)) |
| Tokuda | ~O(n^1.25) |
| Ciura (empirical) | between O(n) and O(n²) |
  • Best Case: O(n log n) (with good gaps like Ciura’s)
  • Average Case: O(n^1.25) to O(n^1.5)
  • Worst Case: O(n²) (with Shell’s original gap sequence)

Space Complexity

  • O(1) → In-place sorting algorithm

Stability

  • Not Stable

Because elements can be moved far apart, the relative order of equal elements may not be preserved.

Adaptive

  • Not Adaptive

Shell Sort doesn’t take advantage of partially sorted arrays like insertion sort does.
Even sorted arrays may be fully processed.

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | Best: O(n log n), Avg: ~O(n^1.25), Worst: O(n²) |
| Space Complexity | O(1) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | Medium-size arrays where a fast in-place sort is needed |

12.6 Merge sort

Principles

Merge Sort is a classic Divide and Conquer algorithm.

Core Idea:

  1. Divide the array into two halves recursively until each subarray has only one element.
  2. Conquer by merging two sorted subarrays into a single sorted array.

Why It Works

  • Sorting is done during the merge step, where two sorted halves are combined efficiently.
  • Since single-element arrays are already sorted, merging them builds up the solution.

How It Works

Given: [6, 3, 8, 5, 2, 7, 4, 1]

  1. Divide Phase:
    • [6, 3, 8, 5, 2, 7, 4, 1]
    • Split → [6, 3, 8, 5] and [2, 7, 4, 1]
    • Recursively split each → until all subarrays are of size 1
  2. Merge Phase:
    • Merge [6] and [3] → [3, 6]
    • Merge [8] and [5] → [5, 8]
    • Then merge [3, 6] and [5, 8] → [3, 5, 6, 8]
    • Repeat for the other half → [1, 2, 4, 7]
    • Finally, merge [3, 5, 6, 8] and [1, 2, 4, 7] → fully sorted array

Java Code

public class MergeSort {
    public static void mergeSort(int[] arr, int left, int right) {
        if (left < right) {
            int mid = left + (right - left) / 2;

            // Sort first and second halves
            mergeSort(arr, left, mid);
            mergeSort(arr, mid + 1, right);

            // Merge the sorted halves
            merge(arr, left, mid, right);
        }
    }

    public static void merge(int[] arr, int left, int mid, int right) {
        // Sizes of the two subarrays
        int n1 = mid - left + 1;
        int n2 = right - mid;

        // Temporary arrays
        int[] L = new int[n1];
        int[] R = new int[n2];

        // Copy data into temporary arrays
        for (int i = 0; i < n1; i++)
            L[i] = arr[left + i];
        for (int j = 0; j < n2; j++)
            R[j] = arr[mid + 1 + j];

        // Merge the two arrays
        int i = 0, j = 0, k = left;
        while (i < n1 && j < n2) {
            if (L[i] <= R[j]) {
                arr[k++] = L[i++];
            } else {
                arr[k++] = R[j++];
            }
        }

        // Copy remaining elements
        while (i < n1) {
            arr[k++] = L[i++];
        }
        while (j < n2) {
            arr[k++] = R[j++];
        }
    }
}

Time Complexity

| Case | Time Complexity |
| --- | --- |
| Best | O(n log n) |
| Average | O(n log n) |
| Worst | O(n log n) |
  • Always divides array in half → log n levels
  • Each level requires O(n) to merge → Total: O(n log n)

Space Complexity

  • O(n) – due to temporary arrays used in merging
  • Not in-place by default

Stability

  • Stable

Equal elements maintain original order during merge (e.g., if L[i] <= R[j] we pick L[i] first)

Adaptive

  • Not Adaptive

Doesn’t take advantage of partial order — it always does full divide-merge process.

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | Best/Average/Worst: O(n log n) |
| Space Complexity | O(n) |
| Stable | ✅ Yes |
| In-place | ❌ No (uses auxiliary space for merging) |
| Adaptive | ❌ No |
| Best Use Case | Large datasets where guaranteed O(n log n) is needed |

When to Use Merge Sort:

  • When stability is important (e.g., sorting records by name then age)
  • For large input size with guaranteed performance
  • Often used in external sorting (e.g., large files on disk)

12.7 Quick sort

Principles

Quick Sort is a Divide and Conquer algorithm just like Merge Sort — but it sorts in-place and is usually faster in practice.

Core Idea:

  1. Pick a pivot element from the array.
  2. Partition the array into two parts:
    • Elements less than the pivot go to the left.
    • Elements greater than the pivot go to the right.
  3. Recursively sort the left and right subarrays.

Sorting happens during the partitioning step — no merge phase required.

Why It Works

  • It reduces the problem size at every step.
  • If partitioning is balanced, the recursion depth is log n.
  • In-place sorting with good cache performance.

How It Works

Given: [9, 3, 7, 5, 6, 4, 8, 2]
Assume the pivot value is 4 (moved to the last position first, as the Lomuto scheme below expects)

  1. Partition:
    • Elements < 4 end up left of the pivot, elements > 4 to its right; only the pivot lands in its final sorted position.
    • One possible result: [3, 2, 4, 5, 6, 9, 8, 7] (pivot at index 2)
  2. Recurse on:
    • Left: [3, 2] → sorted to [2, 3]
    • Right: [5, 6, 9, 8, 7] → sorted recursively
  3. Combine results:
    • Fully sorted array: [2, 3, 4, 5, 6, 7, 8, 9]

Java Code

public class QuickSort {
    public static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            // pi is partition index
            int pi = partition(arr, low, high);

            // Recursively sort elements before and after partition
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Lomuto partition scheme (pivot = last element)
    public static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];  // pivot
        int i = low - 1;        // index of smaller element

        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                i++;
                // swap arr[i] and arr[j]
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }

        // swap arr[i+1] and arr[high] (pivot)
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;

        return i + 1;
    }
}

Time Complexity

| Case | Time Complexity |
| --- | --- |
| Best | O(n log n) |
| Average | O(n log n) |
| Worst | O(n²) (when partitioning is highly unbalanced, e.g., an already sorted array with a bad pivot) |

With randomized pivot or median-of-three, worst case is very rare.
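A randomized-pivot variant can be sketched by swapping a random element into the pivot slot before running the same Lomuto partition; the class name and structure here are illustrative, not a reference implementation:

```java
import java.util.Random;

public class RandomizedQuickSort {
    private static final Random RNG = new Random();

    public static void sort(int[] arr, int low, int high) {
        if (low < high) {
            // Pick a random index in [low, high] and move it to the pivot slot,
            // so an already-sorted input no longer triggers the O(n²) worst case
            int r = low + RNG.nextInt(high - low + 1);
            int t = arr[r]; arr[r] = arr[high]; arr[high] = t;

            int pi = partition(arr, low, high);
            sort(arr, low, pi - 1);
            sort(arr, pi + 1, high);
        }
    }

    // Standard Lomuto partition (pivot = last element)
    private static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                i++;
                int t = arr[i]; arr[i] = arr[j]; arr[j] = t;
            }
        }
        int t = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = t;
        return i + 1;
    }
}
```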

Space Complexity

  • O(log n) auxiliary stack space for recursion on average (O(n) in the worst case of unbalanced partitions)
  • In-place (doesn't use extra array like Merge Sort)

Stability

  • Not Stable

Equal elements can be reordered during swaps

Adaptive

  • Not Adaptive

Always partitions the array — doesn't optimize for nearly sorted data

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | Best/Average: O(n log n), Worst: O(n²) |
| Space Complexity | O(log n) (for recursion) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | Large datasets where performance and memory matter |

When to Use Quick Sort:

  • When in-place sort with high performance is needed
  • Very fast on average, even better than merge sort for practical inputs
  • Popular as the basis of library implementations (e.g., C++'s std::sort, a quicksort-based introsort)

Variants:

  • Lomuto partition (simpler, more common in teaching)
  • Hoare partition (more efficient in practice)
  • Randomized Quick Sort (to avoid worst-case on sorted input)
  • 3-way Quick Sort (for arrays with many duplicate elements)
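The Hoare partition variant can be sketched as follows (class name illustrative). Note a key difference from Lomuto: the returned index is not necessarily the pivot's final position, so the recursion covers [low..p] and [p+1..high]:

```java
public class HoareQuickSort {
    public static void sort(int[] arr, int low, int high) {
        if (low < high) {
            int p = hoarePartition(arr, low, high);
            // With Hoare's scheme, the pivot may not sit at p;
            // p is only a split point, so recurse on [low..p] and [p+1..high]
            sort(arr, low, p);
            sort(arr, p + 1, high);
        }
    }

    // Hoare partition scheme (pivot = first element); two indices
    // converge from both ends, swapping out-of-place pairs
    static int hoarePartition(int[] arr, int low, int high) {
        int pivot = arr[low];
        int i = low - 1, j = high + 1;
        while (true) {
            do { i++; } while (arr[i] < pivot);
            do { j--; } while (arr[j] > pivot);
            if (i >= j) return j;
            int t = arr[i]; arr[i] = arr[j]; arr[j] = t;
        }
    }
}
```

Hoare's scheme does roughly three times fewer swaps than Lomuto's on average, which is why it is considered more efficient in practice.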

12.8 Heap sort

Principles

Heap Sort is a comparison-based, in-place sorting algorithm built on the binary heap data structure; the heapify step can be written either recursively or iteratively.

Core Idea:

  1. Build a Max Heap from the input array.
  2. Swap the maximum element (root) with the last element of the heap.
  3. Reduce the heap size by one and heapify the root.
  4. Repeat until all elements are sorted.

A Max Heap ensures the largest element is always at the root.

Why It Works

  • A heap allows efficient access to the max element (in O(1)).
  • Each removal + heapify takes O(log n).
  • Entire sort done in-place without recursion (in iterative version).

How It Works (Step-by-Step)

Given: [4, 10, 3, 5, 1]

  1. Build Max Heap:
    [10, 5, 3, 4, 1]
  2. Swap root (10) with last element → [1, 5, 3, 4, 10]
    Heapify root → [5, 4, 3, 1, 10]
  3. Swap root (5) with second last → [1, 4, 3, 5, 10]
    Heapify root → [4, 1, 3, 5, 10]

... repeat until sorted.
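The build-heap step above can be checked in isolation; this is a sketch (names illustrative) that runs only Step 1 on the example array:

```java
import java.util.Arrays;

public class BuildHeapDemo {
    // Sift the element at index i down within a heap of size n
    public static void heapify(int[] a, int n, int i) {
        int largest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest != i) {
            int t = a[i]; a[i] = a[largest]; a[largest] = t;
            heapify(a, n, largest);
        }
    }

    public static void main(String[] args) {
        int[] a = {4, 10, 3, 5, 1};
        // Build the max heap bottom-up, starting from the last internal node
        for (int i = a.length / 2 - 1; i >= 0; i--) heapify(a, a.length, i);
        System.out.println(Arrays.toString(a)); // [10, 5, 3, 4, 1]
    }
}
```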

Java Code

public class HeapSort {

    public static void heapSort(int[] arr) {
        int n = arr.length;

        // Step 1: Build max heap (heapify entire array)
        for (int i = n / 2 - 1; i >= 0; i--) {
            heapify(arr, n, i);
        }

        // Step 2: Extract elements one by one from heap
        for (int i = n - 1; i > 0; i--) {
            // Move current root to end
            int temp = arr[0];
            arr[0] = arr[i];
            arr[i] = temp;

            // Heapify reduced heap
            heapify(arr, i, 0);
        }
    }

    // To heapify a subtree rooted at index i, size = n
    public static void heapify(int[] arr, int n, int i) {
        int largest = i;       // Initialize largest as root
        int left = 2 * i + 1;  // left child
        int right = 2 * i + 2; // right child

        // If left child is larger than root
        if (left < n && arr[left] > arr[largest])
            largest = left;

        // If right child is larger than largest so far
        if (right < n && arr[right] > arr[largest])
            largest = right;

        // If largest is not root
        if (largest != i) {
            int swap = arr[i];
            arr[i] = arr[largest];
            arr[largest] = swap;

            // Recursively heapify the affected subtree
            heapify(arr, n, largest);
        }
    }
}

Time Complexity

| Case | Time Complexity |
| --- | --- |
| Best | O(n log n) |
| Average | O(n log n) |
| Worst | O(n log n) |
  • Building the heap takes O(n)
  • Each of the n removals takes O(log n)

So heap sort always runs in O(n log n) time — very predictable!

Space Complexity

  • O(1) → In-place sorting
  • Only constant extra space used

Stability

  • Not Stable

Swapping elements may reorder equal values

To make it stable, you'd need to modify the heap to store original indices — increasing complexity.

Adaptive

  • Not Adaptive

Doesn't benefit from partially sorted input — it always does full heap operations

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | Best/Average/Worst: O(n log n) |
| Space Complexity | O(1) |
| Stable | ❌ No |
| In-place | ✅ Yes |
| Adaptive | ❌ No |
| Best Use Case | When in-place sorting with a worst-case guarantee is needed |

When to Use Heap Sort:

  • When O(n log n) time is needed with no extra space
  • Suitable for embedded systems where memory is tight
  • Useful in real-time systems where worst-case time matters

Notes:

  • Unlike merge sort (which is stable and uses extra space), heap sort is space-efficient but not stable.
  • Slower than quick sort in practice due to more data movement and poor cache behavior.

12.9 Radix sort

Principles

Radix Sort is a non-comparative sorting algorithm that sorts integers (or strings) by processing individual digits one at a time, either least significant digit first (LSD) or most significant digit first (MSD).

It leverages a stable sorting algorithm (typically Counting Sort) at each digit level.

Core Idea (LSD Version):

  1. Sort all numbers by the least significant digit (units place).
  2. Then sort again by the next digit (tens place) — but using a stable sort.
  3. Repeat until the most significant digit has been processed.

Since the sort at each digit is stable, the relative order established by earlier digits is preserved.

Why It Works

  • It avoids comparisons between elements.
  • For fixed-width integers, it's linear time.
  • Relies on the number of digits (not the number size) — great for bounded keys.

How It Works (Step-by-Step)

Given: [170, 45, 75, 90, 802, 24, 2, 66]

  1. Sort by units digit:
    [170, 90, 802, 2, 24, 45, 75, 66]
  2. Sort by tens digit:
    [802, 2, 24, 45, 66, 170, 75, 90]
  3. Sort by hundreds digit:
    [2, 24, 45, 66, 75, 90, 170, 802] ✅ Sorted!
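Each pass above is just a stable counting sort keyed on one digit, which can be checked in isolation; a sketch (names illustrative):

```java
import java.util.Arrays;

public class RadixPassDemo {
    // Stable counting sort on the digit selected by exp (1 = units, 10 = tens, ...)
    public static void countingPass(int[] arr, int exp) {
        int[] output = new int[arr.length];
        int[] count = new int[10];
        for (int v : arr) count[(v / exp) % 10]++;           // digit histogram
        for (int i = 1; i < 10; i++) count[i] += count[i - 1]; // cumulative positions
        for (int i = arr.length - 1; i >= 0; i--) {          // right-to-left keeps it stable
            int d = (arr[i] / exp) % 10;
            output[--count[d]] = arr[i];
        }
        System.arraycopy(output, 0, arr, 0, arr.length);
    }

    public static void main(String[] args) {
        int[] a = {170, 45, 75, 90, 802, 24, 2, 66};
        countingPass(a, 1);   // units digit
        System.out.println(Arrays.toString(a)); // [170, 90, 802, 2, 24, 45, 75, 66]
        countingPass(a, 10);  // tens digit
        System.out.println(Arrays.toString(a)); // [802, 2, 24, 45, 66, 170, 75, 90]
    }
}
```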

Java Code

public class RadixSort {

    // Main radix sort function
    public static void radixSort(int[] arr) {
        int max = getMax(arr);

        // Apply counting sort for every digit (exp = 1, 10, 100, ...)
        for (int exp = 1; max / exp > 0; exp *= 10) {
            countingSort(arr, exp);
        }
    }

    // Helper function to get maximum value
    private static int getMax(int[] arr) {
        int max = arr[0];
        for (int num : arr) {
            if (num > max) max = num;
        }
        return max;
    }

    // Counting sort adapted to work on a specific digit
    private static void countingSort(int[] arr, int exp) {
        int n = arr.length;
        int[] output = new int[n];  // output array
        int[] count = new int[10];  // digits 0–9

        // Count occurrences of digits
        for (int i = 0; i < n; i++) {
            int digit = (arr[i] / exp) % 10;
            count[digit]++;
        }

        // Convert count to cumulative count
        for (int i = 1; i < 10; i++) {
            count[i] += count[i - 1];
        }

        // Build output array (stable sort: right-to-left)
        for (int i = n - 1; i >= 0; i--) {
            int digit = (arr[i] / exp) % 10;
            output[count[digit] - 1] = arr[i];
            count[digit]--;
        }

        // Copy output back to arr
        System.arraycopy(output, 0, arr, 0, n);
    }
}

Time Complexity

Let:

  • n = number of elements
  • d = number of digits in the largest number
  • k = base of digits (usually 10)
| Case | Time Complexity |
| --- | --- |
| Best | O(n × d) |
| Average | O(n × d) |
| Worst | O(n × d) |

→ Linear for fixed d (e.g., 32-bit integers)

Space Complexity

  • O(n + k) → uses extra space for count[] and output[]

Stability

  • Stable

Counting sort (used internally) is stable, preserving the order of equal elements.

Adaptive

  • Not Adaptive

Always processes all digits regardless of input's sortedness.

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | O(n × d), linear for fixed-width integers |
| Space Complexity | O(n + k) |
| Stable | ✅ Yes |
| In-place | ❌ No (uses extra arrays) |
| Adaptive | ❌ No |
| Best Use Case | Large arrays of fixed-width integers (e.g., 32-bit) |

When to Use Radix Sort:

  • When sorting many integers or strings with bounded length
  • When comparison-based sorts (like Quick Sort) are hitting limits
  • Great for postal codes, phone numbers, IP addresses, etc.

Notes:

  • Can also be implemented with MSD (most significant digit first) — useful for strings
  • For small n, overhead may outweigh benefits — prefer insertion/merge

12.10 Tim sort

Principles

Tim Sort is a hybrid stable sorting algorithm, combining the strengths of:

  • Merge Sort (for performance on large datasets),
  • Insertion Sort (for performance on small or nearly sorted data).

It was invented by Tim Peters for Python and is also used in:

  • Python’s built-in sort (list.sort())
  • Java’s Arrays.sort() for non-primitives (Objects)

Core Idea:

  1. Divide the array into small chunks called "runs" (already sorted or sorted using insertion sort).
  2. Use insertion sort to sort small runs (e.g., length ≤ 32).
  3. Merge runs together using an optimized merge sort strategy.

It takes advantage of existing order in data (adaptive) and combines it with efficient merge techniques.

Java-Like Pseudocode (Simplified Tim Sort)

Java doesn’t expose its TimSort implementation in a simple way (it's in java.util.TimSort, deep in the internals), but here's a simplified version to demonstrate how Tim Sort works:

class TimSort {
    static final int RUN = 32;

    public static void insertionSort(int[] arr, int left, int right) {
        for (int i = left + 1; i <= right; i++) {
            int temp = arr[i];
            int j = i - 1;
            while (j >= left && arr[j] > temp) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = temp;
        }
    }

    public static void merge(int[] arr, int l, int m, int r) {
        int len1 = m - l + 1, len2 = r - m;
        int[] left = new int[len1];
        int[] right = new int[len2];
        System.arraycopy(arr, l, left, 0, len1);
        System.arraycopy(arr, m + 1, right, 0, len2);

        int i = 0, j = 0, k = l;
        while (i < len1 && j < len2) {
            arr[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < len1) arr[k++] = left[i++];
        while (j < len2) arr[k++] = right[j++];
    }

    public static void timSort(int[] arr) {
        int n = arr.length;

        // Step 1: Sort small runs using insertion sort
        for (int i = 0; i < n; i += RUN) {
            insertionSort(arr, i, Math.min((i + RUN - 1), (n - 1)));
        }

        // Step 2: Merge runs using merge sort logic
        for (int size = RUN; size < n; size *= 2) {
            for (int left = 0; left < n; left += 2 * size) {
                int mid = left + size - 1;
                int right = Math.min((left + 2 * size - 1), (n - 1));
                if (mid < right) {
                    merge(arr, left, mid, right);
                }
            }
        }
    }
}

Time Complexity

| Case | Time Complexity |
| --- | --- |
| Best | O(n) |
| Average | O(n log n) |
| Worst | O(n log n) |
  • Best case is linear O(n) when the array is already sorted (insertion sort kicks in).
  • Worst and average cases are O(n log n) due to merge sort.

Space Complexity

  • O(n) → due to auxiliary space used during merging.

Stability

  • Stable

It maintains the relative order of equal elements — essential for sorting objects with secondary keys.

Adaptive

  • Yes

Leverages existing sorted segments ("runs") in the input to reduce work.

Summary Table

| Property | Value |
| --- | --- |
| Time Complexity | Best: O(n), Avg/Worst: O(n log n) |
| Space Complexity | O(n) |
| Stable | ✅ Yes |
| In-place | ❌ No (uses extra space for merging) |
| Adaptive | ✅ Yes |
| Best Use Case | Real-world sorting of objects with possible partial order |

When to Use Tim Sort:

  • When you want stable, adaptive, and efficient sorting for real-world data.
  • Ideal for lists of objects (e.g., Java’s Arrays.sort(Object[]), Python’s sorted()).
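As a concrete usage example, Java's Arrays.sort for object arrays is documented as a stable, TimSort-derived merge sort, so equal keys keep their original relative order (the record values here are just an illustration):

```java
import java.util.Arrays;
import java.util.Comparator;

public class StableSortDemo {
    public static void main(String[] args) {
        // Sort by the leading digit only; the letter records the original position
        String[] records = {"2b", "1a", "2a", "1b"};
        Arrays.sort(records, Comparator.comparingInt((String s) -> s.charAt(0)));
        // "2b" stays ahead of "2a" — insertion order among equal keys is preserved
        System.out.println(Arrays.toString(records)); // [1a, 1b, 2b, 2a]
    }
}
```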

Real-world Use:

  • Python: list.sort() and sorted()
  • Java: Arrays.sort() for Object[]
  • Not used for int[] in Java (uses Dual-Pivot Quick Sort there)

12.11 Performance Comparison

Performance Comparison Table

| Algorithm | Best Case | Average Case | Worst Case | Space | Stable | In-Place | Adaptive | Real-World Use Case |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Selection | O(n²) | O(n²) | O(n²) | O(1) | ❌ | ✅ | ❌ | Educational, very small arrays |
| Insertion | O(n) | O(n²) | O(n²) | O(1) | ✅ | ✅ | ✅ | Small or nearly sorted arrays |
| Bubble | O(n) | O(n²) | O(n²) | O(1) | ✅ | ✅ | ✅ (with flag) | Educational, nearly sorted arrays |
| Shell | O(n log n) or better | O(n^1.25)–O(n^1.5) | O(n²) | O(1) | ❌ | ✅ | ❌ | Medium-sized arrays, space-constrained cases |
| Merge | O(n log n) | O(n log n) | O(n log n) | O(n) | ✅ | ❌ | ❌ | Linked lists, stable sort, large data |
| Quick | O(n log n) | O(n log n) | O(n²) | O(log n) | ❌ | ✅ | ❌ | Fastest in practice, general-purpose |
| Heap | O(n log n) | O(n log n) | O(n log n) | O(1) | ❌ | ✅ | ❌ | Real-time systems (predictable time) |
| Radix | O(nk) | O(nk) | O(nk) | O(n + k) | ✅ | ❌ | ❌ | Integer sorting, fixed-width keys |
| Tim Sort | O(n) | O(n log n) | O(n log n) | O(n) | ✅ | ❌ | ✅ | Python/Java built-in sorting |

Where:

  • n = number of elements
  • k = number of digits (in radix sort)
  • Stable = maintains relative order of equal elements
  • Adaptive = performs better on partially sorted data

Insights & Recommendations

When to Use Which:

  • Best All-Around (General Purpose):
    • Quick Sort (if stability not needed)
    • Merge Sort or Tim Sort (if stability is needed)
  • Nearly Sorted Input:
    • Insertion Sort or Tim Sort (adaptive algorithms)
  • Memory-Constrained Environments:
    • Heap Sort or Quick Sort (in-place, small stack)
  • Large Numbers with Fixed Digits (e.g., phone numbers):
    • Radix Sort
  • Educational/Teaching Algorithms:
    • Bubble, Selection, Insertion — good for learning basics