Logic Overview

Quicksort is a divide-and-conquer algorithm that sorts an array by partitioning it into two sub-arrays, then sorting the sub-arrays recursively. The key to its efficiency is the choice of the pivot - an element around which the partition is performed.

Pivot: The choice of pivot strongly affects performance. A good pivot splits the list into two roughly equal halves, leading to a balanced recursion tree. The middle element is often chosen as the pivot to avoid degraded performance on already sorted or nearly sorted data.

Low and High Pointers: These pointers represent the current bounds of the sub-array being sorted.

Partitioning: This is the process of rearranging the elements based on the pivot. Elements less than the pivot move to the left, and those greater move to the right.
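
To make this concrete, here is a minimal, out-of-place sketch of a single partition step on the example array used in the next section (the variable names are illustrative; the in-place routine in the Code section follows the same rule but rearranges elements within the original array):

arr = [3, 2, 8, 5, 1, 7, 6, 4]

pivot = arr[(len(arr) - 1) // 2]           # middle element by position -> 5

smaller = [x for x in arr if x < pivot]    # [3, 2, 1, 4]
larger  = [x for x in arr if x > pivot]    # [8, 7, 6]

# The pivot now sits in its final position between the two halves.
# Note: the order of elements inside each half depends on the partition
# routine; the in-place version shown later produces a different internal order.
print(smaller + [pivot] + larger)          # [3, 2, 1, 4, 5, 8, 7, 6]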

How it works

Let's take an example and walk through how it works.

            [3, 2, 8, 5, 1, 7, 6, 4]   Indices
             l        p           h    l = 0
                                       p = 3
                      │                h = 7
                      │ First Partitioning
                      │ Elements < pivot go to LHS
                      │ Elements > pivot go to RHS
            ┌─────────┴──────────┐
            │                    │

Indices [3, 2, 4, 1]         [7, 6, 8] Indices
l = 0    l  p     h           l  p  h  l = 0
p = 1                                  p = 1
h = 3       │                    │     h = 2
            │                    │
            │                    │
     ┌──────┴──────┐      ┌──────┴──────┐
     │             │      │             │

    [1]           [4,3]  [7]           [8]
                   p

              ┌────┴────┐
              │         │
             [3]       [4]

    Base Case: Arrays with 1 element are sorted

    Final Sorted Array:[1, 2, 3, 4, 5, 6, 7, 8]

Initial Setup:

  • Start with the array [3, 2, 8, 5, 1, 7, 6, 4].
  • Choose the middle element as the pivot; here it is 5 (middle in terms of position, not value).
  1. First Partition:
    • Rearrange elements to move those less than 5 to the left and those greater to the right, resulting in the two partitions [3, 2, 4, 1] and [7, 6, 8] shown in the diagram above.
    • The pivot 5 is now in its correct position in the sorted array.
  2. Recursion on Sub-arrays:
    • The sub-arrays created on either side of the pivot 5 are not necessarily sorted at this stage.
    • We now apply the same partitioning logic to these sub-arrays.
  3. Recursive Application:
    • Take the left sub-array [3, 2, 4, 1]; its middle element, 2, becomes the pivot.
    • Partition it into [1] and [4, 3] around pivot 2. Pivot 2 is now correctly positioned.
    • For the right sub-array [7, 6, 8], the middle element 6 is the pivot. Partitioning places 6 in its correct position, leaving 7 and 8 for the remaining recursive calls.
  4. Further Recursion:
    • Continue to recursively apply this partitioning logic.
    • Each recursive call will select a pivot and partition that sub-array.
    • This is repeated until sub-arrays are of size 1 or empty, at which point they are considered sorted.
  5. Completion:
    • With each recursive step, pivots are placed in their correct positions, and the sub-arrays get smaller.
    • The process naturally orders the entire array as each element finds its correct position.
    • The recursion bottoms out when all sub-arrays are sorted, leaving us with the sorted array [1, 2, 3, 4, 5, 6, 7, 8].

In essence, after each partitioning pass the pivot is placed in its correct position, while the elements to its left and right are not yet guaranteed to be in order. We therefore apply the partitioning process recursively to the left and right sub-arrays, and the elements are progressively sorted until the base condition of the recursion is met (sub-array length <= 1), at which point the entire array is sorted.
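
As a companion to the walkthrough above, here is a compact, out-of-place sketch of the same recursive scheme (it allocates new lists at every level, so it is only a conceptual aid; the in-place implementation used in this article follows in the Code section):

def quicksort_sketch(nums):
    # Base case: sub-arrays with zero or one element are already sorted
    if len(nums) <= 1:
        return nums

    pivot = nums[(len(nums) - 1) // 2]       # middle element by position

    smaller = [x for x in nums if x < pivot]
    equal   = [x for x in nums if x == pivot]
    larger  = [x for x in nums if x > pivot]

    # The pivot (and any duplicates of it) ends up between the two sorted halves
    return quicksort_sketch(smaller) + equal + quicksort_sketch(larger)

print(quicksort_sketch([3, 2, 8, 5, 1, 7, 6, 4]))   # [1, 2, 3, 4, 5, 6, 7, 8]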

Code

def sort(nums, low, high):
    """Sort nums[low..high] in place using quicksort with a middle-element pivot."""
    # Base case: a sub-array of zero or one element is already sorted
    if low >= high:
        return

    l = low
    h = high
    m = l + (h - l) // 2    # index of the middle element
    pivot = nums[m]         # pivot value: the middle element by position

    # Partition: move elements smaller than the pivot to the left side
    # and elements larger than the pivot to the right side
    while l <= h:

        # skip over elements that are already on the correct side
        while nums[l] < pivot:
            l += 1
        while nums[h] > pivot:
            h -= 1

        # both pointers are stuck on out-of-place elements: swap them and move inward
        if l <= h:
            nums[l], nums[h] = nums[h], nums[l]
            l += 1
            h -= 1

    # the pointers have crossed: nums[low..h] holds the smaller elements and
    # nums[l..high] the larger ones, so the two partitions can be sorted independently
    sort(nums, low, h)
    sort(nums, l, high)

# Test the sort function
arr = [3, 2, 8, 5, 1, 7, 6, 4]
sort(arr, 0, len(arr) - 1)
print(arr)   # [1, 2, 3, 4, 5, 6, 7, 8]
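
A few extra checks (the test values here are made up for illustration) confirm that the routine also handles empty input, duplicates, and already sorted or reverse-sorted data:

for test in ([], [1], [2, 2, 2], [1, 2, 3, 4, 5], [5, 4, 3, 2, 1], [3, 1, 2, 3, 1]):
    sort(test, 0, len(test) - 1)
    assert test == sorted(test), test
print("all checks passed")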

Complexity Analysis

Time Complexity:

  1. Best Case and Average Case (O(n log n)):
    • In the best case, partitioning divides the array into two equal parts. The time complexity is therefore described by the recurrence relation T(n) = 2T(n/2) + O(n), which solves to O(n log n).
    • The average case is also O(n log n), which assumes that the partitions are reasonably balanced. In practice, the average case is quite close to the best case.
  2. Worst Case (O(n^2)):
    • The worst case occurs when the partitioning routine produces one subproblem with n-1 elements and one with 0 elements. This can happen if the pivot is the smallest or largest element in the list, leading to an unbalanced partition. In this case, the recurrence relation is T(n) = T(n-1) + T(0) + O(n), which simplifies to T(n) = T(n-1) + O(n). When solved, this gives a time complexity of O(n^2).
    • The worst-case behavior occurs when the pivot is always the smallest or largest element in the sub-array, for example when the first or last element is used as the pivot on an already sorted or reverse-sorted array. Choosing the middle element, as this implementation does, avoids that particular trap, though carefully constructed inputs can still trigger the worst case. This is why choosing a good pivot is crucial for efficiency and why many implementations use heuristics (such as a random or median-of-three pivot) to avoid it. The small experiment below illustrates the contrast.
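
One way to see the difference is to count element comparisons. The sketch below reuses the partitioning scheme from the Code section, with an optional first_element_pivot flag (an illustrative variant, not part of the implementation above) that reproduces the worst case on already sorted input:

import random

def count_comparisons(nums, low, high, first_element_pivot=False):
    # Sorts nums[low..high] in place (same scheme as above) and returns a rough
    # count of element comparisons. first_element_pivot=True mimics a naive
    # pivot choice so the worst case described above can be reproduced.
    if low >= high:
        return 0

    l, h = low, high
    pivot = nums[low] if first_element_pivot else nums[l + (h - l) // 2]
    comparisons = 0

    while l <= h:
        while nums[l] < pivot:
            comparisons += 1
            l += 1
        comparisons += 1                    # final (failed) check of the left scan
        while nums[h] > pivot:
            comparisons += 1
            h -= 1
        comparisons += 1                    # final (failed) check of the right scan
        if l <= h:
            nums[l], nums[h] = nums[h], nums[l]
            l += 1
            h -= 1

    return (comparisons
            + count_comparisons(nums, low, h, first_element_pivot)
            + count_comparisons(nums, l, high, first_element_pivot))

n = 300
print(count_comparisons(list(range(n)), 0, n - 1, first_element_pivot=True))   # roughly n*n/2: sorted input, worst-case pivot
print(count_comparisons(random.sample(range(n), n), 0, n - 1))                 # roughly n*log2(n): middle pivot, random input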

Space Complexity:

  1. Best Case and Average Case (O(log n)):
    • Space complexity is mainly due to the recursive nature of quicksort. In the best and average cases, the recursion depth is about log n, and each active call keeps a frame on the call stack, so the auxiliary space is O(log n).
  2. Worst Case (O(n)):
    • In the worst case, the partitions are maximally unbalanced, the recursion depth grows to n, and the space complexity becomes O(n). The sketch below shows one way to measure the depth in practice.
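
As an illustrative check of the stack usage, the helper below (made up for this measurement, not part of the implementation above) threads a depth counter through the same middle-pivot routine and reports the deepest recursion level reached:

import random

def max_depth(nums, low, high, depth=1):
    # Same partitioning as in the Code section, but returns the deepest recursion level reached
    if low >= high:
        return depth

    l, h = low, high
    pivot = nums[l + (h - l) // 2]

    while l <= h:
        while nums[l] < pivot:
            l += 1
        while nums[h] > pivot:
            h -= 1
        if l <= h:
            nums[l], nums[h] = nums[h], nums[l]
            l += 1
            h -= 1

    return max(max_depth(nums, low, h, depth + 1),
               max_depth(nums, l, high, depth + 1))

n = 1024
print(max_depth(random.sample(range(n), n), 0, n - 1))   # typically a small multiple of log2(n) = 10, far below n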