Why Learn Sorting Algorithms?
Sorting is one of the most fundamental operations in computer science. Understanding different sorting algorithms helps you choose the right one for your use case and builds a foundation for more complex algorithmic thinking.
Bubble Sort - The Simple One
Bubble sort repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. Simple but inefficient for large datasets.
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # If no swaps occurred, the array is already sorted
        if not swapped:
            break
    return arr
# Time: O(n^2) worst/avg, O(n) best
# Space: O(1)
# Stable: Yes

Selection Sort
Finds the minimum element and places it at the beginning, then repeats for the remaining unsorted portion.
def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
# Time: O(n^2) always
# Space: O(1)
# Stable: No

Insertion Sort
Builds the sorted array one item at a time. Efficient for small datasets and nearly sorted arrays.
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements one position right to make room for key
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
# Time: O(n^2) worst, O(n) best
# Space: O(1)
# Stable: Yes

Merge Sort - Divide and Conquer
Divides array into halves, sorts each half recursively, then merges them. Guaranteed O(n log n) performance.
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        # <= keeps equal elements in their original order (stability)
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
# Time: O(n log n) always
# Space: O(n)
# Stable: Yes

Quick Sort - The Fast One
Selects a pivot element, partitions the array around it, then recursively sorts the sub-arrays. Average O(n log n) but O(n^2) worst case.
def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        pivot_idx = partition(arr, low, high)
        quick_sort(arr, low, pivot_idx - 1)
        quick_sort(arr, pivot_idx + 1, high)
    return arr

def partition(arr, low, high):
    # Lomuto partition: last element is the pivot
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1
# Time: O(n log n) avg, O(n^2) worst
# Space: O(log n) recursion stack
# Stable: No (standard version)

Choosing the Right Algorithm
Use insertion sort for small arrays (n < 20) or nearly sorted data. Use merge sort when stability or a worst-case O(n log n) guarantee is required. Use quick sort for general-purpose sorting with good average performance. Python's built-in sort uses Timsort, a hybrid of merge sort and insertion sort, which gives O(n log n) worst-case time and is optimized for real-world data patterns such as already-sorted runs.
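In practice you rarely hand-roll these in Python; a quick illustration of the built-in sorting functions (the sample data here is made up for the example):

```python
data = [("banana", 3), ("apple", 3), ("cherry", 1)]

# sorted() returns a new list; list.sort() sorts in place.
by_count = sorted(data, key=lambda item: item[1])

# Timsort is stable: "banana" stays ahead of "apple" because
# they compare equal on the key and appeared in that order.
print(by_count)  # [('cherry', 1), ('banana', 3), ('apple', 3)]
```

Passing a `key` function sorts by a derived value without modifying the elements, and stability means you can chain sorts by successive keys.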
Comparison Table
Algorithm      | Time                    | Space    | Stable
Bubble Sort    | O(n^2)                  | O(1)     | Yes
Selection Sort | O(n^2)                  | O(1)     | No
Insertion Sort | O(n^2) worst, O(n) best | O(1)     | Yes
Merge Sort     | O(n log n) always       | O(n)     | Yes
Quick Sort     | O(n log n) avg          | O(log n) | No
Heap Sort      | O(n log n) always       | O(1)     | No
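Heap sort appears in the table but is not implemented above; a minimal sketch using Python's `heapq` module (note this version builds a separate output list, so it uses O(n) auxiliary space, unlike the in-place O(1) array variant the table describes):

```python
import heapq

def heap_sort(arr):
    heap = list(arr)
    heapq.heapify(heap)  # turn the copy into a min-heap in O(n)
    # Pop the minimum n times; each pop is O(log n)
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
```

The classic in-place variant instead builds a max-heap inside the array and repeatedly swaps the root to the end, which is where the O(1) space figure comes from.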