Code:
class MergeSort:
    def sort(self, arr):
        # Base case: a list of length 0 or 1 is already sorted.
        if len(arr) <= 1:
            return arr
        # Divide: split the list in half and sort each half recursively.
        mid = len(arr) // 2
        arr_left = self.sort(arr[:mid])
        arr_right = self.sort(arr[mid:])
        # Conquer: merge the two sorted halves back into arr in place.
        i, j, k = 0, 0, 0
        while i < len(arr_left) and j < len(arr_right):
            if arr_left[i] < arr_right[j]:
                arr[k] = arr_left[i]
                i += 1
            else:
                arr[k] = arr_right[j]
                j += 1
            k += 1
        # Copy any remaining elements from the left half.
        while i < len(arr_left):
            arr[k] = arr_left[i]
            i += 1
            k += 1
        # Copy any remaining elements from the right half.
        while j < len(arr_right):
            arr[k] = arr_right[j]
            j += 1
            k += 1
        return arr


if __name__ == "__main__":
    print(MergeSort().sort([3, 4, 5, 2, 1]))
Idea:
Divide and Conquer.
Let N be the length of the list. The time complexity is O(N log N) and the space complexity is O(N + log N) = O(N). The O(log N) term is the depth of the recursion stack from the calls to self.sort, and the O(N) term is the extra space for the two new sublists arr_left and arr_right.
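To see where the O(N log N) time bound comes from: each call does \Theta(N) work to merge and recurses on two halves, giving the standard divide-and-conquer recurrence

T(N) = 2\,T(N/2) + \Theta(N), \qquad T(1) = \Theta(1).

At recursion depth k there are 2^k calls on sublists of size N/2^k, so every level costs \Theta(N) in total; with about \log_2 N levels, T(N) = \Theta(N \log N).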
If the input is a linked list, merge sort has the same O(N log N) time complexity, but the space complexity drops to O(log N), which is purely the recursion stack, since the merge can relink existing nodes instead of allocating new arrays. See: http://stackoverflow.com/questions/24171242/why-is-mergesort-space-complexity-ologn-with-linked-lists
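Below is a minimal sketch of that linked-list variant, assuming a simple hypothetical ListNode class with val and next fields (not defined in the code above). The merge step only relinks existing nodes, so the space that grows with N is just the O(log N) recursion stack.

class ListNode:
    # Hypothetical minimal singly linked list node, for illustration only.
    def __init__(self, val, next=None):
        self.val = val
        self.next = next


def merge_sort_list(head):
    # Base case: an empty list or a single node is already sorted.
    if head is None or head.next is None:
        return head
    # Split the list in two halves using slow/fast pointers.
    slow, fast = head, head.next
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
    right = slow.next
    slow.next = None  # cut the list after the midpoint
    left = merge_sort_list(head)
    right = merge_sort_list(right)
    # Merge by relinking the existing nodes; only a temporary dummy head is created.
    dummy = tail = ListNode(0)
    while left is not None and right is not None:
        if left.val < right.val:
            tail.next, left = left, left.next
        else:
            tail.next, right = right, right.next
        tail = tail.next
    tail.next = left if left is not None else right
    return dummy.next


if __name__ == "__main__":
    head = ListNode(3, ListNode(4, ListNode(5, ListNode(2, ListNode(1)))))
    node = merge_sort_list(head)
    while node is not None:
        print(node.val)
        node = node.next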