This is the 9th day of my participation in the Gwen Challenge.

Sorting

Bubble sort, insertion sort, selection sort, merge sort, quicksort, counting sort, radix sort, bucket sort

Execution efficiency of sorting algorithms

  1. Best-case, worst-case, and average-case time complexity

    Also state what the data to be sorted looks like in the best and worst cases.

  2. Coefficients, constants, and lower-order terms of the time complexity

  3. Number of comparisons and number of swaps (or moves)

    A comparison-based sorting algorithm performs two kinds of operations: comparing the sizes of elements, and swapping (or moving) elements. So when analyzing the efficiency of a sorting algorithm, we should also take the number of comparisons and the number of swaps (or moves) into account.

In-place sorting. An in-place sorting algorithm is one whose space complexity is O(1).

Stability of sorting algorithms

Stable sorting algorithms vs. unstable sorting algorithms

Bubble sort is an in-place sorting algorithm with a space complexity of O(1), and it is a stable sorting algorithm. Its best-case time complexity is O(n): when the data to be sorted is already ordered, only one bubbling pass is needed. In the worst case the data is in exactly reverse order and n bubbling passes are required, so the worst-case time complexity is O(n²). The average time complexity is O(n²).
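A minimal sketch of bubble sort matching the description above; the early-exit flag is what gives the O(n) best case on already-sorted input:

```python
def bubble_sort(a):
    """Bubble sort: repeatedly swap adjacent out-of-order pairs. Stable, in-place."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):      # the last i elements are already in place
            if a[j] > a[j + 1]:         # strict > preserves the order of equal elements
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                 # no swaps: already sorted, stop early (best case O(n))
            break
    return a
```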

Insertion sort is an in-place sorting algorithm with a space complexity of O(1), and it is a stable sorting algorithm. The core idea is to take an element from the unsorted interval, find the appropriate position for it in the sorted interval, and insert it there, so that the sorted interval always stays ordered. Initially the sorted interval contains only one element: the first element of the array. The best-case time complexity of insertion sort is O(n); note that this requires scanning the sorted interval from its end backwards. If the array is in reverse order, each insertion is equivalent to inserting new data at the first position of the array, so a large amount of data must be moved, and the worst-case time complexity is O(n²). The average time complexity is O(n²).
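A sketch of insertion sort as described: the sorted prefix grows by one element per step, and the prefix is scanned from its end backwards:

```python
def insertion_sort(a):
    """Insertion sort: grow a sorted prefix by inserting each new element. Stable, in-place."""
    for i in range(1, len(a)):          # a[:i] is the sorted interval
        value = a[i]
        j = i - 1
        # shift larger elements one slot right; strict > keeps equal elements stable
        while j >= 0 and a[j] > value:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = value
    return a
```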

Selection sort is an in-place sorting algorithm with a space complexity of O(1), but it is an unstable sorting algorithm. The idea is somewhat similar to insertion sort in that the array is again divided into a sorted and an unsorted interval, but selection sort finds the smallest element in the unsorted interval each time and places it at the end of the sorted interval. Its best-case, worst-case, and average time complexity are all O(n²).
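A sketch of selection sort; the long-distance swap in the last line is what can reorder equal elements and makes the algorithm unstable:

```python
def selection_sort(a):
    """Selection sort: move the minimum of the unsorted part to the end of the sorted part."""
    n = len(a)
    for i in range(n - 1):              # a[:i] is the sorted interval
        min_idx = i
        for j in range(i + 1, n):       # scan the unsorted interval for the minimum
            if a[j] < a[min_idx]:
                min_idx = j
        # long-distance swap: may jump an element past equal keys, breaking stability
        a[i], a[min_idx] = a[min_idx], a[i]
    return a
```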

Although bubble sort and insertion sort have the same time complexity, both O(n²), a data swap in bubble sort costs more than a data move in insertion sort, so if we want to squeeze out performance we should choose insertion sort (which can be further optimized into Shell sort).

Merge sort is not an in-place sorting algorithm, because its merge function needs extra storage to merge two ordered arrays into one. It is a stable sorting algorithm; its best-case, worst-case, and average time complexity are all O(nlogn), and its space complexity is O(n).
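A sketch of merge sort; the auxiliary `merged` list built during the merge step is the O(n) extra space that keeps it from being in-place:

```python
def merge_sort(a):
    """Merge sort: split, sort each half recursively, merge. Stable, O(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # merge two sorted halves into an auxiliary list (the O(n) extra space)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:         # <= takes from the left half first: stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```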

Quicksort is an in-place sorting algorithm, but not a stable one. Its best-case time complexity is O(nlogn), when every partition splits the data evenly; its worst-case time complexity is O(n²), when partitions are extremely unbalanced; its average time complexity is O(nlogn).
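A sketch of in-place quicksort using the Lomuto partition scheme (one common choice; the original text does not specify a scheme), with the last element as pivot:

```python
def quick_sort(a, low=0, high=None):
    """In-place quicksort, Lomuto partition with the last element as pivot. Not stable."""
    if high is None:
        high = len(a) - 1
    if low < high:
        pivot = a[high]
        i = low                          # boundary: a[low:i] holds elements < pivot
        for j in range(low, high):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]  # swaps can reorder equal elements (unstable)
                i += 1
        a[i], a[high] = a[high], a[i]    # place the pivot in its final position
        quick_sort(a, low, i - 1)        # recurse on both sides of the pivot
        quick_sort(a, i + 1, high)
    return a
```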

Bucket sort is well suited to external sorting. External sorting means the data is stored on external disk, the amount of data is large, and memory is limited, so the data cannot all be loaded into memory at once.
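An in-memory sketch of the bucketing idea; for true external sorting each bucket would be a separate file on disk rather than a list, but the distribute-then-sort-each-bucket structure is the same. The bucket count of 10 is an arbitrary choice for illustration:

```python
def bucket_sort(a, bucket_count=10):
    """Bucket sort: distribute values into ranges, sort each bucket, concatenate."""
    if not a:
        return a
    lo, hi = min(a), max(a)
    width = (hi - lo) / bucket_count or 1   # guard against all-equal input (width 0)
    buckets = [[] for _ in range(bucket_count)]
    for x in a:
        # clamp so the maximum value lands in the last bucket
        idx = min(int((x - lo) / width), bucket_count - 1)
        buckets[idx].append(x)
    result = []
    for b in buckets:                        # buckets are ordered by value range,
        result.extend(sorted(b))             # so sorted buckets concatenate in order
    return result
```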

Counting sort is a special case of bucket sort. When the range of the n numbers to be sorted is small, say the maximum is k, we can divide the data into k buckets, where all values in a bucket are equal, which removes the cost of sorting within buckets. The time complexity of counting sort is O(n).

Counting sort can only be used in scenarios where the data range is small. If the data range k is much larger than the number of elements n to be sorted, counting sort is not suitable. Moreover, counting sort can only sort non-negative integers; data of any other type must first be converted to non-negative integers without changing the relative order of the elements.
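A sketch of counting sort for non-negative integers in the range [0, k], matching the constraints above; filling the output from the back is what makes it stable:

```python
def counting_sort(a, k):
    """Counting sort for non-negative integers in [0, k]. Stable, O(n + k)."""
    counts = [0] * (k + 1)
    for x in a:                          # tally each value ("bucket" of equal values)
        counts[x] += 1
    for v in range(1, k + 1):            # prefix sums: counts[v] = # of elements <= v
        counts[v] += counts[v - 1]
    out = [0] * len(a)
    for x in reversed(a):                # back-to-front keeps equal keys in order
        counts[x] -= 1
        out[counts[x]] = x
    return out
```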

Radix sort
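The section on radix sort ends here, so as a placeholder, a sketch of the standard least-significant-digit (LSD) variant for non-negative integers: sort by each decimal digit in turn, using a stable distribution pass so earlier digit orderings are preserved:

```python
def radix_sort(a):
    """LSD radix sort for non-negative integers: repeated stable passes, one per digit."""
    if not a:
        return a
    exp = 1                              # current digit weight: 1, 10, 100, ...
    while max(a) // exp > 0:
        # stable counting-style distribution on the current decimal digit
        buckets = [[] for _ in range(10)]
        for x in a:
            buckets[(x // exp) % 10].append(x)
        a = [x for b in buckets for x in b]
        exp *= 10
    return a
```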