This is the 21st day of my participation in the Gwen Challenge. For details, see: Gwen Challenge 2021

| Algorithm | Best time | Average time | Worst time | Space | Stability |
| --- | --- | --- | --- | --- | --- |
| Direct insertion sort | O(n) | O(n^2) | O(n^2) | O(1) | stable |
| Shell sort | O(n) | O(n^1.3) | O(n^2) | O(1) | unstable |
| Selection sort | O(n^2) | O(n^2) | O(n^2) | O(1) | unstable |
| Heap sort | O(nlog2n) | O(nlog2n) | O(nlog2n) | O(1) | unstable |
| Bubble sort | O(n) | O(n^2) | O(n^2) | O(1) | stable |
| Quick sort | O(nlog2n) | O(nlog2n) | O(n^2) | O(log2n) | unstable |
| Merge sort | O(nlog2n) | O(nlog2n) | O(nlog2n) | O(n) | stable |

The table above compares the performance of the common sorting algorithms.

How to judge whether a sorting algorithm is stable:

If records with equal keys keep their relative order — that is, if r[i] = r[j] and r[i] comes before r[j] in the original sequence, and r[i] still comes before r[j] in the sorted sequence — then the sorting algorithm is stable. Put simply: if two elements a and b have equal keys and a comes before b, the algorithm is stable if a is still before b after sorting.

  • The best case for direct insertion sort is an already-ordered sequence: each element is inserted directly after the sorted part, so one comparison per element suffices, O(n). The worst case is a reversed sequence: every insertion must shift all previously sorted elements, looping up to i times for the i-th element, O(n^2).
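A minimal sketch of that shifting process (the function name and the use of `std::vector` are my own choices, not from the original):

```cpp
#include <cstddef>
#include <vector>

// Direct insertion sort: A[0..i-1] is already sorted; larger elements are
// shifted right and A[i] is dropped into its slot. Equal keys are never
// moved past each other, so the sort is stable.
void InsertSort(std::vector<int>& A) {
    for (std::size_t i = 1; i < A.size(); ++i) {
        int key = A[i];
        std::size_t j = i;
        while (j > 0 && A[j - 1] > key) { // best case: loop never runs -> O(n)
            A[j] = A[j - 1];              // worst case (reversed input) -> O(n^2)
            --j;
        }
        A[j] = key;
    }
}
```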

  • Selection sort

Each pass must scan the entire unsorted part to select its minimum and place it at the end of the sorted part, so the time is O(n^2) regardless of whether the sequence is already ordered.

  • Hill sort (reduced incremental sort)

It is unstable because, as the Shell sort process shows, each pass groups the elements that are one increment (gap) apart and insertion-sorts each group separately; elements with equal keys can fall into different groups, so their relative order may change.
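A sketch of the diminishing-increment scheme, using the common gap sequence n/2, n/4, …, 1 (the gap sequence and function name are my assumptions; the original does not specify them):

```cpp
#include <cstddef>
#include <vector>

// Shell sort: insertion sort applied to elements `gap` apart, with the gap
// halving each round. The final gap of 1 is a plain insertion sort on a
// nearly ordered sequence, which is what makes it fast in practice.
void ShellSort(std::vector<int>& A) {
    for (std::size_t gap = A.size() / 2; gap > 0; gap /= 2) {
        for (std::size_t i = gap; i < A.size(); ++i) {
            int key = A[i];
            std::size_t j = i;
            while (j >= gap && A[j - gap] > key) { // insertion sort within the gap group
                A[j] = A[j - gap];
                j -= gap;
            }
            A[j] = key;
        }
    }
}
```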

  • Bubble sort

When the data is already ordered, bubble sort with an exchange flag makes a single pass with no swaps and terminates, giving the best case O(n); when it is unordered, it needs up to n-1 full passes, O(n^2). Because it only swaps adjacent elements when the left one is strictly greater, equal elements never cross each other, so it is a stable algorithm.
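The exchange-flag variant described above can be sketched as follows (names are mine):

```cpp
#include <cstddef>
#include <utility> // std::swap
#include <vector>

// Bubble sort with an early-exit flag: a pass with no swaps means the
// sequence is sorted, so ordered input finishes after one pass -> O(n).
// Only strictly-greater neighbours are swapped, so the sort is stable.
void BubbleSort(std::vector<int>& A) {
    for (std::size_t i = 0; i + 1 < A.size(); ++i) {
        bool swapped = false;
        for (std::size_t j = 0; j + 1 < A.size() - i; ++j) {
            if (A[j] > A[j + 1]) {        // equal elements are never swapped
                std::swap(A[j], A[j + 1]);
                swapped = true;
            }
        }
        if (!swapped) break;              // no exchanges: already ordered
    }
}
```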

  • Quick sort

Its performance depends mainly on the choice of the pivot element. With a well-chosen pivot the two partitions are balanced and the time is O(nlog2n); with a poor pivot — for example, taking the first element of an already-sorted sequence — every partition is maximally lopsided and the time degrades to O(n^2).
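A sketch using the textbook first-element pivot, which exhibits exactly the sensitivity described above (function names and container are my choices):

```cpp
#include <vector>

// Partition A[low..high] around A[low]: smaller elements end up left of the
// pivot, larger ones right. The pivot choice drives the split balance.
int Partition(std::vector<int>& A, int low, int high) {
    int pivot = A[low];
    while (low < high) {
        while (low < high && A[high] >= pivot) --high;
        A[low] = A[high];                 // move a smaller element to the left side
        while (low < high && A[low] <= pivot) ++low;
        A[high] = A[low];                 // move a larger element to the right side
    }
    A[low] = pivot;                       // pivot lands in its final position
    return low;
}

// Balanced splits give O(nlog2n); sorted input with this pivot gives O(n^2).
void QuickSort(std::vector<int>& A, int low, int high) {
    if (low < high) {
        int p = Partition(A, low, high);
        QuickSort(A, low, p - 1);
        QuickSort(A, p + 1, high);
    }
}
```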

  • Heap sort
    void HeadAdjust(ElementType A[], int k, int len);  // forward declaration

    void BuildMaxHeap(ElementType A[], int len) {
        for (int i = len / 2; i > 0; i--) {   // from the last non-leaf node up to the root
            HeadAdjust(A, i, len);
        }
    }

    void HeadAdjust(ElementType A[], int k, int len) {
        A[0] = A[k];                          // A[0] holds the element being sifted down
        for (int i = 2 * k; i <= len; i *= 2) {
            if (i < len && A[i] < A[i + 1]) { // pick the larger of the two children
                i++;
            }
            if (A[i] <= A[0]) break;          // neither child is larger: stop
            else {
                A[k] = A[i];                  // promote the larger child
                k = i;
            }
        }
        A[k] = A[0];                          // drop the element into its final slot
    }

    void HeapSort(ElementType A[], int len) {
        BuildMaxHeap(A, len);
        for (int i = len; i > 1; i--) {
            swap(A[i], A[1]);                 // move the current maximum to the end
            HeadAdjust(A, 1, i - 1);          // restore the max-heap on A[1..i-1]
        }
    }

To analyze the code above: BuildMaxHeap builds the heap from the last non-leaf node (len/2) backwards to the root, and HeadAdjust sifts each node down by repeatedly promoting its larger child until neither child is larger. After the O(n) build, each of the n-1 extraction steps swaps the root to the end and re-adjusts a heap of height at most log2n, so the overall time complexity is O(nlog2n).
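For reference, here is a self-contained version of the routine above that compiles as ordinary C++; fixing `ElementType` to `int` and folding `BuildMaxHeap` into `HeapSort` are my simplifications (the original keeps the element type abstract):

```cpp
#include <utility>  // std::swap

typedef int ElementType;  // assumption: the original leaves ElementType abstract

// Sift A[k] down within the max-heap stored in A[1..len]; A[0] is scratch
// space, not data — the array is 1-indexed by convention.
void HeadAdjust(ElementType A[], int k, int len) {
    A[0] = A[k];
    for (int i = 2 * k; i <= len; i *= 2) {
        if (i < len && A[i] < A[i + 1]) i++;  // take the larger child
        if (A[i] <= A[0]) break;
        A[k] = A[i];
        k = i;
    }
    A[k] = A[0];
}

// Build the heap bottom-up (O(n)), then extract the maximum n-1 times,
// each extraction costing at most O(log2 n): O(nlog2n) overall.
void HeapSort(ElementType A[], int len) {
    for (int i = len / 2; i > 0; i--)
        HeadAdjust(A, i, len);
    for (int i = len; i > 1; i--) {
        std::swap(A[i], A[1]);
        HeadAdjust(A, 1, i - 1);
    }
}
```

Note that callers must reserve A[0]: sorting five elements means passing an array of six slots with the data in A[1..5].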