1. Bubble sort

The basic idea of Bubble Sorting is to compare the values of neighboring elements from front to back (starting with the elements at smaller indices) and swap them if they are in reverse order, so that larger values gradually move toward the back, like bubbles rising through water.

Optimization: during sorting, each element gradually approaches its final position. If a whole pass completes without a single swap, the sequence is already ordered, so a flag can be set during sorting to record whether any swap occurred, eliminating unnecessary passes.

```java
public static void bubbleAsc(int[] array) {
    int temp;
    for (int i = 0; i < array.length - 1; i++) {
        boolean flag = true; // assume the array is sorted until a swap occurs
        for (int j = 0; j < array.length - 1 - i; j++) {
            if (array[j] > array[j + 1]) { // neighbors in reverse order: swap
                flag = false;
                temp = array[j];
                array[j] = array[j + 1];
                array[j + 1] = temp;
            }
        }
        if (flag) { // no swap occurred in this pass, the sorting is complete
            break;
        }
    }
}
```

2. Selection sort

The basic idea of Selection Sorting is: on the first pass, select the minimum value from arr[0] ~ arr[n-1] and swap it with arr[0]; on the second pass, select the minimum from arr[1] ~ arr[n-1] and swap it with arr[1]; on the third pass, select the minimum from arr[2] ~ arr[n-1] and swap it with arr[2]; ...; on the i-th pass, select the minimum from arr[i-1] ~ arr[n-1] and swap it with arr[i-1]; ...; on the (n-1)-th pass, select the minimum from arr[n-2] ~ arr[n-1] and swap it with arr[n-2]. After n-1 passes in total, a sequence ordered from smallest to largest by sort key is obtained.

```java
public static void selectAsc(int[] array) {
    int min, minIndex;
    for (int i = 0; i < array.length - 1; i++) {
        min = array[i];
        minIndex = i;
        for (int j = i + 1; j < array.length; j++) {
            if (min > array[j]) { // found a smaller element, remember it
                min = array[j];
                minIndex = j;
            }
        }
        if (minIndex != i) { // swap only if a smaller element was found
            array[minIndex] = array[i];
            array[i] = min;
        }
    }
}
```

3. Insertion sort

The basic idea of Insertion Sorting is: treat the n elements to be sorted as an ordered list plus an unordered list. At the start, the ordered list contains only one element and the unordered list contains the other n-1 elements. During sorting, repeatedly take the first element from the unordered list, compare its sort key with the keys of the elements in the ordered list, and insert it at the appropriate position in the ordered list, forming a new, longer ordered list.

```java
public static void insertAsc(int[] array) {
    int insertValue, insertIndex;
    for (int i = 1; i < array.length; i++) {
        insertValue = array[i]; // element to insert into the ordered part
        insertIndex = i - 1;    // last index of the ordered part
        while (insertIndex >= 0 && array[insertIndex] > insertValue) {
            array[insertIndex + 1] = array[insertIndex]; // shift right
            insertIndex--;
        }
        if (insertIndex != i - 1) { // insert only if elements were shifted
            array[insertIndex + 1] = insertValue;
        }
    }
}
```

4. Shell sort

A potential problem with insertion sort: given the array arr = {2, 3, 4, 5, 6, 1}, inserting the number 1 (the minimum) requires five shifts before the final assignment. When the number to be inserted is small, the number of right-shifts grows noticeably, which hurts efficiency.
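To make that cost concrete, here is a small sketch (the class and method names are inventions of this illustration, not part of the article's code) that instruments insertion sort to count shifts on exactly that array:

```java
public class ShiftCount {
    // Insertion sort instrumented to count element right-shifts (illustration only).
    static int sortAndCountShifts(int[] array) {
        int shifts = 0;
        for (int i = 1; i < array.length; i++) {
            int insertValue = array[i];
            int insertIndex = i - 1;
            while (insertIndex >= 0 && array[insertIndex] > insertValue) {
                array[insertIndex + 1] = array[insertIndex];
                insertIndex--;
                shifts++;
            }
            array[insertIndex + 1] = insertValue;
        }
        return shifts;
    }

    public static void main(String[] args) {
        // {2, 3, 4, 5, 6, 1}: the five elements before 1 each shift right once
        System.out.println(sortAndCountShifts(new int[]{2, 3, 4, 5, 6, 1})); // 5
    }
}
```

All five shifts come from the single small element at the end, which is exactly the case Shell sort's diminishing gaps are designed to cheapen.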

Shell sort is a sorting algorithm proposed by Donald Shell in 1959. It is also an insertion sort: a more efficient variant of simple insertion sort, also known as diminishing increment sort.

The basic idea of Shell Sorting is: group the records by a fixed increment between their indices and sort each group with direct insertion sort. As the increment decreases, each group contains more and more records; when the increment shrinks to 1, the whole file is treated as a single group, and the algorithm terminates.

You only need to modify insertion sort slightly: wrap an outer loop that introduces the step size, and change the +1 / -1 parts of insertion sort to + / - the step size.

```java
public static void shellAsc(int[] array) {
    int insertValue, insertIndex;
    for (int stepSize = array.length / 2; stepSize > 0; stepSize /= 2) {
        // insertion sort within each group of elements stepSize apart
        for (int i = stepSize; i < array.length; i++) {
            insertValue = array[i];
            insertIndex = i - stepSize;
            while (insertIndex >= 0 && array[insertIndex] > insertValue) {
                array[insertIndex + stepSize] = array[insertIndex];
                insertIndex -= stepSize;
            }
            if (insertIndex != i - stepSize) {
                array[insertIndex + stepSize] = insertValue;
            }
        }
    }
}
```

5. Quicksort

Quick Sorting is an improvement on bubble sort. Its basic idea is: one partitioning pass splits the data to be sorted into two independent parts, such that every element of one part is smaller than every element of the other; then each part is sorted the same way. The whole process is recursive, and in the end the entire data set becomes an ordered sequence.

```java
public static void quickAsc(int[] array) {
    quickAsc(array, 0, array.length - 1);
}

private static void quickAsc(int[] array, int left, int right) {
    if (left >= right) {
        return;
    }
    int key = array[left], l = left, r = right; // pivot = leftmost element
    while (l < r) {
        while (l < r && array[r] >= key) { // scan from the right for an element smaller than the pivot
            r--;
        }
        array[l] = array[r];
        while (l < r && array[l] <= key) { // scan from the left for an element larger than the pivot
            l++;
        }
        array[r] = array[l];
    }
    array[l] = key; // the pivot lands in its final position
    quickAsc(array, left, l - 1);  // sort the part smaller than the pivot
    quickAsc(array, l + 1, right); // sort the part larger than the pivot
}
```

6. Merge sort

Merge Sorting uses the idea of merging. It adopts the classic divide-and-conquer strategy: divide the problem into small sub-problems and solve them recursively, then in the conquer phase "patch together" the answers obtained in the divide phase. In short, divide and conquer.

Note: the recursion structure resembles a complete binary tree. This article implements merge sort recursively (it can also be implemented iteratively). The divide phase can be understood as the process of recursively splitting the sequence into sub-sequences.

The divide phase is relatively simple: it performs no operations on the data.

In the conquer phase, we need to merge two ordered sub-sequences into one ordered sequence. For example, in the final merge, the two ordered sub-sequences [4,5,7,8] and [1,2,3,6] are merged into the final sequence [1,2,3,4,5,6,7,8]. Let's look at the implementation steps.

```java
public static void mergeAsc(int[] array) {
    mergeDivideAsc(array, 0, array.length - 1, new int[array.length]);
}

public static void mergeDivideAsc(int[] array, int left, int right, int[] tempArray) {
    if (left >= right) {
        return;
    }
    int mid = (left + right) / 2;
    mergeDivideAsc(array, left, mid, tempArray);         // divide the left half
    mergeDivideAsc(array, mid + 1, right, tempArray);    // divide the right half
    mergeConquerAsc(array, left, mid, right, tempArray); // merge the two halves
}

public static void mergeConquerAsc(int[] array, int left, int mid, int right, int[] tempArray) {
    int leftIndex = left, rightIndex = mid + 1, index = 0;
    // compare the heads of the two sub-sequences and copy the smaller one
    while (leftIndex <= mid && rightIndex <= right) {
        if (array[leftIndex] > array[rightIndex]) {
            tempArray[index++] = array[rightIndex++];
        } else {
            tempArray[index++] = array[leftIndex++];
        }
    }
    // copy whatever remains in the left sub-sequence
    while (leftIndex <= mid) {
        tempArray[index++] = array[leftIndex++];
    }
    // copy whatever remains in the right sub-sequence
    while (rightIndex <= right) {
        tempArray[index++] = array[rightIndex++];
    }
    // copy the merged result back into the original array
    index = 0;
    for (int i = left; i <= right; i++) {
        array[i] = tempArray[index++];
    }
}
```

7. Radix sort

Radix Sorting is also known as "Bucket Sorting" or Bin Sorting. As the name implies, it distributes the elements to be sorted into "buckets" according to the value of each digit of the key.

Radix sort was invented by Herman Hollerith in 1887. It is an efficient, stable sort and an extension of bucket sort. It works like this: integers are cut into separate digits by position, and the digits at each position are then compared.

The basic idea of radix sort: pad all values to be compared to the same digit length, filling the shorter numbers with leading zeros. Then sort pass by pass, starting with the lowest digit. After passes from the lowest digit to the highest, the sequence becomes an ordered sequence.

Description:

  • Radix sort is an extension of traditional bucket sort, which is very fast.

  • Radix sort is a classic space-for-time method, which occupies a large amount of memory. When sorting massive data, it is easy to cause OutOfMemoryError.

  • Radix sort is stable. Stability: assume the sequence of records to be sorted contains multiple records with equal keys. If the relative order of these records is unchanged after sorting — that is, if r[i] == r[j] and r[i] precedes r[j] in the original sequence, then r[i] still precedes r[j] in the sorted sequence — the sorting algorithm is said to be stable; otherwise it is called unstable.

  • We do not use radix sort for arrays with negative numbers. If we want to support negative numbers, see code.i-harness.com/zh-CN/q/e98… .

```java
public static void radixAsc(int[] array) {
    // define ten buckets, one per possible digit 0-9
    int[][] bucket = new int[10][array.length];
    // number of elements currently stored in each bucket
    int[] bucketSize = new int[10];
    for (int div = 1; ; div *= 10) {
        boolean zeroFlag = true;
        // distribute elements into buckets by the current digit
        for (int i = 0; i < array.length; i++) {
            int d = array[i] / div;
            if (d > 0) {
                zeroFlag = false;
            }
            int mod = d % 10;
            bucket[mod][bucketSize[mod]++] = array[i];
        }
        if (zeroFlag) { // every remaining digit is 0, the sorting is complete
            break;
        }
        // collect elements back from the buckets in order
        int index = 0;
        for (int i = 0; i < bucketSize.length; i++) {
            for (int j = 0; j < bucketSize[i]; j++) {
                array[index++] = bucket[i][j];
            }
            bucketSize[i] = 0; // reset the bucket for the next pass
        }
    }
}
```
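The negative-number limitation mentioned above can be worked around by offsetting. The sketch below is an editorial illustration, not the article's original code: it shifts every value by the minimum so all inputs become non-negative, runs the same LSD radix sort, then shifts back (assuming the offset does not overflow `int`).

```java
import java.util.Arrays;

public class RadixWithNegatives {
    // Sketch: support negatives by offsetting into the non-negative range.
    // Assumes max - min does not overflow int.
    public static void radixAscSigned(int[] array) {
        if (array.length == 0) {
            return;
        }
        int min = Arrays.stream(array).min().getAsInt();
        if (min < 0) {
            for (int i = 0; i < array.length; i++) {
                array[i] -= min; // shift into the non-negative range
            }
        }
        radixAsc(array); // the non-negative radix sort from this section
        if (min < 0) {
            for (int i = 0; i < array.length; i++) {
                array[i] += min; // shift back
            }
        }
    }

    // same as the radixAsc shown above, repeated here for self-containment
    static void radixAsc(int[] array) {
        int[][] bucket = new int[10][array.length];
        int[] bucketSize = new int[10];
        for (int div = 1; ; div *= 10) {
            boolean zeroFlag = true;
            for (int i = 0; i < array.length; i++) {
                int d = array[i] / div;
                if (d > 0) {
                    zeroFlag = false;
                }
                int mod = d % 10;
                bucket[mod][bucketSize[mod]++] = array[i];
            }
            if (zeroFlag) {
                break;
            }
            int index = 0;
            for (int i = 0; i < bucketSize.length; i++) {
                for (int j = 0; j < bucketSize[i]; j++) {
                    array[index++] = bucket[i][j];
                }
                bucketSize[i] = 0;
            }
        }
    }
}
```

This keeps the core sort untouched at the cost of two extra linear passes over the array.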

8. Heap sort

Heap sort is a sorting algorithm designed around the heap data structure. It is a selection sort; its worst, best, and average time complexity are all O(nlogn), and it is an unstable sort.

A heap is a complete binary tree with the following property: if the value of each node is greater than or equal to the values of its left and right children, it is called a big top heap (max-heap); if the value of each node is less than or equal to the values of its left and right children, it is called a small top heap (min-heap). Note: no relationship is required between a node's left child value and right child value.
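In the array representation of a complete binary tree, node i's children sit at indices 2i+1 and 2i+2, so the big top heap property can be checked directly. A small helper (hypothetical, written for this illustration):

```java
public class HeapCheck {
    // Returns true if the array satisfies the big top heap (max-heap) property:
    // every non-leaf node at index i is >= its children at 2*i+1 and 2*i+2.
    static boolean isMaxHeap(int[] array) {
        // non-leaf nodes occupy indices 0 .. array.length / 2 - 1
        for (int i = 0; i <= array.length / 2 - 1; i++) {
            int left = 2 * i + 1, right = 2 * i + 2;
            if (left < array.length && array[left] > array[i]) {
                return false;
            }
            if (right < array.length && array[right] > array[i]) {
                return false;
            }
        }
        return true;
    }
}
```

Note that {9, 6, 8, 2, 3} passes the check even though 6 < 8: siblings are unordered relative to each other, only parent-child pairs are constrained.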

Example of big top heap:

Small top heap example:

In general, large top heap is used for ascending and small top heap is used for descending.

The basic idea of Heap Sorting is: build the sequence to be sorted into a big top heap. At this point, the maximum of the whole sequence is the root at the top of the heap. Swap it with the last element, so the last element holds the maximum. Then rebuild the remaining n-1 elements into a big top heap, which brings the next-largest of the n elements to the root. Repeating this process produces an ordered sequence. Note that as the big top heap is rebuilt, the number of elements involved gets smaller and smaller, finally yielding an ordered sequence.

```java
public static void heapAsc(int[] array) {
    // build the initial big top heap, starting from the last non-leaf
    // node at index array.length / 2 - 1 and working back to the root
    for (int i = array.length / 2 - 1; i >= 0; i--) {
        heapAsc(array, i, array.length);
    }
    int temp;
    for (int i = 0; i < array.length - 1; i++) {
        // move the current maximum (the root) to the end of the unsorted part
        temp = array[0];
        array[0] = array[array.length - 1 - i];
        array[array.length - 1 - i] = temp;
        // rebuild the big top heap over the remaining elements
        heapAsc(array, 0, array.length - 1 - i);
    }
}

/**
 * Sifts the element at index down so the subtree rooted there is a big top heap.
 * Precondition: the left and right subtrees of index are already big top heaps;
 * only the root at index may violate the heap condition.
 * How is this precondition guaranteed? By building the heap from back to front.
 */
private static void heapAsc(int[] array, int index, int length) {
    int target = array[index];
    for (int i = index * 2 + 1; i < length; i = i * 2 + 1) {
        if (i + 1 < length && array[i + 1] > array[i]) {
            i++; // pick the larger of the two children
        }
        if (array[i] > target) { // the child is larger than the sifted value
            array[index] = array[i]; // move the child up
            index = i;               // continue sifting from the child
        } else {
            break; // otherwise the heap condition holds, exit
        }
    }
    array[index] = target; // place the sifted value in its final slot
}
```

9. Summary and comparison of common sorting algorithms

  • Stable: if a is in front of b and a == b, a is still in front of b after sorting;
  • Unstable: if a is in front of b and a == b, a may appear behind b after sorting;
  • Internal sort: All sort operations are done in memory;
  • External sort: Because the data is too large, the data is placed on disk, and the sort can only be carried out by disk and memory data transfer;
  • Time complexity: the amount of time an algorithm takes to execute;
  • Space complexity: the amount of memory required to run a program;
  • N: data scale;
  • K: number of buckets;
  • In-place: takes no extra memory;
  • Out-place: takes up extra memory.
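As a quick sanity check, each implementation in this article can be compared against java.util.Arrays.sort on random inputs. A minimal harness (the class and method names are inventions of this sketch; bubbleAsc from section 1 is inlined so the example is self-contained, and any of the other sorts could be substituted):

```java
import java.util.Arrays;
import java.util.Random;

public class SortCheck {
    // bubbleAsc as given in section 1
    static void bubbleAsc(int[] array) {
        int temp;
        for (int i = 0; i < array.length - 1; i++) {
            boolean flag = true;
            for (int j = 0; j < array.length - 1 - i; j++) {
                if (array[j] > array[j + 1]) {
                    flag = false;
                    temp = array[j];
                    array[j] = array[j + 1];
                    array[j + 1] = temp;
                }
            }
            if (flag) {
                break;
            }
        }
    }

    // Runs the sort on many random arrays and compares with Arrays.sort.
    static boolean agreesWithLibrarySort() {
        Random random = new Random(42); // fixed seed for reproducibility
        for (int trial = 0; trial < 100; trial++) {
            int[] a = random.ints(50, 0, 1000).toArray();
            int[] expected = a.clone();
            Arrays.sort(expected);
            bubbleAsc(a);
            if (!Arrays.equals(a, expected)) {
                return false;
            }
        }
        return true;
    }
}
```

Randomized differential testing like this catches off-by-one partition and merge bugs far faster than hand-picked examples.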