One: What is sorting

Suppose a sequence of n records is {R1, R2, …, Rn} and its corresponding keyword (key) sequence is {K1, K2, …, Kn}. These keys can be compared with each other, that is, there exists an ordering relation Kp1 ≤ Kp2 ≤ … ≤ Kpn among them. Rearranging the records according to this relation into the sequence {Rp1, Rp2, …, Rpn} is called sorting. In short, sorting arranges a sequence of data from smallest to largest (or from largest to smallest).

Two: algorithm stability

Let Ki = Kj (1 ≤ i ≤ n, 1 ≤ j ≤ n, i ≠ j), and suppose Ri comes before Rj (that is, i < j) in the sequence before sorting. If Ri still comes before Rj in the sorted sequence, the sorting method used is stable; otherwise, the sorting method is said to be unstable.
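A minimal illustration using the JDK's stable object sort (the record values below are invented for the example): the two records sharing the key 5 must keep their original relative order.

import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
	public static void main(String[] args) {
		// "5a" and "5b" share the same key 5; a stable sort must keep "5a" before "5b"
		String[] records = {"5a", "3x", "5b", "1y"};
		// Arrays.sort on object arrays uses a stable merge sort (TimSort),
		// so sorting by the numeric key alone preserves the original order of equal keys
		Arrays.sort(records, Comparator.comparingInt((String s) -> s.charAt(0) - '0'));
		System.out.println(Arrays.toString(records)); // [1y, 3x, 5a, 5b]
	}
}

An unstable sort such as selection sort may swap "5a" past "5b" and break this order, which is why it is classified as unstable below.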

== Common sorting algorithms are as follows ==

😊 Simple sort

One: selection sort

Basic idea:

First, n-1 key comparisons are performed to find the record with the smallest key among the n records, and it is swapped with the first record. Then, through n-2 comparisons, the record with the smallest key is found among the remaining n-1 records and swapped with the second record, and so on. It is the simplest and least practical sorting algorithm (time complexity O(n²), extra space complexity O(1), unstable).

public class SelectionSort {
	public static void selectSort(int[] arrs) {
		long startTime = System.nanoTime(); // Get the start time

		// Return if the array is null or has fewer than two elements
		if (arrs == null || arrs.length < 2) {
			return;
		}
		// i scans 0..length-2; for each i, j scans i+1..length-1 to find the minimum
		for (int i = 0; i < arrs.length - 1; i++) {
			int minIndex = i;
			for (int j = i + 1; j < arrs.length; j++) {
				minIndex = arrs[j] < arrs[minIndex] ? j : minIndex;
			}
			if (i != minIndex) {
				swap(arrs, i, minIndex);
			}
		}
		long endTime = System.nanoTime(); // Get the end time
		System.out.println("Running time: " + (endTime - startTime) + " nanoseconds");
	}

	public static void swap(int[] arrs, int i, int minIndex) {
		int temp = arrs[i];
		arrs[i] = arrs[minIndex];
		arrs[minIndex] = temp;
	}

	// Demo test
	public static void main(String[] args) {
		int[] arrs = {5, 2, 3, 1};
		selectSort(arrs);
		for (int x : arrs) {
			System.out.print(x + "\t");
		}
	}
}

Two: bubble sort

Basic idea:

Traverse the elements to be sorted, compare adjacent elements in turn, and swap them if they are in the wrong order (for example, the left one is larger when sorting ascending); in this way the larger (or smaller) elements gradually move toward the end (or front). Time complexity O(N²), extra space O(1), stable.

public class BubbleSort {
	public static void bubbleSort(int[] nums) {
		// Return if the array is null or has fewer than two elements
		if (nums == null || nums.length < 2) {
			return;
		}
		// Each pass of the outer loop bubbles one maximum into position i
		for (int i = nums.length - 1; i > 0; i--) {
			for (int y = 0; y < i; y++) {
				if (nums[y] > nums[y + 1]) {
					// XOR swap: only safe because y and y+1 are different memory locations
					nums[y] = nums[y] ^ nums[y + 1];
					nums[y + 1] = nums[y] ^ nums[y + 1];
					nums[y] = nums[y] ^ nums[y + 1];
				}
			}
		}
	}

	// Test bubble sort
	public static void main(String[] args) {
		int[] arr = {2, 4, 1, 6, 7, 5};
		bubbleSort(arr);
		for (int x : arr) {
			System.out.print(x + "\t");
		}
	}
}

Three: insertion sort

Basic idea:

The first record of the sequence is regarded as an ordered subsequence, and then the remaining records, starting from the second, are inserted into it one by one until the whole sequence is ordered.

It works best on arrays that are already nearly ordered, and it is stable. Time complexity is O(N²) and extra space is O(1).

public class InsertionSort {
	public static int[] insertionSort(int[] arrays) {
		// Return if the array is null or has fewer than two elements
		if (arrays == null || arrays.length < 2) {
			return arrays;
		}
		// Grow the sorted prefix: 0~0, 0~1, 0~2...
		for (int i = 1; i < arrays.length; i++) {
			// Move arrays[i] left until 0~i is ordered; 0~i-1 is already ordered
			for (int y = i; y > 0; y--) {
				if (arrays[y] < arrays[y - 1]) {
					// XOR swap of the two adjacent elements
					arrays[y] = arrays[y] ^ arrays[y - 1];
					arrays[y - 1] = arrays[y] ^ arrays[y - 1];
					arrays[y] = arrays[y] ^ arrays[y - 1];
				}
			}
		}
		return arrays;
	}

	public static void main(String[] args) {
		int[] arrays = {1, 5, 4, 2, 6, 4, 2};
		arrays = insertionSort(arrays);
		for (int x : arrays) {
			System.out.print(x + "\t");
		}
	}
}

😗 Summary of simple sorting algorithms

Similarities: time complexity O(N²), extra space complexity O(1).

Bubble sort: rarely used in practice, too slow. Selection sort: rarely used, unstable. Insertion sort: relatively efficient when the sample is small and nearly ordered.

😉 Advanced sort

Four: heap sort

Basic idea:

Heap sort (English: heapsort) is a sorting algorithm designed around the heap data structure.

  • A heap is an approximately complete binary tree structure that simultaneously satisfies the properties of a heap:
  • That is, the key value (or index) of a child node is always smaller than (or greater than) that of its parent
  • Time complexity: O(N*logN)
  • Stability: unstable

A complete binary tree is a big root (max) heap if the maximum value of every subtree sits at its root; it is a small root (min) heap if the minimum value of every subtree sits at its root. The priority queue structure is essentially a heap structure.
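As an aside, the JDK already provides a heap in the form of java.util.PriorityQueue, which is a small root heap by default; a minimal sketch, separate from the hand-written heap sort below:

import java.util.Comparator;
import java.util.PriorityQueue;

public class HeapDemo {
	public static void main(String[] args) {
		// java.util.PriorityQueue is a binary heap; by default it is a small root (min) heap
		PriorityQueue<Integer> minHeap = new PriorityQueue<>();
		// Passing a reversed comparator turns it into a big root (max) heap
		PriorityQueue<Integer> maxHeap = new PriorityQueue<>(Comparator.reverseOrder());
		for (int x : new int[]{5, 1, 4, 2}) {
			minHeap.offer(x);
			maxHeap.offer(x);
		}
		System.out.println(minHeap.peek()); // 1, the minimum is at the top
		System.out.println(maxHeap.peek()); // 5, the maximum is at the top
	}
}

PriorityQueue's offer and poll are O(logN), matching the heapInsert and heapify operations implemented by hand below.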

Heap sort procedure:

1. First turn the whole array into a ==big root heap==. There are two ways to build it:

  • Top to bottom, inserting elements one by one with heapInsert: O(N * logN).
  • Bottom to top, calling heapify from the last node upward: O(N).

2. Swap the value at the top of the big root heap (the maximum) with the value at the end of the heap, then reduce the heap size by one (discarding the tail, i.e., the maximum that is now in its final place), re-adjust the heap, and repeat; this phase takes O(N * logN). 3. When the heap size has been reduced to 0, the sort is complete.

public class HeapSort {
	// Heap sort implementation
	public static void heapSort(int[] arr) {
		// Return if the array is null or shorter than 2
		if (arr == null || arr.length < 2) {
			return;
		}
		// Step 1: turn the whole array into a big root heap (swapping values inside the array as needed)
//		for (int i = 0; i < arr.length; i++) {	// O(N)
//			heapInsert(arr, i);					// O(logN)
//		}
		/*
		 * Or use the heapify method to make the array a big root heap:
		 * start from the rightmost leaf of the tree and heapify
		 * from right to left, bottom to top. Building is slightly faster this way,
		 * but the total running time of the sort is the same.
		 */
		for (int i = arr.length - 1; i >= 0; i--) {
			heapify(arr, i, arr.length);
		}
		// Step 2
		int heapSize = arr.length;
		// Swap the maximum with the tail of the big root heap and discard the tail
		swap(arr, 0, --heapSize);

		// The steps below keep the remaining tree a big root heap

		// When heapSize == 0, the array is sorted in ascending order
		while (heapSize > 0) {
			/*
			 * Because the head and tail of the tree were swapped, the tree may no longer
			 * be a big root heap. Adjust it back: the while loop inside heapify keeps
			 * swapping index with the larger of its children until it is a big root heap again.
			 */
			heapify(arr, 0, heapSize);	// O(logN)
			// Keep swapping the head and tail of the tree
			swap(arr, 0, --heapSize);	// O(1)
		}
		// Each round the current maximum is placed at the tail, so the array ends up in ascending order
	}

	/*
	 * A number is now at position index; move it up until the tree is a big root heap again
	 */
	public static void heapInsert(int[] arr, int index) {
		/*
		 * Keep swapping up while the current number is greater than its parent.
		 * The loop stops when either:
		 * 1. the number is no longer greater than its parent, or
		 * 2. index has reached the root of the tree (then (index-1)/2 == index == 0)
		 */
		while (arr[index] > arr[(index - 1) / 2]) {
			swap(arr, index, (index - 1) / 2);
			// Move index up to its parent
			index = (index - 1) / 2;
		}
	}

	/*
	 * A number at position index may need to move down;
	 * sink it until the subtree rooted at index is a big root heap again
	 */
	public static void heapify(int[] arr, int index, int heapSize) {
		int left = index * 2 + 1;		// Left child subscript
		while (left < heapSize) {		// There are children below (no left child means no right child either)
			// Pick the larger of the two children
			int largest = left + 1 < heapSize && arr[left + 1] > arr[left] ? left + 1 : left;
			// Compare the parent with the larger child
			largest = arr[largest] > arr[index] ? largest : index;
			// Break if the parent is already at least as large as its children
			if (largest == index) {
				break;
			}
			// Otherwise swap the parent (smaller value) with the child (larger value)
			swap(arr, largest, index);
			// Move index down: index should always be the subscript of the current node
			index = largest;
			// Recompute the left child subscript and check again whether it exists
			left = index * 2 + 1;
		}
	}

	// Swap function
	public static void swap(int[] arr, int num1, int num2) {
		if (num1 == num2) {
			return;
		}
		arr[num1] = arr[num1] ^ arr[num2];
		arr[num2] = arr[num1] ^ arr[num2];
		arr[num1] = arr[num1] ^ arr[num2];
	}

	// Test
	public static void main(String[] args) {
		int[] arr = {1, 6, 2, 8, 3, 4, 5, 1, 7, 9};
		heapSort(arr);
		for (int x : arr) {
			System.out.print(x + "\t");
		}
	}
}

Five: Shell sort

Basic idea:

Shell's sort, also known as diminishing increment sort, is a more efficient improvement of direct insertion sort; it is an unstable sorting algorithm. Shell sort groups the records by a certain increment of their indices and sorts each group with direct insertion sort. As the increment decreases, each group contains more and more elements; when the increment decreases to 1, the whole file forms exactly one group and the algorithm terminates. It exploits a feature of insertion sort: when the interval is large, the number of moves is small; when the interval is small, the distance moved is short. Because elements jump across groups, the sort is unstable.

public class ShellSort {
	public static void shellSort(int[] arrs) {
		// Return if the array is null or shorter than 2
		if (arrs == null || arrs.length < 2) {
			return;
		}
		// Compute the starting interval using the Knuth sequence (1, 4, 13, 40, ...)
		int h = 1;
		while (h <= arrs.length / 3) {
			h = h * 3 + 1;
		}

		for (int gap = h; gap > 0; gap = (gap - 1) / 3) {
			for (int i = gap; i < arrs.length; i++) {
				for (int j = i; j > gap - 1; j -= gap) {
					// Insertion sort among the elements that are gap apart
					if (arrs[j] < arrs[j - gap]) {
						arrs[j] = arrs[j] ^ arrs[j - gap];
						arrs[j - gap] = arrs[j] ^ arrs[j - gap];
						arrs[j] = arrs[j] ^ arrs[j - gap];
					}
				}
			}
		}
	}

	public static void main(String[] args) {
		int[] arrays = {1, 5, 4, 2, 6, 4, 2, 3, 8, 7};
		shellSort(arrays);
		for (int x : arrays) {
			System.out.print(x + "\t");
		}
	}
}

Six: merge sort

Basic idea:

Merge sort is an efficient sorting algorithm built on the merge operation and a typical application of divide and conquer. Ordered subsequences are merged to obtain a fully ordered sequence: first make each subsequence ordered, then merge the ordered segments. Merging two ordered lists into one ordered list is called a two-way merge. In essence, merging combines two or more ordered sequences into a new ordered sequence. The algorithm is a simple recursion: sort the left half, sort the right half, then merge them so the whole range is ordered. Time complexity O(N*logN), extra space complexity O(N).

public class MergeSort {
	public static void sort(int[] arrs, int left, int right) {
		// System.out.println("left:" + left + " right:" + right);
		if (left == right) {
			return;
		}
		int mid = left + ((right - left) >> 1);	// Avoids the int overflow of (left + right) / 2
		sort(arrs, left, mid);				// Recurse on the left half
		sort(arrs, mid + 1, right);			// Recurse on the right half
		mergeSort(arrs, left, mid + 1, right);	// Merge the two halves that were just sorted; no further recursion
	}

	public static void mergeSort(int[] arrs, int leftPtr, int rightPtr, int rightBound) {
		// System.out.println("merge left:" + leftPtr + " mid:" + (rightPtr - 1) + " right:" + rightBound);
		// Return if the array is null or shorter than 2
		if (arrs == null || arrs.length < 2) {
			return;
		}
		// The range being merged consists of two ordered parts
		int mid = rightPtr - 1;	// The middle pointer sits just before the right pointer
		int i = leftPtr;		// Left pointer
		int j = rightPtr;		// Right pointer
		int num = 0;			// Pointer into the auxiliary array, starting at 0
		// Create an auxiliary array of the same length as the merged range
		int[] arrays = new int[rightBound - leftPtr + 1];
		while (i <= mid && j <= rightBound) {
			arrays[num++] = arrs[i] <= arrs[j] ? arrs[i++] : arrs[j++];
		}
		// One side may be exhausted while the other still has elements (copy them directly)
		while (i <= mid) {
			arrays[num++] = arrs[i++];
		}
		while (j <= rightBound) {
			arrays[num++] = arrs[j++];
		}
		// Copy the merged result back into the original array
		for (int x = 0; x < arrays.length; x++) {
			arrs[leftPtr + x] = arrays[x];
		}
	}

	public static void main(String[] args) {
		// For testing mergeSort directly, the left part (4 elements) and the right part (3 elements) must each already be ordered
//		int[] arrs = {1, 3, 5, 6, 4, 6, 7};
		int[] arrs = {5, 2, 4, 6, 8, 1, 5, 6};
		sort(arrs, 0, arrs.length - 1);
		for (int x : arrs) {
			System.out.print(x + "\t");
		}
	}
}

Seven: Quicksort

Basic idea:

Select any record and use its keywords as the baseline. Records whose keywords are smaller than the baseline are moved to the front of the baseline and those whose keywords are larger than the baseline are moved to the back of the baseline.

First, the unordered record sequence is divided once, and then the two sub-sequences are sorted recursively.

V1.0 Time complexity O(N²)

Take the last number num of the whole range as the pivot and arrange the numbers in front of it into [<= num | > num]. Then swap num with the leftmost number of the > num region, giving [<= num | num | > num], and repeat the same behavior recursively on the <= num part and the > num part (a sketch of the partition follows).
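A minimal sketch of this V1.0 partition, assuming arr[R] is the pivot; the class and helper names here are illustrative, and the surrounding recursion mirrors the V3.0 code further below:

// Illustrative sketch of the V1.0 two-way partition (names are not from the original code)
public class QuickSortV1 {
	// Partition arr[L..R] using arr[R] as the pivot: [<= pivot | pivot | > pivot]
	public static int partition(int[] arr, int L, int R) {
		int lessEq = L - 1;				// Right edge of the <= pivot region
		for (int i = L; i < R; i++) {
			if (arr[i] <= arr[R]) {		// arr[R] is the pivot
				swap(arr, ++lessEq, i);
			}
		}
		swap(arr, ++lessEq, R);			// Put the pivot right after the <= region
		return lessEq;					// Recurse on [L, lessEq-1] and [lessEq+1, R]
	}

	public static void quickSortV1(int[] arr, int L, int R) {
		if (L < R) {
			int p = partition(arr, L, R);
			quickSortV1(arr, L, p - 1);
			quickSortV1(arr, p + 1, R);
		}
	}

	public static void swap(int[] arr, int i, int j) {
		int temp = arr[i];
		arr[i] = arr[j];
		arr[j] = temp;
	}

	public static void main(String[] args) {
		int[] arr = {3, 7, 2, 5, 1, 4};
		quickSortV1(arr, 0, arr.length - 1);
		for (int x : arr) {
			System.out.print(x + "\t");	// 1  2  3  4  5  7
		}
	}
}

Recursing on both sides of the returned pivot index gives the V1.0 sort; the worst case is O(N²), reached when the pivot is always an extreme value.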

V2.0: time complexity O(N²); feature: well suited when the data contains many repeated values

Using the Dutch national flag problem: take the last number num of the range as the pivot and partition the numbers in front of it into [< num | == num | > num]. Then swap num with the leftmost number of the > num region, giving [< num | ==num== | > num] with the whole batch of values equal to num in its final place. Settling a batch of equal values each pass makes it slightly faster than version 1.0.

V3.0 changes how the partition value (pivot) is chosen; ideally, the pivot would be the median of the array values.

Time complexity O(N * logN) (expected). The solution: choose the pivot value at random with equal probability, swap it to the tail of the range, then partition the preceding numbers into [< num | == num | > num] exactly as in V2.0 and swap num into the left edge of the > num region, giving [< num | ==num== | > num]. A whole batch of equal values is settled each pass, and because the pivot is chosen uniformly at random, the expected running time reaches the O(N * logN) bound.

Quicksort V3.0 implementation:
public class QuickSort {
	public static void quickSort(int[] arr) {
		// Return if the array is null or shorter than 2
		if (arr == null || arr.length < 2) {
			return;
		}
		quickSort(arr, 0, arr.length - 1);
	}

	// Sort arr[L..R]
	public static void quickSort(int[] arr, int L, int R) {
		if (L < R) {
			// Pick a position with equal probability and swap its number with the last position
			swap(arr, L + (int) (Math.random() * (R - L + 1)), R);
			int[] p = partition(arr, L, R);	// Returns an array of length 2: the boundaries of the == region
			quickSort(arr, L, p[0] - 1);	// < region
			quickSort(arr, p[1] + 1, R);	// > region
		}
	}

	// Swap method
	public static void swap(int[] arr, int num1, int num2) {
		int temp = arr[num1];
		arr[num1] = arr[num2];
		arr[num2] = temp;
	}

	/*
	 * Partition arr[L..R] using arr[R] as the pivot (Dutch national flag style).
	 * Returns the left and right boundaries (inclusive) of the == pivot region.
	 */
	public static int[] partition(int[] arr, int L, int R) {
		int less = L - 1;	// Right edge of the < region
		int more = R;		// Left edge of the > region (arr[R] itself stays out of it for now)
		while (L < more) {	// L is the position of the current number; stop when it reaches the > region
			if (arr[L] < arr[R]) {			// Current number < pivot
				swap(arr, ++less, L++);
			} else if (arr[L] > arr[R]) {	// Current number > pivot
				swap(arr, --more, L);
			} else {
				L++;
			}
		}
		swap(arr, more, R);	// Put the pivot at the left edge of the > region
		return new int[] {less + 1, more};
	}

	public static void main(String[] args) {
		int[] arr = {1, 6, 2, 4, 8, 1, 5, 3, 4, 8};
		quickSort(arr);
		for (int x : arr) {
			System.out.print(x + "\t");
		}
	}
}

Eight: bucket sort

Bucket sort is an upgraded version of counting sort. It relies on a mapping function, and the key to its efficiency lies in choosing that mapping function well. To make bucket sort efficient, we need to do two things:

First, when extra space is plentiful, increase the number of buckets as much as possible and choose a mapping function that spreads the N input values evenly across the K buckets. Second, for sorting the elements inside each bucket, the choice of comparison sorting algorithm is crucial to performance.

In the ideal case the time and space complexity of bucket sort are both O(n), and bucket sort is a stable sorting algorithm. Its performance is not guaranteed, however: if the elements are not evenly distributed (for example, five buckets are created but most elements land in the second one), the running time degrades to O(nlogn) and space is wasted.

import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedList;

public class BucketSort {
	public static double[] bucketSort(double[] array) {
		// Find the maximum and minimum of the sequence and compute the range d
		double max = array[0];
		double min = array[0];
		for (int i = 1; i < array.length; i++) {
			if (array[i] > max) {
				max = array[i];
			}
			if (array[i] < min) {
				min = array[i];
			}
		}
		double d = max - min;

		// Initialize the buckets
		int bucketNum = array.length;
		ArrayList<LinkedList<Double>> bucketList = new ArrayList<LinkedList<Double>>(bucketNum);
		for (int i = 0; i < bucketNum; i++) {
			bucketList.add(new LinkedList<Double>());
		}

		// Traverse the original array and place each element into its bucket
		for (int i = 0; i < array.length; i++) {
			int num = (int) ((array[i] - min) * (bucketNum - 1) / d);
			bucketList.get(num).add(array[i]);
		}

		// Sort the contents of each bucket
		for (int i = 0; i < bucketList.size(); i++) {
			// Collections.sort is backed by merge sort or an optimized merge sort (TimSort)
			Collections.sort(bucketList.get(i));
		}

		// Output all elements
		double[] sortedArray = new double[array.length];
		int index = 0;
		for (LinkedList<Double> list : bucketList) {
			for (double element : list) {
				sortedArray[index] = element;
				index++;
			}
		}
		return sortedArray;
	}

	public static void main(String[] args) {
		// Sort the array
		double a[] = {100, 93, 97, 92, 96, 99, 92, 89, 93, 97, 90, 94, 92, 95};
		double b[] = bucketSort(a);
		for (double i : b) {
			System.out.print(i + " ");
		}
	}
}

Nine: counting sort

Counting sort is a non-comparison sorting algorithm. Its advantage is that when sorting integers within a certain range, its complexity is O(n+k) (where k is the size of the integer range), which is faster than any comparison-based sorting algorithm. Of course, this trades space for time, and when O(k) > O(nlog(n)) it is no longer more efficient than comparison-based sorts, whose theoretical lower bound on time complexity is O(nlog(n)) (for example merge sort and heap sort). By counting, the time complexity comes down to O(n).

Basic idea:

For each element x in a given input sequence, determine the number of elements in the sequence whose value is less than x (not by comparing elements with each other, but by counting occurrences of each value and accumulating the counts). Once you have this information, you can place x directly in its correct position in the final output sequence. For example, if only 17 elements in the input sequence are less than x, then x can be stored directly at the 18th position of the output sequence. Of course, if there are multiple elements with the same value, they cannot all be put at the same position in the output sequence, so the scheme above needs to be adjusted slightly.

Sorting process

Step 1: find the max and min values in the array.

Step 2: create a new count array of length max - min + 1, with all elements defaulting to 0.

Step 3: iterate over the original array, use each element minus min as an index into the count array, and record how many times that element occurs as the value at that index.

Step 4: turn the count array into prefix sums from left to right: count[i+1] = count[i+1] + count[i].

Step 5: Create a result array of the same length as the original array.

Step 6: traverse the original array A from back to front; place each element A[j] at index count[A[j]-min] - 1 of the result array, then decrement count[A[j]-min] by 1. Scanning from the back keeps equal elements in their original relative order, which is what makes the sort stable.

public class CountSort {

	public static int[] countSort(int[] A) {
		// Find the maximum and minimum values in array A
		int max = Integer.MIN_VALUE;
		int min = Integer.MAX_VALUE;
		for (int num : A) {
			max = Math.max(max, num);
			min = Math.min(min, num);
		}
		// Initialize the count array
		// Its length is max minus min plus 1,
		// i.e. we need not only the maximum but also the minimum
		int[] count = new int[max - min + 1];
		// Fill the count array
		for (int num : A) {
			// Subtract the minimum from each element of A to get its index
			count[num - min]++;
		}
		// Create the result array
		int[] result = new int[A.length];
		// Starting index of the result array
		int index = 0;
		// Traverse the count array and expand its counts back into the result array
		for (int i = 0; i < count.length; i++) {
			while (count[i] > 0) {
				// Add back the minimum that was subtracted
				result[index++] = i + min;
				count[i]--;
			}
		}
		// Return the result array
		return result;
	}

	public static void main(String[] args) {
		// Sort the array
		int a[] = {100, 93, 97, 92, 96, 99, 92, 89, 93, 97, 90, 94, 92, 95};
		int b[] = countSort(a);
		for (int i : b) {
			System.out.print(i + " ");
		}
	}
}

Counting sort result

The counting sort algorithm does not compare elements with each other; it uses the actual values of the elements to determine their positions in the output array. Therefore counting sort is not a comparison-based sorting algorithm, and it is a stable sorting algorithm. Although it can reduce the time complexity of sorting to O(N), it has two requirements: first, the elements to be sorted must be integers; second, the values of the elements must lie within a certain range and be relatively concentrated. Only when both conditions are met can the advantages of counting sort be fully realized.
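The CountSort implementation above simply expands the counts back into the result array, which is enough when sorting bare integers. As a complement, here is a minimal sketch of the stable prefix-sum variant described in steps 4–6; the class name and details are illustrative:

public class StableCountSort {
	// Stable counting sort using prefix sums and a right-to-left scan of the input
	public static int[] countSortStable(int[] A) {
		if (A == null || A.length < 2) {
			return A;
		}
		int max = Integer.MIN_VALUE, min = Integer.MAX_VALUE;
		for (int num : A) {
			max = Math.max(max, num);
			min = Math.min(min, num);
		}
		int[] count = new int[max - min + 1];
		for (int num : A) {
			count[num - min]++;
		}
		// Step 4: prefix sums; count[i] now holds the number of elements <= i + min
		for (int i = 1; i < count.length; i++) {
			count[i] += count[i - 1];
		}
		// Steps 5 and 6: scan from right to left and place each element at count[A[j]-min] - 1
		int[] result = new int[A.length];
		for (int j = A.length - 1; j >= 0; j--) {
			result[--count[A[j] - min]] = A[j];
		}
		return result;
	}
}

Because A is traversed from right to left, equal values are written into result from back to front in their original order, which is the property that makes counting sort stable.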

Ten: radix sort

Radix sort is a stable sorting algorithm (see any standard tutorial for more background).

Basic idea:

All values to be compared (positive integers) are padded to the same digit length, with shorter numbers padded with leading zeros. Then, starting from the lowest digit, the values are sorted digit by digit. After sorting from the lowest digit up to the highest digit, the sequence becomes ordered.

public class RadixSort {
	public static void radixSort(int[] arr) {
		if (arr == null || arr.length < 2) {
			return;
		}
		radixSort(arr, 0, arr.length - 1, maxbits(arr));
	}

	public static int maxbits(int[] arr) {
		// Start max at the smallest possible int value
		int max = Integer.MIN_VALUE;
		// Find the maximum value
		for (int i = 0; i < arr.length; i++) {
			max = Math.max(max, arr[i]);
		}
		// Count the number of decimal digits of the maximum value
		int res = 0;
		while (max != 0) {
			res++;
			max /= 10;
		}
		return res;
	}

	// Sort arr[L..R]; digit is the number of decimal digits of the largest value in this range
	public static void radixSort(int[] arr, int L, int R, int digit) {
		final int radix = 10;
		int i = 0, j = 0;
		// Auxiliary space: one slot for each number in the range
		int[] bucket = new int[R - L + 1];
		for (int d = 1; d <= digit; d++) {	// One pass for each digit
			// Ten counting slots
			int[] count = new int[radix];	// count[0...9]
			for (i = L; i <= R; i++) {
				// Extract digit d of the current number
				j = getDigit(arr[i], d);
				count[j]++;
			}
			// Turn count into prefix sums
			for (i = 1; i < radix; i++) {
				count[i] = count[i] + count[i - 1];
			}
			// Traverse the range from right to left to fill the bucket (this keeps the sort stable)
			for (i = R; i >= L; i--) {
				j = getDigit(arr[i], d);
				bucket[count[j] - 1] = arr[i];
				count[j]--;
			}
			// Copy the bucket back into the array to keep this pass's result
			for (i = L, j = 0; i <= R; i++, j++) {
				arr[i] = bucket[j];
			}
		}
	}

	public static int getDigit(int x, int d) {
		return ((x / ((int) Math.pow(10, d - 1))) % 10);
	}

	public static void main(String[] args) {
		int[] arr = {5, 1, 4, 6, 2, 8, 4, 1, 5, 3, 6, 8, 7, 9};
		radixSort(arr);
		for (int x : arr) {
			System.out.print(x + "\t");
		}
	}
}

Radix sort vs counting sort vs bucket sort

These three sorting algorithms all use the concept of buckets, but there are obvious differences in the way buckets are used:

  • Radix sort: buckets are allocated according to each digit of the key value;
  • Counting sort: each bucket stores only a single key value;
  • Bucket sort: each bucket stores a certain range of values;

Conclusion:

Quicksort is generally used (version 3.0), heap sort is used if space is limited, and merge sort is used if stability is required

For comparison-based sorting, no algorithm with time complexity lower than O(N * logN) has been found, which indicates that O(N * logN) is the limit

No comparison-based algorithm has been found that achieves O(N * logN) time complexity with extra space complexity below O(N) while also being stable