Preface
These ten sorting algorithms are something every programmer should master. I spent a day implementing the code and organizing it, and to make it easier to study I have put everything into one article. Each algorithm comes with a short description of the idea behind it, and to make things easier to understand I also found a GIF demonstration for each one. On top of that, I have attached links to corresponding high-quality articles. If you still don't understand after reading, come at me; if you find it helpful, please give it a like.
Terminology
For those of you who don’t know what stable sorting, in-place sorting, time complexity, and space complexity mean, here is a brief explanation:
1. Stable sort: if a was originally in front of b and a == b, then a is still in front of b after sorting.
2. Unstable sort: if a was originally in front of b and a == b, then a may end up behind b after sorting.
3. In-place sort: the sort does not ask for extra storage space; it compares and swaps the data to be sorted within the storage it already occupies.
4. Out-of-place sort: additional arrays are needed to help with the sorting.
5. Time complexity: how the running time of an algorithm grows with the size of the input.
6. Space complexity: how much extra memory an algorithm needs to run.
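To make stability concrete, here is a minimal sketch (the names and scores are made up) that sorts name/score pairs by score using Java's built-in Arrays.sort, which is a stable sort for object arrays; the two records with equal scores keep their original relative order:

import java.util.Arrays;
import java.util.Comparator;

public class StableDemo {
    public static void main(String[] args) {
        // made-up records: {name, score}; Alice and Carol have the same score
        String[][] people = {{"Alice", "90"}, {"Bob", "85"}, {"Carol", "90"}};

        // Arrays.sort on object arrays is stable, so Alice stays ahead of Carol
        Arrays.sort(people, Comparator.comparingInt((String[] p) -> Integer.parseInt(p[1])));

        System.out.println(Arrays.deepToString(people));
        // prints [[Bob, 85], [Alice, 90], [Carol, 90]]
    }
}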
The order in which the ten sorts are covered
To make things easier to find, here is a simple table of contents (no jump links).
- Selection sort
- Insertion sort
- Bubble sort
  - Non-optimized version
  - Optimized version
- Shell sort
- Merge sort
  - Recursive merge sort
  - Non-recursive merge sort
- Quick sort
- Heap sort
- Counting sort
  - Non-optimized version
  - Optimized version
- Bucket sort
- Radix sort
A few other notes:
**About the code:** I wrote all of the code myself and tested it against several sets of data, so there should be no problems. If you do find a mistake, please let me know. Thank you.
**About the images:** the pictures and animations were found through Baidu search. If any of them infringe your rights, please contact me and I will delete them. Thank you.
1. Selection sort
The process is simple: first, find the smallest element in the array and swap it with the first element (if the first element is already the smallest, it is swapped with itself). Then find the smallest element among the remaining elements and swap it with the second element of the array. Keep repeating this until the entire array is sorted. This method is called selection sort.
I have also prepared a GIF to help you understand:
The code is as follows:
public class SelectSort {
    public static int[] selectSort(int[] a) {
        int n = a.length;
        for (int i = 0; i < n - 1; i++) {
            // find the smallest element among a[i..n-1]
            int min = i;
            for (int j = i + 1; j < n; j++) {
                if (a[min] > a[j]) min = j;
            }
            // swap it into position i
            int temp = a[i];
            a[i] = a[min];
            a[min] = temp;
        }
        return a;
    }
}
Properties:
1. Time complexity: O(n²)
2. Space complexity: O(1)
3. Unstable sort
4. In-place sort
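If you want to run the examples yourself, a minimal driver along these lines works (SortDemo is just a hypothetical name; it assumes the SelectSort class above is on the classpath, and the other sorters in this article can be called the same way):

import java.util.Arrays;

public class SortDemo {
    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 5, 6};
        // selectSort sorts the array in place and returns it
        int[] sorted = SelectSort.selectSort(data);
        System.out.println(Arrays.toString(sorted)); // [1, 2, 5, 5, 6, 9]
    }
}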
2. Insertion sort
Think about how you arrange your cards when playing a card game. A simple way is to pick up one card at a time and insert it into its proper place among the cards that are already in order in your hand. When we sort an unordered array the same way, inserting an element means first making room by shifting the larger elements one position to the right, and then dropping the element into the gap. This algorithm is called insertion sort.
Brief description of the process:
1. Start from the second element of the array and take it out.
2. Compare it with the element to its left. If that element is larger, keep comparing with the next element further to the left, until you find an element that is not larger than it, and insert it right after that element.
3. Continue with the 3rd, 4th, ... elements, repeating step 2, until all n elements have been inserted.
I have also prepared a GIF to help you understand:
The code is as follows:
public class InsertSort {
    public static int[] insertSort(int[] arr) {
        if (arr == null || arr.length < 2) return arr;
        int n = arr.length;
        for (int i = 1; i < n; i++) {
            int temp = arr[i];
            // find the insertion point for temp among arr[0..i-1]
            int k = i - 1;
            while (k >= 0 && arr[k] > temp) k--;
            // shift the larger elements one position to the right,
            // then insert temp at position k + 1
            for (int j = i; j > k + 1; j--) arr[j] = arr[j - 1];
            arr[k + 1] = temp;
        }
        return arr;
    }
}
Properties:
1. Time complexity: O(n²)
2. Space complexity: O(1)
3. Stable sort
4. In-place sort
3. Bubble sort
1. Compare the first element with the second; if the first is larger than the second, swap them. Then compare the second element with the third; if the second is larger than the third, swap them, and so on.
2. Do the same for every pair of adjacent elements, from the first pair at the start to the last pair at the end. After one such pass of comparisons and swaps, the largest element ends up at the far right.
3. Exclude the rightmost element and repeat the same pass on the remaining elements, until the whole array is sorted.
I have also prepared a GIF to help you understand:
The code is as follows:
public class BubbleSort {
    public static int[] bubbleSort(int[] arr) {
        if (arr == null || arr.length < 2) {
            return arr;
        }
        int n = arr.length;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n - i - 1; j++) {
                // swap adjacent elements that are out of order
                if (arr[j + 1] < arr[j]) {
                    int t = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = t;
                }
            }
        }
        return arr;
    }
}
Properties:
1. Time complexity: O(n²)
2. Space complexity: O(1)
3. Stable sort
4. In-place sort
Optimized bubble sort
If a whole pass over the adjacent pairs, from the first pair to the last, finishes without a single swap, then every element on the right is already greater than or equal to the element on its left. That means the array is already sorted, and we do not need to keep comparing the remaining elements.
The code is as follows:
public class BubbleSort {
    public static int[] bubbleSort(int[] arr) {
        if (arr == null || arr.length < 2) {
            return arr;
        }
        int n = arr.length;
        for (int i = 0; i < n; i++) {
            boolean flag = true;
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j + 1] < arr[j]) {
                    // a swap happened, so the array was not yet sorted
                    flag = false;
                    int t = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = t;
                }
            }
            // no swap in this whole pass: the array is already sorted
            if (flag) break;
        }
        return arr;
    }
}
4. Shell sort
Shell sort can be seen as a variant of insertion sort. With insertion sort or bubble sort, if the largest value of the array happens to be in the first position, it takes n - 1 moves to get it to the right place. In other words, when an element is far away from its correct position, it has to be swapped with neighbouring elements many times before it gets there, which is relatively time-consuming.
Shell sort improves on insertion sort to speed this up: it swaps elements that are far apart, so that partial subsequences of the array become sorted first.
The idea of Shell sort is to use insertion sort to make every subsequence of elements that are h apart ordered. At the start h = n / 2, then h = n / 4, and so on, letting h keep shrinking. When h = 1, every pair of elements that are 1 apart is in order, which means the whole array is sorted.
I also have a picture to help you understand:
The code is as follows:
public class ShellSort {
    public static int[] shellSort(int[] arr) {
        if (arr == null || arr.length < 2) return arr;
        int n = arr.length;
        // sort each group whose elements are h apart; h starts at n / 2 and keeps shrinking
        for (int h = n / 2; h > 0; h /= 2) {
            // insertion sort within each group
            for (int i = h; i < n; i++) {
                // insert arr[i] into the correct position of its group
                insertI(arr, h, i);
            }
        }
        return arr;
    }

    /**
     * Inserts arr[i] into the correct position of the group it belongs to,
     * i.e. the elements ... arr[i - 2*h], arr[i - h], arr[i], arr[i + h] ...
     */
    private static void insertI(int[] arr, int h, int i) {
        int temp = arr[i];
        int k;
        for (k = i - h; k >= 0 && temp < arr[k]; k -= h) {
            arr[k + h] = arr[k];
        }
        arr[k + h] = temp;
    }
}
Note that the groups are not sorted one after another; instead, as i goes from h to n - 1, each group is processed a little at a time, in an interleaved way, which has the same effect.
Properties:
1. Time complexity: depends on the gap sequence; with the h = n/2, n/4, ... gaps used here it falls between O(n log n) and O(n²)
2. Space complexity: O(1)
3. Unstable sort
4. In-place sort
5. Merge sort
To sort a large unordered array, we can split it into two halves, sort each half separately, and then merge the two sorted halves into one ordered array. Since both halves are already ordered, merging them is fast.
We split the array recursively until each piece has size 1; an array with a single element is trivially ordered. Then we merge two arrays of size 1 into one ordered array of size 2, two arrays of size 2 into one of size 4, and so on, until all the small arrays have been merged back together.
I have also prepared a GIF to help you understand:
The code is as follows:
public class MergeSort {
    public static int[] mergeSort(int[] arr, int left, int right) {
        // when left == right the subarray has a single element and is already sorted
        if (left < right) {
            int mid = (left + right) / 2;
            // sort the left half
            arr = mergeSort(arr, left, mid);
            // sort the right half
            arr = mergeSort(arr, mid + 1, right);
            // merge the two sorted halves
            merge(arr, left, mid, right);
        }
        return arr;
    }

    // merges the two sorted subarrays arr[left..mid] and arr[mid+1..right]
    private static void merge(int[] arr, int left, int mid, int right) {
        int[] a = new int[right - left + 1]; // temporary array
        int i = left;
        int j = mid + 1;
        int k = 0;
        while (i <= mid && j <= right) {
            if (arr[i] < arr[j]) {
                a[k++] = arr[i++];
            } else {
                a[k++] = arr[j++];
            }
        }
        while (i <= mid) a[k++] = arr[i++];
        while (j <= right) a[k++] = arr[j++];
        // copy the temporary array back into the original array
        for (i = 0; i < k; i++) {
            arr[left++] = a[i];
        }
    }
}
Properties:
1. Time complexity: O(n log n)
2. Space complexity: O(n)
3. Stable sort
4. Out-of-place sort
But what if the interviewer asks you to write a non-recursive merge sort? Don't worry, I have also prepared a non-recursive version:
// non-recursive version; it can live in the MergeSort class above and reuses its merge() helper
public static int[] mergeSort(int[] arr) {
    int n = arr.length;
    // the size of the subarrays being merged is 1, 2, 4, 8, ...
    for (int i = 1; i < n; i += i) {
        int left = 0;
        int mid = left + i - 1;
        int right = mid + i;
        // merge each pair of adjacent subarrays of size i
        while (right < n) {
            merge(arr, left, mid, right);
            left = right + 1;
            mid = left + i - 1;
            right = mid + i;
        }
        // merge the leftover tail whose right boundary goes past the end of the array
        if (left < n && mid < n) {
            merge(arr, left, mid, n - 1);
        }
    }
    return arr;
}
6. Quick sort
We take one element from the array and call it the pivot. We move every element that is smaller than the pivot to its left and every element that is greater than or equal to the pivot to its right. The pivot is then clearly in its final sorted position, so we never need to move it again.
Starting from the pivot, we split the large array into two smaller arrays (neither of which contains the pivot), and then recursively apply the same procedure to the array on the left of the pivot and the array on its right, until the arrays have size 1, at which point every element is in order.
I have also prepared a GIF to help you understand:
The code is as follows:
public class QuickSort {
    public static int[] quickSort(int[] arr, int left, int right) {
        if (left < right) {
            // place the pivot at its final position and get that position back
            int mid = partition(arr, left, right);
            // sort the elements on each side of the pivot
            arr = quickSort(arr, left, mid - 1);
            arr = quickSort(arr, mid + 1, right);
        }
        return arr;
    }

    // uses arr[left] as the pivot, moves smaller elements to its left and
    // larger-or-equal elements to its right, and returns the pivot's index
    private static int partition(int[] arr, int left, int right) {
        int pivot = arr[left];
        int i = left + 1;
        int j = right;
        while (true) {
            // scan from the left for an element larger than the pivot
            while (i <= j && arr[i] <= pivot) i++;
            // scan from the right for an element smaller than the pivot
            while (i <= j && arr[j] >= pivot) j--;
            if (i >= j) break;
            // swap the two out-of-place elements
            int temp = arr[i];
            arr[i] = arr[j];
            arr[j] = temp;
        }
        // put the pivot into its final position
        arr[left] = arr[j];
        arr[j] = pivot;
        return j;
    }
}
Properties:
1. Time complexity: O(n log n) on average, O(n²) in the worst case
2. Space complexity: O(log n)
3. Unstable sort
4. In-place sort
7. Heap sort
The defining property of a heap is that the element at the top is an extreme value: the top of a max-heap is the maximum element, and the top of a min-heap is the minimum element.
Heap sort works like this: build a max-heap, swap the top element with the last element, then restore the heap property among the remaining elements (the swap broke it), and swap the new top with the second-to-last element. This repeats until only one element is left in the heap, at which point the array is sorted.
I have also prepared a GIF to help you understand:
The code is as follows:
public class HeapSort {
    public static int[] heapSort(int[] arr) {
        int n = arr.length;
        // build the max-heap: adjust downward from the last non-leaf node
        for (int i = (n - 2) / 2; i >= 0; i--) {
            downAdjust(arr, i, n - 1);
        }
        // heap sort: repeatedly move the top of the heap to the end
        for (int i = n - 1; i >= 1; i--) {
            // swap the largest element (heap top) with the last element of the heap
            int temp = arr[i];
            arr[i] = arr[0];
            arr[0] = temp;
            // restore the heap property for the remaining i elements
            downAdjust(arr, 0, i - 1);
        }
        return arr;
    }

    // sifts arr[parent] down so that the subtree rooted at parent is a max-heap;
    // n is the index of the last element that still belongs to the heap
    public static void downAdjust(int[] arr, int parent, int n) {
        int temp = arr[parent];
        int child = 2 * parent + 1;
        while (child <= n) {
            // pick the larger of the two children
            if (child + 1 <= n && arr[child] < arr[child + 1]) child++;
            // if the larger child is not bigger than temp, the heap property holds
            if (arr[child] <= temp) break;
            // move the child up and keep sifting down
            arr[parent] = arr[child];
            parent = child;
            child = 2 * parent + 1;
        }
        arr[parent] = temp;
    }
}
Properties:
1. Time complexity: O(n log n)
2. Space complexity: O(1)
3. Unstable sort
4. In-place sort
8. Counting sort
Counting sort is suited to data in which the difference between the maximum and the minimum values is not very large.
The basic idea is to use each element's value as an index into a temporary array and count how many times each value occurs; for example, temp[i] = m means that the value i occurs m times. Finally, the counts in the temporary array are written back out from the smallest value to the largest, and the resulting data is in order.
I have also prepared a GIF to help you understand:
The code is as follows:
public class Counting {
    public static int[] countSort(int[] arr) {
        if (arr == null || arr.length < 2) return arr;
        int n = arr.length;
        int max = arr[0];
        // find the maximum value
        for (int i = 1; i < n; i++) {
            if (max < arr[i]) max = arr[i];
        }
        // create a temporary counting array of size max + 1
        int[] temp = new int[max + 1];
        // count how many times each value occurs
        for (int i = 0; i < n; i++) {
            temp[arr[i]]++;
        }
        int k = 0;
        // write the values back in ascending order
        for (int i = 0; i <= max; i++) {
            for (int j = temp[i]; j > 0; j--) {
                arr[k++] = i;
            }
        }
        return arr;
    }
}
Properties:
1. Time complexity: O(n + k)
2. Space complexity: O(k)
3. Stable sort
4. Out-of-place sort
Note: k is the size of the temporary counting array.
Optimization
In the code above, the temporary array is created based on max. But suppose the original array has only 10 elements, with minimum min = 10000 and maximum max = 10005: we would have to create a temporary array of size 10005 + 1, even though the difference between the maximum and the minimum is only 5, so a temporary array of size 6 would be enough.
That is, we create a temporary array of size (max - min + 1) and use min as an offset. The optimized code looks like this:
public class Counting {
    public static int[] sort(int[] arr) {
        if (arr == null || arr.length < 2) return arr;
        int n = arr.length;
        int min = arr[0];
        int max = arr[0];
        // find the maximum and minimum values
        for (int i = 1; i < n; i++) {
            if (max < arr[i]) max = arr[i];
            if (min > arr[i]) min = arr[i];
        }
        // the counting array only needs to cover the range [min, max]
        int d = max - min + 1;
        int[] temp = new int[d];
        // count occurrences, using min as the offset
        for (int i = 0; i < n; i++) {
            temp[arr[i] - min]++;
        }
        int k = 0;
        // write the values back in ascending order, adding the offset back
        for (int i = 0; i < d; i++) {
            for (int j = temp[i]; j > 0; j--) {
                arr[k++] = i + min;
            }
        }
        return arr;
    }
}
9. Bucket sort
Bucket sort divides the range between the minimum and the maximum value into, say, 10 intervals, with one bucket per interval. Each element is placed into the bucket of the interval it falls in, and the numbers inside each bucket are then sorted with merge sort, quick sort, or any other sorting algorithm.
After that, the data inside each bucket is ordered, and we simply concatenate the buckets in order to get the final result.
I also have a picture to help you understand:
The code is as follows:
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedList;

public class BucketSort {
    public static int[] bucketSort(int[] arr) {
        if (arr == null || arr.length < 2) return arr;
        int n = arr.length;
        int max = arr[0];
        int min = arr[0];
        // find the maximum and minimum values
        for (int i = 1; i < n; i++) {
            if (min > arr[i]) min = arr[i];
            if (max < arr[i]) max = arr[i];
        }
        // d is the range of the data; min is used as the offset
        int d = max - min;
        // each bucket covers a range of 5 consecutive values
        int bucketNum = d / 5 + 1;
        ArrayList<LinkedList<Integer>> bucketList = new ArrayList<>(bucketNum);
        // initialize the buckets
        for (int i = 0; i < bucketNum; i++) {
            bucketList.add(new LinkedList<Integer>());
        }
        // put each element (shifted by the offset) into its bucket
        for (int i = 0; i < n; i++) {
            bucketList.get((arr[i] - min) / 5).add(arr[i] - min);
        }
        // sort the elements inside each bucket
        for (int i = 0; i < bucketNum; i++) {
            Collections.sort(bucketList.get(i));
        }
        // collect the buckets back into the original array, adding the offset back
        int k = 0;
        for (int i = 0; i < bucketNum; i++) {
            for (Integer t : bucketList.get(i)) {
                arr[k++] = t + min;
            }
        }
        return arr;
    }
}
Properties:
1. Time complexity: O(n + k)
2. Space complexity: O(n + k)
3. Stable sort
4. Out-of-place sort
Note: k is the number of buckets.
10. Radix sort
The idea of radix sort is this: first sort the data by the ones digit, then sort it by the tens digit, then by the hundreds digit, and so on.
At the end, the elements are fully ordered. The sorting by each digit is itself done with buckets.
Since a given digit (ones, tens, ...) can only take the values 0 through 9, we need 10 buckets. Elements whose current digit has the same value go into the same bucket, and the elements are then taken back out in order from bucket 0 to bucket 9. That completes the pass for that digit.
I have also prepared a GIF to help you understand:
The code is as follows:
import java.util.ArrayList;
import java.util.LinkedList;

public class RadixSort {
    public static int[] radixSort(int[] arr) {
        if (arr == null || arr.length < 2) return arr;
        int n = arr.length;
        int max = arr[0];
        // find the maximum value
        for (int i = 1; i < n; i++) {
            if (max < arr[i]) max = arr[i];
        }
        // num is the number of digits of the maximum value
        int num = 1;
        while (max / 10 > 0) {
            num++;
            max = max / 10;
        }
        // create 10 buckets, one for each digit value 0-9
        ArrayList<LinkedList<Integer>> bucketList = new ArrayList<>(10);
        for (int i = 0; i < 10; i++) {
            bucketList.add(new LinkedList<Integer>());
        }
        // sort by each digit, from the ones place up to the highest place
        for (int i = 1; i <= num; i++) {
            for (int j = 0; j < n; j++) {
                // the i-th digit (counting from the right) of arr[j]
                int radix = (arr[j] / (int) Math.pow(10, i - 1)) % 10;
                bucketList.get(radix).add(arr[j]);
            }
            // collect the buckets back into the original array, bucket 0 first
            int k = 0;
            for (int j = 0; j < 10; j++) {
                for (Integer t : bucketList.get(j)) {
                    arr[k++] = t;
                }
                bucketList.get(j).clear();
            }
        }
        return arr;
    }
}
Properties:
1. Time complexity: O(kn), where k is the number of digits of the maximum value
2. Space complexity: O(n + k)
3. Stable sort
4. Out-of-place sort
Conclusion
One chart summarizes the properties of the ten sorting algorithms.
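Compiled from the properties listed in the sections above:

| Algorithm | Time complexity | Space complexity | Stable | In-place |
| --- | --- | --- | --- | --- |
| Selection sort | O(n²) | O(1) | No | Yes |
| Insertion sort | O(n²) | O(1) | Yes | Yes |
| Bubble sort | O(n²) | O(1) | Yes | Yes |
| Shell sort | O(n log n) ~ O(n²), depends on the gaps | O(1) | No | Yes |
| Merge sort | O(n log n) | O(n) | Yes | No |
| Quick sort | O(n log n) on average | O(log n) | No | Yes |
| Heap sort | O(n log n) | O(1) | No | Yes |
| Counting sort | O(n + k) | O(k) | Yes | No |
| Bucket sort | O(n + k) | O(n + k) | Yes | No |
| Radix sort | O(kn) | O(n + k) | Yes | No |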
If you are reviewing/learning the top 10 sorting algorithms, make sure you implement them manually without looking at the sample code.