Sorting Algorithms
The ten common sorting algorithms can be divided into two broad categories:
- Comparison sorts: determine the relative order of elements by comparing them. Because their time complexity cannot beat the O(n log n) lower bound, they are also called nonlinear-time comparison sorts.
- Non-comparison sorts: determine the relative order of elements without comparing them. They can break through the comparison-based lower bound and run in linear time, so they are also called linear-time non-comparison sorts.
Relevant concepts
- Stable: if a precedes b and a = b, a still precedes b after sorting;
- Unstable: if a precedes b and a = b, a may end up after b after sorting;
- Internal sort: all sorting operations are done in memory;
- External sort: the data is too large to fit in memory, so it is kept on disk and sorting proceeds by transferring data between disk and memory;
- Time complexity: The amount of time an algorithm takes to execute.
- Space complexity: The amount of memory required to run a program.
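A quick sketch of the stability concept (the player list here is our own example): sorting records by one key should not reorder records whose keys compare equal.

```javascript
// A hypothetical list of players, to be sorted by score only.
const players = [
  { name: 'Ann', score: 3 },
  { name: 'Bob', score: 1 },
  { name: 'Cid', score: 3 },
];

// Array.prototype.sort is required to be stable since ES2019, so Ann
// (score 3) stays ahead of Cid (score 3) after sorting by score.
const byScore = [...players].sort((a, b) => a.score - b.score);
console.log(byScore.map(p => p.name)); // Bob first, then Ann before Cid
```

An unstable sort would be free to emit Cid before Ann, even though both orders are "sorted" by score.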
Bubble Sort
Algorithm principle:
Bubble Sort is a simple sorting algorithm. It repeatedly walks through the sequence to be sorted, comparing adjacent elements and swapping them if they are in the wrong order. The pass through the list is repeated until no more swaps are needed, meaning the list is sorted. The algorithm gets its name from the way smaller elements gradually "bubble" to the top of the list through swapping.
Algorithm description:
- Compare adjacent elements; if the first is larger than the second, swap them;
- Do the same for each pair of adjacent elements, starting with the first pair and ending with the last pair, so that the last element should be the largest;
- Repeat for all elements except the last one;
- Repeat steps 1 to 3 until the sorting is complete.
Code implementation:
var bubbleSort = function(list) {
for (let i = 0; i < list.length - 1; i++) {
for (let j = 0; j < list.length - 1 - i; j++) {
if (list[j + 1] < list[j]) {
let temp = list[j];
list[j] = list[j + 1];
list[j + 1] = temp;
}
}
}
return list;
}
Algorithm complexity:
- Time complexity (average): O(n²)
- Time complexity (worst): O(n²)
- Time complexity (best): O(n)
- Space complexity: O(1)
- Sort type: in-place
- Stability: stable
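The O(n) best case above holds only for a bubble sort that stops early when a full pass makes no swaps; the basic implementation shown earlier always runs in O(n²). A sketch of that variant (the name `bubbleSortOptimized` is ours):

```javascript
// Bubble sort with an early-exit flag: if a whole pass makes no swap,
// the array is already sorted and we can stop, giving the O(n) best case.
var bubbleSortOptimized = function (list) {
  for (let i = 0; i < list.length - 1; i++) {
    let swapped = false;
    for (let j = 0; j < list.length - 1 - i; j++) {
      if (list[j + 1] < list[j]) {
        [list[j], list[j + 1]] = [list[j + 1], list[j]]; // swap neighbors
        swapped = true;
      }
    }
    if (!swapped) break; // no swaps in this pass: done
  }
  return list;
};
```

On an already sorted input the first pass makes zero swaps and the function returns after a single scan.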
Quick Sort
Algorithm principle:
The basic idea of quicksort is to split the records to be sorted into two independent parts in one pass of partitioning, such that the keys in one part are all smaller than the keys in the other; the two parts can then be sorted separately, making the whole sequence ordered.
Algorithm description:
Quicksort uses divide and conquer to split a list into two sub-lists. The specific steps are:
- Pick an element from the sequence, called the pivot;
- Reorder the sequence so that all elements smaller than the pivot come before it and all elements larger come after it (equal elements can go to either side). When this partition completes, the pivot is in its final position. This is called the partition operation;
- Recursively sort the sub-sequence of elements smaller than the pivot and the sub-sequence of elements larger than the pivot.
Code implementation:
var swap = function(arr, i, j) {
    var temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp;
}
var quickSort = function(list, left, right) {
    let parIndex;
    left = typeof left !== 'number' ? 0 : left;
    right = typeof right !== 'number' ? list.length - 1 : right;
    if (left < right) {
        parIndex = getParIndex2(list, left, right);
        quickSort(list, left, parIndex - 1);
        quickSort(list, parIndex + 1, right);
    }
    return list;
}
// Partition variant 1: single-scan partition with swaps
var getParIndex = function(list, left, right) {
    let pivot = left,
        index = pivot + 1;
    for (let i = index; i <= right; i++) {
        if (list[i] < list[pivot]) {
            swap(list, i, index);
            index++;
        }
    }
    swap(list, pivot, index - 1);
    return index - 1;
}
// Partition variant 2: hole-filling partition (moves values instead of swapping)
var getParIndex1 = function(list, left, right) {
    let temp = list[left];
    while (left < right) {
        while (left < right && list[right] > temp) right--;
        list[left] = list[right];
        while (left < right && list[left] <= temp) left++;
        list[right] = list[left];
    }
    list[left] = temp;
    return left;
}
// Partition variant 3: two-pointer partition with swaps (used by quickSort above)
var getParIndex2 = function(list, left, right) {
    let pivot = list[left],
        i = left;
    while (left < right) {
        while (left < right && list[right] > pivot) right--;
        while (left < right && list[left] <= pivot) left++;
        if (left < right) {
            swap(list, left, right);
        }
    }
    swap(list, i, right);
    return right;
}
Algorithm complexity:
- Time complexity (average): O(n log n)
- Time complexity (worst): O(n²)
- Time complexity (best): O(n log n)
- Space complexity: O(log n) on average (recursion depth; O(n) in the worst case)
- Sort type: in-place
- Stability: unstable
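As a minimal illustration of the three steps (pick a pivot, partition, recurse), here is a compact out-of-place sketch (`quickSortSimple` is our name). It trades the in-place partitioning above for clarity, using O(n) extra space per level:

```javascript
// Out-of-place quicksort: pick the head as pivot, split the rest into
// smaller/larger halves, sort each recursively, and concatenate.
var quickSortSimple = function (list) {
  if (list.length <= 1) return list;
  const [pivot, ...rest] = list;
  const smaller = rest.filter(x => x < pivot);  // elements before the pivot
  const larger = rest.filter(x => x >= pivot);  // elements after the pivot
  return [...quickSortSimple(smaller), pivot, ...quickSortSimple(larger)];
};
```

This version allocates new arrays at every level, so the in-place variants above are preferred in practice; this one just mirrors the description most directly.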
Insertion Sort
Algorithm principle:
Insertion Sort is a simple and intuitive sorting algorithm. It works by building an ordered sequence: for each unsorted element, it scans the sorted sequence from back to front, finds the appropriate position, and inserts the element there.
Algorithm description:
In general, insertion sort is implemented in place on arrays. The specific steps are:
- Starting with the first element, the element can be considered sorted;
- Fetch the next element and scan back to front in the sorted element sequence;
- If the element (sorted) is larger than the new element, move the element to the next position;
- Repeat step 3 until you find a place where the sorted element is less than or equal to the new element;
- Insert the new element at that position;
- Repeat steps 2 to 5.
Code implementation:
var insertionSort = function(list) {
if (list.length <= 1)
return list;
for (let i = 1; i < list.length; i++) {
let preInd = i - 1,
cur = list[i];
while (preInd >= 0 && cur < list[preInd]) {
list[preInd + 1] = list[preInd];
preInd--;
}
list[preInd + 1] = cur;
}
return list;
}
Algorithm complexity:
Insertion sort is usually implemented in place (the sort needs only O(1) extra space). During the back-to-front scan, sorted elements are repeatedly shifted one position to the right to make room for the new element.
- Time complexity (average): O(n²)
- Time complexity (worst): O(n²)
- Time complexity (best): O(n)
- Space complexity: O(1)
- Sort type: in-place
- Stability: stable
Shell Sort
Algorithm principle:
Invented by Donald Shell in 1959, Shell sort was the first sorting algorithm to break the O(n²) barrier; it is an improved version of simple insertion sort. It differs from insertion sort in that it compares distant elements first. Shell sort is also known as diminishing increment sort.
Algorithm description:
The sequence to be sorted is split into several sub-sequences, each sorted by direct insertion. The specific steps are:
- Choose an increment (gap) sequence t1, t2, …, tk, where ti > tj for i < j and tk = 1;
- Perform k sorting passes, one per increment in the sequence;
- In each pass, the sequence is divided into sub-sequences according to the current increment ti, and each sub-sequence is sorted by direct insertion. Only in the final pass, when the increment is 1, is the whole sequence treated as a single table whose length equals the length of the sequence.
Code implementation:
var shellSort = function(list) {
if (list.length <= 1)
return list;
for (let gap = Math.floor(list.length / 2); gap > 0; gap = Math.floor(gap / 2)) {
for (let i = gap; i < list.length; i++) {
let preInd = i - gap,
cur = list[i];
while (preInd >= 0 && cur < list[preInd]) {
list[preInd + gap] = list[preInd];
preInd -= gap;
}
list[preInd + gap] = cur;
}
}
return list;
}
Algorithm complexity:
The core of Shell sort is the choice of the gap sequence. The gap sequence can either be fixed in advance or defined dynamically. Dynamically defined gap sequences were proposed by Robert Sedgewick, co-author of Algorithms (4th Edition).
- Time complexity (average): O(n^1.3)
- Time complexity (worst): O(n²)
- Time complexity (best): O(n)
- Space complexity: O(1)
- Sort type: in-place
- Stability: unstable
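One well-known gap sequence of the kind mentioned above is Knuth's 3x+1 sequence (1, 4, 13, 40, …). A sketch using it (`shellSortKnuth` is our name):

```javascript
// Shell sort with Knuth's 3x+1 gap sequence instead of halving.
var shellSortKnuth = function (list) {
  let gap = 1;
  while (gap < list.length / 3) gap = gap * 3 + 1; // largest gap below n/3
  for (; gap > 0; gap = Math.floor(gap / 3)) {
    // Gapped insertion sort: each pass is insertion sort with stride `gap`.
    for (let i = gap; i < list.length; i++) {
      const cur = list[i];
      let j = i - gap;
      while (j >= 0 && list[j] > cur) {
        list[j + gap] = list[j]; // shift larger elements one gap right
        j -= gap;
      }
      list[j + gap] = cur;
    }
  }
  return list;
};
```

The structure is identical to the halving version above; only the way the gap shrinks differs.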
Selection Sort
Algorithm principle:
Selection Sort is a simple, intuitive sorting algorithm. It works by first finding the smallest (or largest) element in the unsorted sequence and placing it at the start of the sorted sequence, then repeatedly finding the smallest (or largest) of the remaining unsorted elements and appending it to the end of the sorted sequence, until all elements are sorted.
Algorithm description:
Direct selection sort of n records produces an ordered result after n-1 passes. The specific steps are:
- Initial state: the unordered region is R[1..n] and the ordered region is empty.
- On the i-th pass (i = 1, 2, …, n-1), the ordered region is R[1..i-1] and the unordered region is R[i..n]. This pass selects the record R[k] with the smallest key from the unordered region and swaps it with R[i], the first record of the unordered region, so that R[1..i] becomes the new ordered region (one record longer) and R[i+1..n] the new unordered region (one record shorter).
- After n-1 passes, the array is sorted.
Code implementation:
var selectionSort = function(list) {
let minInd;
for (let i = 0; i < list.length - 1; i++) {
minInd = i;
for (let j = i + 1; j < list.length; j++) {
if (list[j] < list[minInd]) {
minInd = j;
}
}
let temp = list[i];
list[i] = list[minInd];
list[minInd] = temp;
}
return list;
}
Algorithm complexity:
Selection sort is one of the most consistent sorting algorithms: whatever data comes in, it takes O(n²) time, so the smaller the data set, the better. Its one advantage is that it uses no extra memory. It is also arguably the sorting method most people would think of first.
- Time complexity (average): O(n²)
- Time complexity (worst): O(n²)
- Time complexity (best): O(n²)
- Space complexity: O(1)
- Sort type: in-place
- Stability: unstable
Heap Sort
Algorithm principle:
Heapsort is a sorting algorithm built on the heap data structure. A heap is a nearly complete binary tree that satisfies the heap property: each child node's key is always smaller than (or, in a min-heap, larger than) its parent's.
Algorithm description:
- Build the initial key sequence to be sorted (R1, R2, …, Rn) into a max-heap; this is the initial unordered region;
- Swap the heap top R[1] with the last element R[n], producing a new unordered region (R1, R2, …, Rn-1) and a new ordered region (Rn), with R[1..n-1] <= R[n];
- Since the new heap top R[1] may violate the heap property after the swap, re-heapify the current unordered region (R1, R2, …, Rn-1), then swap R[1] with the last element of the unordered region again, producing a new unordered region (R1, R2, …, Rn-2) and a new ordered region (Rn-1, Rn). Repeat until the ordered region holds n-1 elements, at which point the whole sort is complete.
Code implementation:
var heapSort = function(list) {
buildMaxHeap(list);
for (let j = list.length - 1; j > 0; j--) {
[list[0], list[j]] = [list[j], list[0]];
setHeap(list, 0, j);
}
return list;
}
var buildMaxHeap = function(list) {
for (let i = Math.floor(list.length / 2); i >= 0; i--) {
setHeap(list, i, list.length);
}
}
var setHeap = function(list, i, len) {
let left = 2 * i + 1,
right = 2 * i + 2,
max = i;
if (left < len && list[left] > list[max])
max = left;
if (right < len && list[right] > list[max])
max = right;
if (max != i) {
[list[i], list[max]] = [list[max], list[i]];
setHeap(list, max, len);
}
}
Algorithm complexity:
- Time complexity (average): O(n log n)
- Time complexity (worst): O(n log n)
- Time complexity (best): O(n log n)
- Space complexity: O(1)
- Sort type: in-place
- Stability: unstable
Merge Sort
Algorithm principle:
Merge sort is an efficient sorting algorithm based on the merge operation and is a classic application of divide and conquer: make each sub-sequence ordered first, then merge the ordered sub-sequences to obtain a fully ordered sequence. Merging two ordered lists into one ordered list is called a 2-way merge.
Algorithm description:
- Divide the input sequence of length n into two sub-sequences of length n/2;
- Apply merge sort to each of the two sub-sequences;
- Merge the two sorted sub-sequences into the final sorted sequence.
Code implementation:
var mergeSort = function(list) {
    if (list.length <= 1)
        return list;
    let mid = Math.floor(list.length / 2),
        left = list.slice(0, mid),
        right = list.slice(mid);
    return merge(mergeSort(left), mergeSort(right));
}
var merge = function(left, right) {
    let res = [];
    while (left.length && right.length) {
        // <= keeps equal elements in their original order, so the sort is stable
        if (left[0] <= right[0])
            res.push(left.shift());
        else
            res.push(right.shift());
    }
    while (left.length)
        res.push(left.shift());
    while (right.length)
        res.push(right.shift());
    return res;
}
Algorithm complexity:
Merge sort is a stable sorting method. Like selection sort, its performance is independent of the input data, but it performs much better because it always runs in O(n log n) time; the trade-off is the extra memory it requires.
- Time complexity (average): O(n log n)
- Time complexity (worst): O(n log n)
- Time complexity (best): O(n log n)
- Space complexity: O(n)
- Sort type: out-of-place
- Stability: stable
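A side note on the implementation above: `Array#shift` is O(n) per call on JavaScript arrays, so an index-based merge avoids that overhead while producing the same result. A sketch under the same 2-way merge scheme (`mergeSorted` and `mergeSortIdx` are our names):

```javascript
// Merge two sorted arrays using cursors instead of shift().
var mergeSorted = function (a, b) {
  const res = [];
  let i = 0, j = 0;
  while (i < a.length && j < b.length) {
    res.push(a[i] <= b[j] ? a[i++] : b[j++]); // <= keeps the sort stable
  }
  while (i < a.length) res.push(a[i++]); // drain whichever side remains
  while (j < b.length) res.push(b[j++]);
  return res;
};
var mergeSortIdx = function (list) {
  if (list.length <= 1) return list;
  const mid = Math.floor(list.length / 2);
  return mergeSorted(mergeSortIdx(list.slice(0, mid)), mergeSortIdx(list.slice(mid)));
};
```

The asymptotic complexity is unchanged, but each merge step now does O(1) work per element instead of an O(n) array shift.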
Counting Sort
Algorithm principle:
Counting sort is not a comparison-based sorting algorithm. Its core idea is to use the input values as keys (indexes) into an auxiliary counting array. As a linear-time sort, counting sort requires the input to be integers within a known range.
Algorithm description:
- Find the largest and smallest elements in the array to be sorted;
- Count the number of occurrences of each value i in the array and store it in entry i of the counting array C;
- Accumulate the counts (starting from the first entry of C, add each entry to the one before it);
- Fill the target array in reverse: place each element i at position C[i] - 1 of the new array, decrementing C[i] by 1 each time an i is placed.
Code implementation:
var countingSort = function(arr, maxVal) {
    let bucket = new Array(maxVal + 1);
    for (var i = 0; i < arr.length; i++) {
        if (!bucket[arr[i]]) {
            bucket[arr[i]] = 0;
        }
        bucket[arr[i]]++;
    }
    let ind = 0;
    for (var j = 0; j < bucket.length; j++) {
        while (bucket[j] > 0) {
            arr[ind++] = j;
            bucket[j]--;
        }
    }
    return arr;
}
Algorithm complexity:
Counting sort is a stable sorting algorithm. When the input consists of n integers between 0 and k, the time complexity is O(n+k) and the space complexity is also O(n+k); when k = O(n), this is faster than any comparison-based sort. Counting sort is very efficient when k is not too large and the values are relatively concentrated.
- Time complexity (average): O(n+k)
- Time complexity (worst): O(n+k)
- Time complexity (best): O(n+k)
- Space complexity: O(n+k)
- Sort type: out-of-place
- Stability: stable
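The implementation above rebuilds values directly from the counts, which works for bare integers but does not demonstrate the stability just claimed. A sketch of the prefix-sum variant described in the steps (`countingSortStable` is our name), where filling the output from right to left preserves the order of equal keys:

```javascript
// Stable counting sort via prefix sums, as described in the steps above.
var countingSortStable = function (arr) {
  if (arr.length <= 1) return arr;
  const min = Math.min(...arr), max = Math.max(...arr);
  const counts = new Array(max - min + 1).fill(0);
  for (const v of arr) counts[v - min]++;            // count occurrences
  for (let i = 1; i < counts.length; i++) {
    counts[i] += counts[i - 1];                      // prefix sums: final positions
  }
  const out = new Array(arr.length);
  for (let i = arr.length - 1; i >= 0; i--) {        // fill backwards for stability
    out[--counts[arr[i] - min]] = arr[i];
  }
  return out;
};
```

Using the min as an offset also lets this variant handle ranges that do not start at 0, including negative integers.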
Bucket Sort
Algorithm principle:
Bucket sort is an extension of counting sort. It makes use of a mapping function, and its efficiency hinges on choosing that mapping function well. Bucket sort works by distributing the data into a finite number of buckets, assuming the input is evenly distributed, and then sorting each bucket separately (with another sorting algorithm, or by applying bucket sort recursively).
Algorithm description:
- Set a quantitative array as an empty bucket;
- Iterate over the input data and put the data one by one into the corresponding bucket;
- Sort each bucket that is not empty;
- Concatenate the sorted data from the non-empty buckets.
Code implementation:
var bucketSort = function(list, bucketSize) {
if (list.length < 2)
return list;
let maxVal = list[0],
minVal = list[0];
for (let i = 1; i < list.length; i++) {
if (list[i] < minVal)
minVal = list[i];
else if (list[i] > maxVal)
maxVal = list[i];
}
let BUCKET_SIZE = 5;
bucketSize = bucketSize || BUCKET_SIZE;
let bucketLen = Math.floor((maxVal - minVal) / bucketSize) + 1;
let bucketList = new Array(bucketLen);
for (let i = 0; i < bucketList.length; i++) {
bucketList[i] = [];
}
for (let i = 0; i < list.length; i++) {
bucketList[Math.floor((list[i] - minVal) / bucketSize)].push(list[i]);
}
list.length = 0;
for (let i = 0; i < bucketLen; i++) {
insertionSort(bucketList[i]);
for (let j = 0; j < bucketList[i].length; j++) {
list.push(bucketList[i][j]);
}
}
return list;
}
var insertionSort = function(list) {
if (list.length <= 1)
return list;
for (let i = 1; i < list.length; i++) {
let preInd = i - 1,
cur = list[i];
while (preInd >= 0 && cur < list[preInd]) {
list[preInd + 1] = list[preInd];
preInd--;
}
list[preInd + 1] = cur;
}
return list;
}
Algorithm complexity:
In the best case bucket sort runs in linear O(n) time. Its overall time complexity depends on the time needed to sort within the buckets, since all other steps are O(n). The narrower each bucket, the less data it holds and the less time the per-bucket sorts take; the trade-off is increased space consumption.
- Time complexity (average): O(n+k)
- Time complexity (worst): O(n²)
- Time complexity (best): O(n+k)
- Space complexity: O(n+k)
- Sort type: out-of-place
- Stability: stable
Radix Sort
Algorithm principle:
Radix sort first sorts by the lowest digit and collects the elements, then sorts by the next digit and collects again, and so on up to the highest digit. For keys made of prioritized attributes, sort by the lower-priority attribute first, then by the higher-priority one; in the final order, keys are arranged by the high-priority attribute, with ties broken by the low-priority one.
Algorithm description:
- Get the largest number in the array and determine its number of digits;
- Take each digit of the elements of the original array arr, starting from the lowest, to form the radix buckets;
- Sort on each radix by counting (taking advantage of the fact that counting sort suits small ranges of numbers).
Code implementation:
var radixSort = function(list, maxDigit) {
    let mod = 10,
        dev = 1,
        counter = [];
    for (let i = 0; i < maxDigit; i++, mod *= 10, dev *= 10) {
        // Distribute: drop each number into the bucket for its current digit
        for (let j = 0; j < list.length; j++) {
            let bucket = Math.floor((list[j] % mod) / dev);
            if (counter[bucket] == null)
                counter[bucket] = [];
            counter[bucket].push(list[j]);
        }
        // Collect: read the buckets back in order
        let pos = 0;
        for (let j = 0; j < counter.length; j++) {
            if (counter[j] != null) {
                let val;
                while ((val = counter[j].shift()) != null) {
                    list[pos++] = val;
                }
            }
        }
    }
    return list;
}
Algorithm complexity:
Radix sort distributes and collects digit by digit, so it is stable. However, its performance is slightly worse than bucket sort's: each pass takes O(n) to distribute the keys into buckets and O(n) more to collect the new sequence afterwards. If the keys can be split into d digits, radix sort takes O(d·2n) time; since d is usually much smaller than n, it is essentially linear.
The space complexity of radix sort is O(n+k), where k is the number of buckets. Generally n >> k, so about n extra slots of space are needed.
- Time complexity (average): O(n·k)
- Time complexity (worst): O(n·k)
- Time complexity (best): O(n·k)
- Space complexity: O(n+k)
- Sort type: out-of-place
- Stability: stable
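The first step of the description, determining the digit count of the maximum, is left to the caller of radixSort above. A small helper for non-negative integers (`maxDigits` is our name) can fill that gap:

```javascript
// Count the decimal digits of the largest value in the list.
// Assumes non-negative integers, matching radixSort's digit extraction.
var maxDigits = function (list) {
  const max = Math.max(...list);
  return String(max).length;
};
// Usage sketch: radixSort(list, maxDigits(list));
```

With this helper the two can be combined so callers need not know the key width in advance.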