This article runs to over 7,000 words and systematically combs through the sorting algorithms relevant to JS. I hope you like it.
Original text: louiszhai.github.io/2016/12/23/…
Introduction
Sorting algorithms are something of a blind spot for me. When I learned that Chrome's Array.prototype.sort uses quicksort, my heart broke (what is quicksort? I only know bubble sort!). As they say, the best time to learn a skill was three years ago; I hope I still have time to cram (covers face).
This article therefore picks out the dozen or so sorting algorithms most likely to come up in interviews, analyzes the idea behind each one, and notes the pitfalls. Improvements and discussion of the algorithms are welcome.
Bubble sort
Bubble sort requires two nested loops. The outer loop moves a cursor; the inner loop iterates from the cursor over the elements after (or before) it, swapping adjacent pairs so that one more element reaches its correct position by the end of each inner loop. The inner loop then finishes, the outer loop advances the cursor, and the next inner loop begins, and so on until the array is sorted.
Tips: Because bubble sort only swaps adjacent elements, and only when they are out of order, it never changes the relative order of equal elements. It is a stable sorting algorithm.
Since each of the two loops can run in either direction, there are four possible implementations.
Scheme | Outer loop | Inner loop |
---|---|---|
1 | Forward | Forward |
2 | Forward | Reverse |
3 | Reverse | Forward |
4 | Reverse | Reverse |
The four loop-direction combinations lead to slightly different implementations.
Here is the GIF demo (scheme 1: both the outer and inner loops traverse forward).
And here is the implementation shown in the figure above (scheme 1: both the outer and inner loops traverse forward).
```javascript
// First, factor out the element swap
function swap(i, j, array) {
  var temp = array[j];
  array[j] = array[i];
  array[i] = temp;
}
```
```javascript
function bubbleSort(array) {
  var length = array.length, isSwap;
  for (var i = 0; i < length; i++) { // forward
    isSwap = false;
    for (var j = 0; j < length - 1 - i; j++) { // forward
      array[j] > array[j+1] && (isSwap = true) && swap(j, j+1, array);
    }
    if (!isSwap) break;
  }
  return array;
}
```
With this scheme, the last element's final position is determined first.
Scheme 2: the outer loop traverses forward and the inner loop traverses in reverse. The code is as follows:
```javascript
function bubbleSort(array) {
  var length = array.length, isSwap;
  for (var i = 0; i < length; i++) { // forward
    isSwap = false;
    for (var j = length - 1; j >= i + 1; j--) { // reverse
      array[j] < array[j-1] && (isSwap = true) && swap(j, j-1, array);
    }
    if (!isSwap) break;
  }
  return array;
}
```
With this scheme, the first element's final position is determined first.
Scheme 3: the outer loop traverses in reverse and the inner loop traverses forward. The code is as follows:
```javascript
function bubbleSort(array) {
  var length = array.length, isSwap;
  for (var i = length - 1; i >= 0; i--) { // reverse
    isSwap = false;
    for (var j = 0; j < i; j++) { // forward
      array[j] > array[j+1] && (isSwap = true) && swap(j, j+1, array);
    }
    if (!isSwap) break;
  }
  return array;
}
```
With this scheme, since the inner loop traverses forward, the last element's final position is determined first.
Scheme 4: both the outer and inner loops traverse in reverse. The code is as follows:
```javascript
function bubbleSort(array) {
  var length = array.length, isSwap;
  for (var i = length - 1; i >= 0; i--) { // reverse
    isSwap = false;
    // Note: the lower bound is length - i (not length - 1 - i),
    // so that j - 1 never falls below index 0
    for (var j = length - 1; j >= length - i; j--) { // reverse
      array[j] < array[j-1] && (isSwap = true) && swap(j, j-1, array);
    }
    if (!isSwap) break;
  }
  return array;
}
```
With this scheme, since the inner loop traverses in reverse, the first element's final position is determined first.
Here is the algorithmic complexity:
Average time complexity | Best case | Worst case | Space complexity |
---|---|---|---|
O(n²) | O(n) | O(n²) | O(1) |
Bubble sort is the easiest sort to implement. In the worst case, every comparison requires a swap, which amounts to roughly n²/2 comparisons and swaps and a time complexity of O(n²). In the best case, the array is already sorted: the inner loop makes a single pass, finds nothing to swap, and the loop exits, giving O(n). On average, the time complexity is O(n²). Since bubble sort only needs memory for the temp variable used during swaps, its space complexity is a constant O(1).
Two-way bubble sort
Two-way bubble sort, also known as cocktail sort, is a simple upgrade of bubble sort. Where bubble sort moves in a single direction (low to high, or high to low), two-way bubble sort alternates directions (typically low to high first, then high to low). It therefore performs slightly better than bubble sort.
The algorithm implementation is as follows:
```javascript
function bothwayBubbleSort(array) {
  var tail = array.length - 1, i, isSwap = false;
  for (i = 0; i < tail; tail--) {
    for (var j = tail; j > i; j--) { // first pass: bubble the smallest value to the front
      array[j-1] > array[j] && (isSwap = true) && swap(j, j-1, array);
    }
    i++;
    for (j = i; j < tail; j++) { // second pass: bubble the largest value to the back
      array[j] > array[j+1] && (isSwap = true) && swap(j, j+1, array);
    }
  }
  return array;
}
```
Selection sort
Logically, selection sort is a simple and intuitive sorting algorithm. It also uses two nested loops. The inner loop is like a worker: it does the actual work, each time picking out the smallest (or largest) of the unsorted elements and placing it at the front of the array. The outer loop is like a boss: it tells the inner loop to keep working until the job is done (that is, until every element is sorted).
Tips: The elements selection sort swaps are usually not adjacent, so it can break the original order of equal values. For example, when the array [2,2,1,3] is sorted in ascending order, the first 2 is swapped with the 1, so the two 2s end up in a different relative order than before; their values are equal, but their relative order has changed. We call this instability.
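To make the instability concrete, here is a small demo of my own (not from the original article) that tags each value with a letter so the relative order of equal values stays visible:

```javascript
// Selection sort applied to labeled values [2a, 2b, 1c, 3d] (a made-up example).
// In the first pass, 2a is swapped with 1c and lands behind 2b.
var items = [{v: 2, s: 'a'}, {v: 2, s: 'b'}, {v: 1, s: 'c'}, {v: 3, s: 'd'}];
for (var i = 0; i < items.length - 1; i++) {
  var min = i;
  for (var j = i + 1; j < items.length; j++) {
    items[j].v < items[min].v && (min = j); // compare by the v field
  }
  if (min != i) {
    var t = items[i]; items[i] = items[min]; items[min] = t;
  }
}
console.log(items.map(function(x){ return x.v + x.s; }).join(', '));
// "1c, 2b, 2a, 3d": the two 2s have traded places
```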
Here’s what the GIF looks like:
Here is the algorithm implementation of the figure above:
```javascript
function selectSort(array) {
  var length = array.length, min;
  for (var i = 0; i < length - 1; i++) {
    min = i;
    for (var j = i + 1; j < length; j++) {
      array[j] < array[min] && (min = j); // remember the index of the smallest element so far
    }
    min != i && swap(i, min, array);
  }
  return array;
}
```
Here is the algorithmic complexity:
Average time complexity | Best case | Worst case | Space complexity |
---|---|---|---|
O(n²) | O(n²) | O(n²) | O(1) |
Selection sort really is simple and intuitive, and it is notoriously slow for the same reason. No matter what, even if the original array is already sorted, it needs nearly n²/2 comparisons to confirm it. On top of that, its results are unstable. The only good news is that it consumes no extra memory.
Insertion sort
Insertion sort is designed to quickly insert a new element into an already sorted array. The idea: divide the array into two parts, a sorted part and the elements still waiting to be inserted; take the next waiting element and insert it into its correct place within the sorted part, and repeat until nothing is left to insert.
Insertion sort comes in several variants: direct insertion sort, binary insertion sort (also called half insertion sort), linked-list insertion sort, and Shell sort.
Direct insertion sort
The basic idea is to insert each element, in size order, into its proper place within the already sorted part of the array, until all elements have been inserted.
Here’s what the GIF looks like:
Here is the algorithm implementation of the figure above:
```javascript
function directInsertionSort(array) {
  var length = array.length, index, current;
  for (var i = 1; i < length; i++) {
    index = i - 1;      // index of the element to compare against
    current = array[i]; // the current element
    while (index >= 0 && array[index] > current) { // precondition: the compared element is larger than the current one
      array[index + 1] = array[index]; // move the compared element back one slot
      index--;                         // move the cursor forward one slot
      //console.log(array);
    }
    if (index + 1 != i) { // avoid assigning an element to itself
      array[index + 1] = current; // insert the current element into the vacated slot
      //console.log(array);
    }
  }
  return array;
}
```
To get a better look at how direct insertion sort works, uncomment the console.log lines in the code above and take the array [5,4,3,2,1] as an example. The original array evolves as follows.
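For reference, this is the output I get from directInsertionSort([5,4,3,2,1]) with the console.log lines uncommented; the annotations are mine, and there are 14 element moves in total:

```
[5, 5, 3, 2, 1]   i = 1: shift 5 right
[4, 5, 3, 2, 1]   i = 1: insert 4
[4, 5, 5, 2, 1]   i = 2: shift 5 right
[4, 4, 5, 2, 1]   i = 2: shift 4 right
[3, 4, 5, 2, 1]   i = 2: insert 3
[3, 4, 5, 5, 1]   i = 3: shift 5 right
[3, 4, 4, 5, 1]   i = 3: shift 4 right
[3, 3, 4, 5, 1]   i = 3: shift 3 right
[2, 3, 4, 5, 1]   i = 3: insert 2
[2, 3, 4, 5, 5]   i = 4: shift 5 right
[2, 3, 4, 4, 5]   i = 4: shift 4 right
[2, 3, 3, 4, 5]   i = 4: shift 3 right
[2, 2, 3, 4, 5]   i = 4: shift 2 right
[1, 2, 3, 4, 5]   i = 4: insert 1
```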
As you can see, working from front to back, each element is shifted past any larger predecessors and inserted at a reasonable position.
Tips: Direct insertion sort moves elements one slot at a time and never moves an element past an equal value, so it does not change the relative order of equal elements. It is a stable sort.
Binary insertion sort
Binary insertion sort (also called half insertion sort) is an upgrade of direct insertion sort. Since the first part of insertion sort is an already sorted array, we do not have to search for the insertion point sequentially; instead we repeatedly compare the element to be inserted with the middle element of the search range, halving the range each time.
Tips: Like direct insertion sort, binary insertion sort only shifts elements and inserts equal values behind their peers (note the >= in the code below), so it does not change the relative order of equal elements. It is stable.
The basic idea of the algorithm is:
- Take the midpoint m of the sorted region 0 ~ i-1 (m = (i-1)>>1) and compare array[i] with array[m]. If array[i] < array[m], the insertion point lies in the lower half (indexes 0 ~ m); otherwise it lies in the upper half (indexes m ~ i-1).
- Repeat step 1, halving the search range each time, until the insertion position is found.
- Move every element after the insertion position back one slot.
- Insert the i-th element into the freed position.

Note: x>>1 is equivalent to Math.floor(x/2)
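A quick check of that equivalence:

```javascript
// Right-shifting a non-negative integer by one bit halves it, rounding down:
console.log(7 >> 1);            // 3
console.log(Math.floor(7 / 2)); // 3
console.log(8 >> 1);            // 4
```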
The algorithm implementation is as follows:
```javascript
function binaryInsertionSort(array) {
  var current, i, j, low, high, m;
  for (i = 1; i < array.length; i++) {
    low = 0;
    high = i - 1;
    current = array[i];
    while (low <= high) { // steps 1 & 2: binary search for the insertion point
      m = (low + high) >> 1;
      if (array[i] >= array[m]) { // on equal values, move to the upper half to keep the sort stable
        low = m + 1;  // the insertion point is in the upper half
      } else {
        high = m - 1; // the insertion point is in the lower half
      }
    }
    for (j = i; j > low; j--) { // step 3: move every element after the insertion position back one slot
      array[j] = array[j-1];
    }
    array[low] = current; // step 4: insert the element
  }
  return array;
}
```
For comparison, take the same array [5,4,3,2,1] as an example. The original array evolves as follows (the same as above):
Although binary insertion sort significantly reduces the number of comparisons, the number of element moves is unchanged, so it still runs in O(n²) time.
Shell sort
Shell sort, also known as diminishing increment sort, is another upgrade of direct insertion sort; it is essentially grouped insertion sort. It is named after its designer, Donald Shell, who published it in 1959.
The basic idea of the algorithm:
- Split the array into several subgroups, each consisting of elements a fixed "increment" apart. For example, splitting the array [0,1,2,3,4,5,6,7,8,9,10] with increment 5 yields five subgroups: [0,5,10], [1,6], [2,7], [3,8] and [4,9].
- Apply direct insertion sort to each subgroup.
- Reduce the increment and repeat steps 1 and 2.
- When the increment reaches 1, the final pass is simply a direct insertion sort of the whole array.
Here is a schematic of the sorting:
As you can see, Shell sort is really a series of direct insertion sorts, and the grouping exists to get local elements in order first, because direct insertion sort is very efficient when elements are mostly ordered. In other words, Shell sort makes direct insertion sort efficient by grouping first and sorting later.
Abstracting the common part, the final step of Shell sort is a direct insertion sort, and Shell sort as a whole is a loop that applies direct insertion sort while the "increment" shrinks from n toward 1; direct insertion sort is just Shell sort with a step of 1. So let's first generalize direct insertion sort to take a step size.
```javascript
// directInsertionSort, generalized with a step size (gap) parameter
function directInsertionSort(array, gap) {
  gap = (gap == undefined) ? 1 : gap; // by default, start iterating from the element at index 1
  var length = array.length, index, current;
  for (var i = gap; i < length; i++) {
    index = i - gap;    // index of the element to compare against
    current = array[i]; // the current element
    while (index >= 0 && array[index] > current) { // precondition: the compared element is larger than the current one
      array[index + gap] = array[index]; // move the compared element back by gap
      index -= gap;                      // move the cursor forward by gap
    }
    if (index + gap != i) { // avoid assigning an element to itself
      array[index + gap] = current; // insert the current element into the vacated slot
    }
  }
  return array;
}
```
Shell sort itself is then implemented as follows:
```javascript
function shellSort(array) {
  var length = array.length, gap = length >> 1;
  while (gap > 0) {
    directInsertionSort(array, gap); // direct insertion sort with the given step
    gap = gap >> 1;
  }
  return array;
}
```
Similarly, take the array [5,4,3,2,1] as an example. The original array evolves as follows:
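For reference, this is the trace I get from shellSort([5,4,3,2,1]); the annotations are mine, and there are 7 element moves in total (counting shifts and inserts):

```
gap = 2:
[3, 4, 5, 2, 1]   3 inserted (1 shift + 1 insert)
[3, 2, 5, 4, 1]   2 inserted (1 shift + 1 insert)
[1, 2, 3, 4, 5]   1 inserted (2 shifts + 1 insert)
gap = 1: already sorted, no moves
```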
Comparing this with direct insertion sort and binary insertion sort, the number of element moves drops from 14 to 7. By splitting the original array into finer-grained subarrays, Shell sort further improves sorting efficiency.
Moreover, the step sequence used above, {N/2, (N/2)/2, ..., 1}, is not the only choice. Hibbard proposed {1, 3, ..., 2^k-1}, and choosing the step sequence well can improve efficiency further. In fact, the best-known step sequence was proposed by Sedgewick (1, 5, 19, 41, 109, ...); its terms have the form 9*4^i - 9*2^i + 1 or 4^i - 3*2^i + 1. For details, see Shell sort on Wikipedia.
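To show how a precomputed gap sequence plugs into the code above, here is a sketch of Shell sort driven by Sedgewick's sequence (my own example, not from the original article; the function name is made up, and it reuses the directInsertionSort(array, gap) defined above):

```javascript
// A sketch: Shell sort with Sedgewick's gap sequence 1, 5, 19, 41, 109, ...
// generated from the two formulas quoted above.
function sedgewickShellSort(array) {
  // Collect all gaps smaller than the array length
  var gaps = [];
  for (var i = 0; ; i++) {
    var g1 = 9 * Math.pow(4, i) - 9 * Math.pow(2, i) + 1;     // 1, 19, 109, ...
    var g2 = Math.pow(4, i + 2) - 3 * Math.pow(2, i + 2) + 1; // 5, 41, 209, ...
    if (g1 < array.length) gaps.push(g1);
    if (g2 < array.length) gaps.push(g2);
    if (g1 >= array.length && g2 >= array.length) break;
  }
  gaps.sort(function(a, b){ return b - a; }); // largest gap first
  for (var k = 0; k < gaps.length; k++) {
    directInsertionSort(array, gaps[k]); // one insertion-sort pass per gap
  }
  return array;
}
```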
Tips: A single direct insertion sort is stable and does not change the relative order of equal elements. During the multiple passes of Shell sort, however, equal elements may move within their own groups, which can change their relative order. Shell sort is therefore not stable.
Merge sort
Merge sort is built on the merge operation and takes a divide-and-conquer approach. It splits the array into two subarrays, sorts each, and then merges the two. Each subarray is in turn split recursively into smaller subarrays and sorted, until a subarray of length 1 is reached, which is returned directly.
Merge sort merges subarrays strictly from left to right (or right to left) and does not change the relative order of equal elements, so it is also a stable sorting algorithm.
Here’s what the GIF looks like:
Merge sort can be implemented in two ways:
- Recursion from top to bottom
- Bottom-up iteration
Here is the algorithm implementation (method 1: recursion):
```javascript
function mergeSort(array) { // top-down recursion
  var length = array.length;
  if (length < 2) {
    return array;
  }
  var m = length >> 1,
      left = array.slice(0, m),
      right = array.slice(m); // split into two subarrays
  return merge(mergeSort(left), mergeSort(right)); // recursively split the subarrays, then merge
}
function merge(left, right) { // merge two subarrays
  var result = [];
  while (left.length && right.length) {
    // Note: the condition must be "less than or equal"; with plain "less than" the sort would be unstable
    var item = left[0] <= right[0] ? left.shift() : right.shift();
    result.push(item);
  }
  return result.concat(left.length ? left : right);
}
```
From the above, an array of length n ends up calling the mergeSort function 2n-1 times. Implemented with top-down recursion, merge sort therefore risks stack overflow. In my own tests, the recursion depth at which each browser's call stack overflows is roughly:
- Chrome v55: 15670
- Firefox v50: 44488
- Safari v9.1.2: 50755
Here is the test code:
```javascript
function computeMaxCallStackSize() {
  try {
    return 1 + computeMaxCallStackSize();
  } catch (e) {
    // The call stack has overflowed
    return 1;
  }
}
var time = computeMaxCallStackSize();
console.log(time);
```
For this reason, the ES6 specification introduced tail-call optimization: if the last step of a function is itself a function call, the caller's stack frame is released and execution proceeds directly to the next call, so only the final call's record remains on the call stack.
Despite the appeal of the ES6 specification, proper tail calls are not yet supported in browsers; hopefully the major browsers will ship support before long.
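Until then, approach 2 from the list above, bottom-up iteration, sidesteps the recursion depth problem entirely. Here is a minimal sketch of it (my own, not from the original article; it reuses the merge() defined above):

```javascript
// Bottom-up iterative merge sort: merge runs of width 1, 2, 4, ...
// No recursion is involved, so there is no stack-overflow risk.
function mergeSortIterative(array) {
  var length = array.length;
  for (var width = 1; width < length; width *= 2) {
    var merged = [];
    for (var i = 0; i < length; i += 2 * width) {
      var left = array.slice(i, i + width),
          right = array.slice(i + width, i + 2 * width);
      merged = merged.concat(merge(left, right)); // merge each adjacent pair of runs
    }
    array = merged; // after each round, runs of size 2 * width are sorted
  }
  return array;
}
```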
Here is the algorithmic complexity:
Average time complexity | Best case | Worst case | Space complexity |
---|---|---|---|
O(nlogā‚‚n) | O(nlogā‚‚n) | O(nlogā‚‚n) | O(n) |
In terms of efficiency, merge sort is arguably the "best" of the sorting algorithms. Suppose the array length is n: splitting it takes logn levels, and each level performs ordinary subarray merges with a total cost of O(n), so the combined time complexity is O(nlogn). On the other hand, the subarrays produced during the recursive splits need O(n) of additional memory.
Quick sort
Quicksort borrows the idea of divide and conquer and improves on bubble sort. It was introduced by C. A. R. Hoare in 1962. It partitions the array into two subarrays such that every element of one is smaller than every element of the other, then applies the same operation to each subarray, until the array can be split no further and is fully sorted.
Here’s what the GIF looks like:
The algorithm implementation is as follows:
```javascript
function quickSort(array, left, right) {
  var partitionIndex,
      left = typeof left == 'number' ? left : 0,
      right = typeof right == 'number' ? right : array.length - 1;
  if (left < right) {
    partitionIndex = partition(array, left, right); // the pivot's final position
    quickSort(array, left, partitionIndex - 1);
    quickSort(array, partitionIndex + 1, right);
  }
  return array;
}
function partition(array, left, right) { // partition operation
  for (var i = left + 1, j = left; i <= right; i++) { // j is the cursor marking where the smaller values end
    array[i] < array[left] && swap(i, ++j, array); // the first element serves as the pivot
  }
  swap(left, j, array); // move the pivot into the middle
  return j;
}
```
Here is the algorithmic complexity:
Average time complexity | Best case | Worst case | Space complexity |
---|---|---|---|
O(nlogā‚‚n) | O(nlogā‚‚n) | O(n²) | O(nlogā‚‚n) |
Quicksort is very efficient. Although its worst-case running time is O(n²), on average it is O(nlogn), and in practice it is faster than merge sort, which is also O(nlogn). Quicksort seems to favor disordered sequences: the more disordered the input, the more efficient it is relative to other sorts. To maximize efficiency, Chrome's V8 engine uses quicksort when sorting more than 10 elements and insertion sort for 10 elements or fewer.
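To illustrate that hybrid strategy, here is a rough sketch of the idea (my own, not V8's actual code; the function name is made up, and it reuses the partition() defined above): ranges of 10 elements or fewer fall back to insertion sort.

```javascript
// A sketch of the quicksort + insertion-sort hybrid described above.
function hybridQuickSort(array, left, right) {
  left = typeof left == 'number' ? left : 0;
  right = typeof right == 'number' ? right : array.length - 1;
  if (right - left + 1 <= 10) {
    // Small range: direct insertion sort on [left, right]
    for (var i = left + 1; i <= right; i++) {
      var current = array[i], j = i - 1;
      while (j >= left && array[j] > current) {
        array[j + 1] = array[j];
        j--;
      }
      array[j + 1] = current;
    }
    return array;
  }
  var partitionIndex = partition(array, left, right);
  hybridQuickSort(array, left, partitionIndex - 1);
  hybridQuickSort(array, partitionIndex + 1, right);
  return array;
}
```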
Tips: Like selection sort, quicksort swaps elements that are not adjacent, so it can break the original order of equal values. Quicksort is therefore not stable.
Heap sort
Robert W. Floyd, a professor of computer science at Stanford University and winner of the 1991 Computer Pioneer Award, developed the famous heap sort algorithm together with J. Williams in 1964.
Heap sort is a sorting algorithm designed around the heap data structure; it is a kind of selection sort. Heaps come in two flavors: big root (max) heaps and small root (min) heaps. A max heap requires every child's value to be no greater than its parent's, i.e. array[childIndex] <= array[parentIndex], so the maximum is always at the top of the heap. A min heap is the opposite: every child's value is no less than its parent's, and the minimum is always at the top. Accordingly, we use a max heap for ascending sorts and a min heap for descending sorts.
Not every sequence is a heap. A sequence k1, k2, ..., kn must satisfy the following condition:
- k(i) <= k(2i) and k(i) <= k(2i+1) for 1 <= i <= n/2, which gives a min heap; replacing <= with >= gives a max heap. The heap can be viewed as a complete binary tree in which k(i) is a non-leaf node, k(2i) its left child, and k(2i+1) its right child.
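Note that this condition uses 1-based indexes. For a heap stored in a 0-based JavaScript array, the same parent/child relations become the following (a small sketch of my own, which is why the code below uses 2*i+1 and 2*i+2):

```javascript
// Index relations for a heap stored in a 0-based array:
function leftChild(i)  { return 2 * i + 1; }
function rightChild(i) { return 2 * i + 2; }
function parent(i)     { return (i - 1) >> 1; }
```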
The basic idea of the algorithm (taking large root heap as an example):
- Build the initial sequence K[1..n] into a max heap; this is the initial unordered region.
- Swap the top record K[1] with the last record K[n] of the unordered region, producing a new unordered region K[1..n-1] and an ordered region K[n], with K[1..n-1].keys <= K[n].key.
- Since the swap may leave the new top violating the heap property, adjust K[1..n-1] back into a heap. Then repeat step 2 until only one element remains in the unordered region.
Here’s what the GIF looks like:
The algorithm implementation is as follows:
```javascript
function heapAdjust(array, i, length) { // adjust the heap (sift down)
  var left = 2 * i + 1,   // left child
      right = 2 * i + 2,  // right child
      largest = i;
  if (left < length && array[largest] < array[left]) {
    largest = left;
  }
  if (right < length && array[largest] < array[right]) {
    largest = right;
  }
  if (largest != i) {
    swap(i, largest, array);
    heapAdjust(array, largest, length);
  }
}
function heapSort(array) {
  // Build the max heap
  var length = array.length;
  for (var i = length >> 1; i >= 0; i--) {
    heapAdjust(array, i, length);
  }
  // Swap the first and last elements, then readjust the max heap
  for (var i = length - 1; i > 0; i--) {
    swap(0, i, array);
    heapAdjust(array, 0, --length);
  }
  return array;
}
```
Above: (1) building the heap processes indexes from length/2 down to 0 and takes O(n) time; (2) adjusting the heap walks down from a parent to its children, running at most as many times as the heap is deep, which takes O(logn) time; (3) heap sort repeats step (2) n-1 times, so the overall time complexity is O(nlogn).
Tips: Heap sort is less suitable for small sequences, because initializing the heap requires many comparisons. Also, because elements at arbitrary indexes are swapped with each other many times, the original relative order of equal elements is destroyed, so it is an unstable sort.
Counting sort
Counting sort is almost the only sorting algorithm not based on comparisons. It was proposed by Harold H. Seward in 1954. It sorts integers within a known range in O(n+k) time, where k is the size of that range, and it beats virtually every comparison-based algorithm (except when O(k) > O(nlogn), in which case it is less efficient than comparison sorts such as merge sort and heap sort).
To use counting sort, the following conditions must be met:
- The sequences to be sorted are all integers
- Sorting requires additional storage space
The basic idea of the algorithm:
Counting sort exploits a simple fact: once you know how many elements of an array (say m) are smaller than a given element, you know that element's correct position, namely position m+1.
- Find the maximum and minimum values of the array A to be sorted.
- Create a counting array B whose length is (maximum - minimum + 1), using it to record how many times each value occurs.
- Accumulate the counts in B so that each entry holds the starting index of its value.
- Walk through A, writing each value into a new array C at its recorded index, and return C.
Note: if A is an array of objects to be sorted by some property, the algorithm first maps A to a simple array of that property's values, simpleA, and then counts and accumulates based on simpleA, as above.
Here’s what the GIF looks like:
The algorithm implementation is as follows:
```javascript
function countSort(array, keyName) {
  var length = array.length,
      output = new Array(length),
      max,
      min,
      simpleArray = keyName ? array.map(function(v) {
        return v[keyName];
      }) : array; // if keyName is given, build a simple array of key values
  // Find the maximum and minimum values
  max = min = simpleArray[0];
  simpleArray.forEach(function(v) {
    v > max && (max = v);
    v < min && (min = v);
  });
  // Length of the counting array
  var k = max - min + 1;
  // Create and fill the counting array
  var countArray = new Array(k);
  simpleArray.forEach(function(v) {
    countArray[v - min] = (countArray[v - min] || 0) + 1;
  });
  // Accumulate the counts, storing the starting index of each distinct value
  countArray.reduce(function(prev, current, i, arr) {
    arr[i] = prev;
    return prev + current;
  }, 0);
  // Place each value at its recorded index (this requires traversing the original array)
  simpleArray.forEach(function(v, i) {
    var j = countArray[v - min]++;
    output[j] = array[i];
  });
  return output;
}
```
The implementation above supports not only numeric sequences but also sorting object arrays by the value of a property. A test:
```javascript
var a = [2, 1, 1, 3, 2, 1, 4, 2],
    b = [
      {id: 2, s: 'a'},
      {id: 1, s: 'b'},
      {id: 1, s: 'c'},
      {id: 3, s: 'd'},
      {id: 2, s: 'e'},
      {id: 1, s: 'f'},
      {id: 4, s: 'g'},
      {id: 2, s: 'h'}
    ];
countSort(a); // [1, 1, 1, 2, 2, 2, 3, 4]
countSort(b, 'id'); // [{id:1,s:'b'},{id:1,s:'c'},{id:1,s:'f'},{id:2,s:'a'},{id:2,s:'e'},{id:2,s:'h'},{id:3,s:'d'},{id:4,s:'g'}]
```
Tips: Counting sort does not change the original relative order of the same elements, so it is a stable sorting algorithm.
Bucket sort
Bucket sort, also called bin sort, distributes the elements of an array into a finite number of buckets, then sorts each bucket individually (possibly with a different sorting algorithm, or by recursively bucket-sorting). When the elements spread evenly across the buckets, bucket sort takes only O(n) time. It gains its efficiency by trading space for time, and therefore needs extra storage (the space for the buckets).
The basic idea of the algorithm:
The core of bucket sort is distributing elements evenly across the buckets; a good distribution greatly improves the efficiency of the sort.
The algorithm implementation is as follows:
```javascript
function bucketSort(array, bucketSize) {
  if (array.length === 0) {
    return array;
  }
  var i,
      min = array[0],
      max = min;
  for (i = 1; i < array.length; i++) {
    if (array[i] < min) {
      min = array[i]; // minimum of the input data
    } else if (array[i] > max) {
      max = array[i]; // maximum of the input data
    }
  }
  // Initialize the buckets
  bucketSize = bucketSize || 5;
  var bucketCount = ~~((max - min) / bucketSize) + 1, // number of buckets
      buckets = new Array(bucketCount);
  for (i = 0; i < buckets.length; i++) {
    buckets[i] = []; // initialize each bucket
  }
  // Distribute the data into buckets by value; this is most efficient
  // when the values are evenly spread over their range
  for (i = 0; i < array.length; i++) {
    buckets[~~((array[i] - min) / bucketSize)].push(array[i]);
  }
  array.length = 0;
  for (i = 0; i < buckets.length; i++) {
    quickSort(buckets[i]); // sort each bucket, here with quicksort
    for (var j = 0; j < buckets[i].length; j++) {
      array.push(buckets[i][j]); // write the sorted data back into the array
    }
  }
  return array;
}
```
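A quick usage check of my own (it assumes the quickSort and swap functions from the earlier sections are in scope):

```javascript
console.log(bucketSort([29, 25, 3, 49, 9, 37, 21, 43]));
// [3, 9, 21, 25, 29, 37, 43, 49] (the default bucketSize of 5 yields 10 buckets here)
```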
Tips: The bucket-distribution phase of bucket sort is stable, so the stability of bucket sort as a whole matches the stability of the algorithm used inside the buckets (with quicksort, as above, it is not stable).
In fact, the bucket is just an abstract concept, and the idea resembles merge sort, quicksort, and so on: split a large data set across N containers, sort each separately, and then merge the results. This greatly reduces the total number of traversals and improves the algorithm's efficiency.
Radix sort
Radix sort dates back to old punched-card machines, which could read only one column at a time. It sorts by the character at each position of the key; for numbers, that means the ones digit, the tens digit, the hundreds digit, and so on.
Depending on whether high-order or low-order digits take priority, there are two implementations:
- MSD (most significant digit first): sort and group by k1 first, so records in the same group share the same k1; then sort each group into subgroups by k2, and continue through the remaining keys until the groups have been sorted by the least significant key kd. Concatenating the groups yields an ordered sequence. MSD suits sequences with many digits.
- LSD (least significant digit first): start sorting by kd, then by kd-1, and repeat until sorting by k1 yields an ordered sequence. LSD suits sequences with few digits.
Here is the LSD GIF:
The algorithm implementation is as follows:
```javascript
function radixSort(array, max) { // max: the number of digits of the largest key
  var buckets = [],
      unit = 10,
      base = 1;
  for (var i = 0; i < max; i++, base *= 10, unit *= 10) {
    for (var j = 0; j < array.length; j++) {
      var index = ~~((array[j] % unit) / base); // extract the ones digit, then the tens digit, and so on
      if (buckets[index] == null) {
        buckets[index] = []; // initialize the bucket
      }
      buckets[index].push(array[j]); // add the data to its bucket
    }
    var pos = 0,
        value;
    for (var j = 0, length = buckets.length; j < length; j++) {
      if (buckets[j] != null) {
        while ((value = buckets[j].shift()) != null) {
          // Take the data back out of the buckets, ready for the next,
          // higher-digit round; elements that entered a bucket first leave
          // first, which keeps the sort stable
          array[pos++] = value;
        }
      }
    }
  }
  return array;
}
```
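A quick usage check of my own; the second argument is the maximum number of digits:

```javascript
console.log(radixSort([170, 45, 75, 90, 2, 802, 24, 66], 3));
// [2, 24, 45, 66, 75, 90, 170, 802]
```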
With the algorithm above, comparing dates, for example, takes only three passes: sort by day first, then by month, and finally by year.
Radix sort is well suited to data whose overall weight is unknown in advance, such as dates and strings.
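For instance, the date comparison just described can be written as three stable passes using the countSort from the previous section (a sketch with made-up data):

```javascript
var dates = [
  {y: 2016, m: 12, d: 23},
  {y: 2015, m: 1,  d: 7},
  {y: 2016, m: 3,  d: 23},
  {y: 2015, m: 12, d: 8}
];
// Because countSort is stable, each later pass preserves the order
// established by the earlier ones: day first, then month, then year.
dates = countSort(dates, 'd');
dates = countSort(dates, 'm');
dates = countSort(dates, 'y');
// Result: 2015-1-7, 2015-12-8, 2016-3-23, 2016-12-23
```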
Tips: Radix sort does not change the relative order between the same elements, so it is a stable sorting algorithm.
Summary
The comparison of sorting performance is as follows:
Sort type | Average | Best case | Worst case | Auxiliary space | Stability |
---|---|---|---|---|---|
Bubble sort | O(n²) | O(n) | O(n²) | O(1) | stable |
Selection sort | O(n²) | O(n²) | O(n²) | O(1) | unstable |
Direct insertion sort | O(n²) | O(n) | O(n²) | O(1) | stable |
Binary insertion sort | O(n²) | O(n) | O(n²) | O(1) | stable |
Shell sort | O(n^1.3) | O(nlogn) | O(n²) | O(1) | unstable |
Merge sort | O(nlogā‚‚n) | O(nlogā‚‚n) | O(nlogā‚‚n) | O(n) | stable |
Quick sort | O(nlogā‚‚n) | O(nlogā‚‚n) | O(n²) | O(nlogā‚‚n) | unstable |
Heap sort | O(nlogā‚‚n) | O(nlogā‚‚n) | O(nlogā‚‚n) | O(1) | unstable |
Counting sort | O(n+k) | O(n+k) | O(n+k) | O(k) | stable |
Bucket sort | O(n+k) | O(n+k) | O(n²) | O(n+k) | (un)stable |
Radix sort | O(d(n+k)) | O(d(n+k)) | O(d(n+kd)) | O(n+kd) | stable |
Note: The stability of bucket sort depends on the stability of the sort used inside each bucket, so it is uncertain. In radix sort, k is the radix of the key, d is the key length (number of digits), and n is the number of keys.
Written in fond memory of that long-ago algorithms course.
To be continued…
Thanks to visualgo.net for the animation support. Special thanks to "Sean who is not a lamb" for the explanations in "Sorting algorithms of the JS family", published on Jianshu.
If you have any questions or good ideas, please feel free to leave a comment below.
Louis
Link to this article: louiszhai.github.io/2016/12/23/…
References
- Sorting algorithms of the JS family – Jianshu
- Plain-language classic algorithms series, part 3: a Shell sort implementation – MoreWindows Blog – CSDN.NET
- Algorithms and data structures (13): bubble sort, insertion sort, Shell sort, selection sort (Swift 3.0) – cnblogs