The complete code for this series of articles on data structures and algorithms has been uploaded to GitHub. If you want the complete code, you can get it there directly; if you find it useful, please give it a Star.
- Github.com/Lpyexplore/…
In previous articles, I covered all the data structures a front-end developer needs to know, and we encapsulated each of them. Now we move on to sorting algorithms. A sorting algorithm, as the name implies, takes a pile of disorganized data and arranges it in order according to some rule.
Sorting algorithms fall roughly into two categories, as shown in the following figure.
This article first introduces the ideas behind three simple sorts and their code implementations.
- Public account: Front Impression
- Book giveaways are held from time to time, so remember to follow ~
- [Interview questions], [essential front-end e-books], [complete data structure and algorithm code], [front-end technology discussion group]
One, Big O notation
Big O notation is a rough way of expressing an algorithm's time complexity, where the time complexity is the number of basic operations the code needs to perform while the algorithm runs.
Let's use a very simple problem to understand big O notation.
Q: Suppose there are five people A, B, C, D and E, whose heights are 165, 178, 150, 180 and 200. Find the tallest person and record the number of comparisons.
A: First compare A with B: B is taller, so B is the tallest so far. Compare B with C: C is shorter, so B stays. Compare B with D: D is taller. Finally compare D with E: E is taller than D. So E is the tallest of the five, after a total of four comparisons.
Using the same method with n people, we need at most n - 1 comparisons, so the time complexity of this algorithm is O(n).
Why O(n)? Because big O notation is really a rough, asymptotic way of counting, we follow these rules:
- Keep only the highest-order term of the run count
- Replace all additive constants with the constant 1
- Replace the coefficient of the highest-order term with 1
So for n - 1 comparisons, we drop the additive constant and keep only the highest-order term n, whose coefficient is already 1, which gives O(n) in big O notation.
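To make this concrete, here is a minimal sketch of the height example above (the findTallest name is just for illustration), counting the comparisons as it finds the tallest person:

function findTallest(heights) {
  // Treat the first person as the tallest so far
  let tallest = heights[0]
  let comparisons = 0
  // Compare the current tallest with each remaining person: n - 1 comparisons in total
  for (let i = 1; i < heights.length; i++) {
    comparisons++
    if (heights[i] > tallest) {
      tallest = heights[i]
    }
  }
  console.log('comparisons:', comparisons)
  return tallest
}

findTallest([165, 178, 150, 180, 200]) // 200, after 4 comparisons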
Other common big O notations include the following:
Notation | Name
---|---
O(1) | constant
O(log(n)) | logarithmic
O(n) | linear
O(nlog(n)) | linearithmic
O(n²) | quadratic
O(2ⁿ) | exponential
For each of the sorting algorithms below, we will briefly work out its time complexity and express it in big O notation.
Two, bubble sort
Bubble sort is one of the simplest and most brute-force sorting algorithms. It sorts exactly as its name suggests: the data bubbles up, one element at a time.
Let's take a look at the implementation process (the sort here goes from smallest to largest) with a GIF:
The idea is to start from the far left and compare each pair of adjacent elements; if the number on the left is larger than the number on the right, swap them. After all adjacent pairs have been compared, the largest number sits at the far right.
Then start again from the far left, comparing adjacent elements and swapping where necessary. Unlike the first pass, the rightmost number no longer needs to be compared because it is already the largest, so after the second pass the second number from the right is the second largest.
And so on, until the data is sorted from smallest to largest.
Now let's see how to encapsulate the bubble sort function:
function bubbleSort(arr) {
  // Encapsulates an exchange function for later invocation
  function exchange(v1, v2) {
    let temp = arr[v1]
    arr[v1] = arr[v2]
    arr[v2] = temp
  }
  // Get the length of the array passed in
  let length = arr.length
  // 1. Set the range of each pass; the range shrinks by 1 after every pass
  for (let i = length - 1; i >= 0; i--) {
    // 2. Start with the leftmost number and compare adjacent elements
    for (let j = 0; j < i; j++) {
      // 3. If the number on the left is larger than the number on the right, swap the two elements
      if (arr[j] > arr[j + 1]) {
        exchange(j, j + 1)
      }
    }
  }
  // Returns the sorted array
  return arr
}
Let’s briefly test whether this method is correct
let arr = [45, 66, 1, 19, 34, 80, 2]
console.log(bubbleSort(arr));
// Print the result: [1, 2, 19, 34, 45, 66, 80]
Next, we’ll discuss how bubble sort comparisons and swaps can be expressed in big O notation.
Let's say there are four numbers in an array. On the first pass we compare three times and find the maximum; the second pass only involves the remaining three numbers, so it compares twice and finds the second largest value; the third pass just compares the remaining two numbers once, and the array is sorted. If this isn't clear, check the GIF above.
The total number of comparisons is 3 + 2 + 1 = 6. Extending to the general case, an array with n elements requires (n-1) + (n-2) + ... + 2 + 1 = n*(n-1)/2 comparisons in total. By the rules of big O notation, the highest-order term is n²/2; dropping its coefficient leaves n², so the number of comparisons in bubble sort is O(n²).
Now let's see how the number of swaps in bubble sort is represented in big O notation. We can't know the exact number of swaps in advance, so assume that on average a swap happens once every two comparisons; the total number of swaps is then n*(n-1)/4, and by the rules of big O notation the number of swaps is also O(n²).
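As a rough sanity check, here is a sketch (the bubbleSortWithCounts name and counters are only illustrative) that instruments the bubble sort above with comparison and swap counters:

function bubbleSortWithCounts(arr) {
  let comparisons = 0
  let swaps = 0
  for (let i = arr.length - 1; i >= 0; i--) {
    for (let j = 0; j < i; j++) {
      comparisons++
      if (arr[j] > arr[j + 1]) {
        swaps++
        let temp = arr[j]
        arr[j] = arr[j + 1]
        arr[j + 1] = temp
      }
    }
  }
  return { comparisons, swaps }
}

// For 7 elements the comparisons are always 6 + 5 + ... + 1 = 21 = 7 * 6 / 2,
// while the swap count depends on how disordered the input is
console.log(bubbleSortWithCounts([45, 66, 1, 19, 34, 80, 2]))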
Conclusion:
- Comparison times of bubble sort: O(n²)
- Number of swaps for bubble sort: O(n²)
Three, selection sort
Selection sort is very similar to bubble sort. The difference is that on each pass, selection sort compares the elements but only records the index of the maximum (or minimum) value in a variable, and only after all the comparisons does it swap the element at that index into place. Simply put, selection sort swaps once per pass, whereas bubble sort may swap many times per pass, so selection sort is generally a little more efficient than bubble sort.
Similarly, let's take a look at a GIF showing selection sort (sorting from smallest to largest):
Now let's look at the code encapsulation for selection sort:
function selectionSort(arr) {
  // Encapsulates a function that swaps elements for later calls
  function exchange(v1, v2) {
    let temp = arr[v1]
    arr[v1] = arr[v2]
    arr[v2] = temp
  }
  // Get the length of the array passed in
  let length = arr.length
  // 1. Set the traversal range
  for (let i = 0; i < length - 1; i++) {
    // 2. Treat the start index of this pass as the index of the minimum so far
    let min = i
    // 3. Iterate over the elements from index min to the end of the array
    for (let j = min; j < length; j++) {
      // 3.1 Compare each element with arr[min] and record the index of the smaller one
      if (arr[min] > arr[j]) {
        min = j
      }
    }
    // 4. Swap the minimum element at index min with the element where this pass started
    exchange(min, i)
  }
  // Returns the sorted array
  return arr
}
Let’s test that out
let arr = [45, 66, 1, 19, 34, 80, 2]
console.log(selectionSort(arr));
// Print the result: [1, 2, 19, 34, 45, 66, 80]
Now that we know the difference between selection sort and bubble sort, it should be clear that selection sort makes the same number of comparisons as bubble sort, so the number of comparisons in selection sort is O(n²) in big O notation.
Each time selection sort passes over the array, it swaps data only once, so the number of swaps is O(n) in big O notation.
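Here is a minimal sketch (selectionSortWithCounts is only an illustrative name) that counts the swaps made by the selection sort above; no matter how disordered the input is, the swap count never exceeds n - 1:

function selectionSortWithCounts(arr) {
  let swaps = 0
  for (let i = 0; i < arr.length - 1; i++) {
    // Find the index of the smallest remaining element
    let min = i
    for (let j = min + 1; j < arr.length; j++) {
      if (arr[min] > arr[j]) {
        min = j
      }
    }
    // At most one swap per pass, so at most n - 1 swaps in total
    let temp = arr[min]
    arr[min] = arr[i]
    arr[i] = temp
    swaps++
  }
  return swaps
}

console.log(selectionSortWithCounts([45, 66, 1, 19, 34, 80, 2])) // 6, i.e. n - 1 for n = 7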
Conclusion:
- Comparison times of selection sort: O(n²)
- Number of swaps for selection sort: O(n)
Four, Insertion sort
Insertion sort is a sorting algorithm that compares a chosen element with the elements of an ordered region and inserts it into its proper position.
Let’s just do a quick example. Let’s say I have an unordered array like this
First, we treat the element at index 0 as a region, and that region is considered ordered, because a single element is ordered no matter what.
We then take the first element to the right of the ordered region, the element 67 at index 1, and store it in the variable temp. Then, starting from the rightmost element of the ordered region, each element is compared with the 67 in temp. If an element is greater than 67, it is moved one position to the right; once we meet an element less than 67 we can stop, because the region is ordered.
GIF of the first pass:
The ordered region is now indexes 0 to 1, so we take the element at index 2 and compare it with the elements in the ordered region.
GIF of the second pass:
The ordered region is now indexes 0 to 2, so we take the element at index 3 and compare it with the elements in the ordered region.
GIF of the third pass:
The ordered region is now indexes 0 to 3, so we take the element at index 4 and compare it with the elements in the ordered region.
GIF of the fourth pass:
Now the entire array is an ordered region, and that completes the insertion sort.
Next, we’ll wrap an insertion sort function
function insertionSort(arr) {
  // Get the length of the array passed in
  let length = arr.length
  // 1. Iterate through the array starting with the element at index 1
  for (let i = 1; i < length; i++) {
    // 2. Take the first element to the right of the ordered region
    let temp = arr[i]
    let j = i
    // 3. Compare elements in the ordered region with temp from right to left,
    //    shifting each larger element one position to the right
    while (j > 0 && arr[j - 1] > temp) {
      arr[j] = arr[j - 1]
      j--
    }
    // 4. Insert temp into the appropriate position
    arr[j] = temp
  }
  // Returns the sorted array
  return arr
}
Let’s test this function to see if it works
let arr = [45, 66, 1, 19, 34, 80, 2]
console.log(insertionSort(arr));
// Print the result: [1, 2, 19, 34, 45, 66, 80]
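If you want to watch the ordered region grow, here is a sketch (insertionSortTrace is just an illustrative name) that logs the array after each pass of the same algorithm:

function insertionSortTrace(arr) {
  for (let i = 1; i < arr.length; i++) {
    let temp = arr[i]
    let j = i
    // Shift the larger elements of the ordered region one position to the right
    while (j > 0 && arr[j - 1] > temp) {
      arr[j] = arr[j - 1]
      j--
    }
    arr[j] = temp
    // The ordered region now covers indexes 0 to i
    console.log('after pass ' + i + ':', arr)
  }
  return arr
}

insertionSortTrace([45, 66, 1, 19, 34, 80, 2])
// after pass 1: [ 45, 66, 1, 19, 34, 80, 2 ]
// after pass 2: [ 1, 45, 66, 19, 34, 80, 2 ]
// ...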
The number of passes, comparisons, and moves in insertion sort depends on the input, so let's assume the worst case.
- First pass: 1 comparison, 1 element move
- Second pass: 2 comparisons, 2 element moves
- ...
- (n-1)th pass: n-1 comparisons, n-1 element moves
So in the worst case the number of comparisons for insertion sort is 1 + 2 + ... + (n-1) = (n² - n)/2, and the number of element moves is the same. Taking the average case, roughly half of that, gives (n² - n)/4, which is O(n²) in big O notation.
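As a quick check of the worst case, here is a sketch (insertionSortComparisons is just an illustrative name) that counts the comparisons made on a reverse-sorted array:

function insertionSortComparisons(arr) {
  let comparisons = 0
  for (let i = 1; i < arr.length; i++) {
    let temp = arr[i]
    let j = i
    while (j > 0) {
      comparisons++
      if (arr[j - 1] > temp) {
        arr[j] = arr[j - 1]
        j--
      } else {
        break
      }
    }
    arr[j] = temp
  }
  return comparisons
}

// A reverse-sorted array of 7 elements is the worst case:
// 1 + 2 + ... + 6 = 21 = (7² - 7) / 2 comparisons
console.log(insertionSortComparisons([7, 6, 5, 4, 3, 2, 1])) // 21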
Conclusion:
- Insert sort comparison times: O(n²)
- Number of moves of elements in insert sort: O(n²)
Five, Concluding remarks
The next article will cover three advanced sorting algorithms: Shell sort, merge sort, and quicksort.
You can also follow my WeChat official account: Front Impression. Once this column is finished, I will post notes on every data structure and algorithm there, and you can go get them.
Or you can get the complete code from my GitHub; if you do, please click Star.
- Github.com/Lpyexplore/…
You’re not gonna give me a “like” when you see this?