🌟🌟🌟🌟🌟
Taste: Sauteed beef
Cooking time: 10min
GitHub: github.com/Geekhyt. Welcome to the canteen; if you think the food is delicious, a Star is a great encouragement to the canteen owner.
Sorting algorithms are a high-frequency interview topic, and we need to master them. This article compiles the most classic and commonly used sorting algorithms, paired with GIFs and videos to help you understand them more easily.
First of all, sorting algorithms can be divided into the following two categories according to their characteristics:
- Comparison sorts
- Non-comparison sorts
As the name implies, comparison sorts order elements by comparing them, while non-comparison sorts do not compare elements at all.
Comparison sorts cannot break the O(nlogn) time complexity bound, so they are also called nonlinear sorts.
Non-comparison sorts can break the O(nlogn) bound and run in linear time, so they are also called linear sorts.
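For a feel of what a linear, non-comparison sort looks like, here is a minimal sketch of counting sort (covered in the follow-up links at the end of this article), assuming the input is an array of small non-negative integers:

const countingSort = function(arr) {
  if (arr.length < 2) return arr
  const max = Math.max(...arr)
  const counts = new Array(max + 1).fill(0)
  for (const num of arr) counts[num]++ // Tally each value, no element comparisons
  const result = []
  for (let num = 0; num <= max; num++) {
    for (let k = 0; k < counts[num]; k++) result.push(num)
  }
  return result
}
// countingSort([3, 1, 4, 1, 5]) => [1, 1, 3, 4, 5], roughly O(n + max) time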
If you are not familiar with time complexity, you can read my column on JavaScript algorithm time and space complexity analysis.
01 Bubble Sort
Bubble Sort visual video 👈
Bubble sort is simple and brute force. A one-sentence explanation:
In each bubbling pass, adjacent elements are compared to check whether they satisfy the required order; if not, they are swapped. Repeat until no more swaps are needed, at which point the array is sorted.
const bubbleSort = function(arr) {
  const len = arr.length
  if (len < 2) return arr
  for (let i = 0; i < len; i++) {
    for (let j = 0; j < len - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        const temp = arr[j]
        arr[j] = arr[j + 1]
        arr[j + 1] = temp
      }
    }
  }
  return arr
}
- Time complexity: O(n^2)
- Space complexity: O(1)
- stable
Note: By stable, I mean bubble sort is a stable sort algorithm.
What is a stable sorting algorithm?
Stability of sorting algorithm
It is not enough to judge the merits of sorting algorithms only by execution efficiency and memory consumption. For sorting algorithms, there is another important metric, stability.
This means that if there are elements with equal values in the sequence to be sorted, their relative order remains unchanged after sorting.
An example 🌰:
Let’s say we have a set of data: 1, 9, 2, 5, 8, 9. Sorted by size, it becomes 1, 2, 5, 8, 9, 9.
There are two nines in this set of data, and after sorting by some sort algorithm, if the order of the two nines does not change, we call that a stable sorting algorithm. Otherwise, it is an unstable sorting algorithm.
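To see stability in code, here is a small illustrative example (the student data below is made up): since ES2019 the built-in Array.prototype.sort is required to be stable, so the two records with the same score keep their original relative order.

const students = [
  { name: 'Li', score: 9 },
  { name: 'Wang', score: 2 },
  { name: 'Zhao', score: 9 },
  { name: 'Qian', score: 5 }
]
students.sort((a, b) => a.score - b.score)
// => Wang (2), Qian (5), Li (9), Zhao (9): Li still comes before Zhao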
Bubble sort optimization
The code above can be optimized: when a bubbling pass performs no swaps, the array is already completely ordered and no further passes need to be performed.
const bubbleSort = function(arr) {
  const len = arr.length
  let flag = false
  if (len < 2) return arr
  for (let i = 0; i < len; i++) {
    flag = false // Flag to exit the bubble loop early
    for (let j = 0; j < len - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        const temp = arr[j]
        arr[j] = arr[j + 1]
        arr[j + 1] = temp
        flag = true // Indicates data was exchanged
      }
    }
    if (!flag) break // No data was exchanged in this pass, exit early
  }
  return arr
}
02 Insertion Sort
Insertion sort, as the name implies, takes each unsorted element, scans the sorted sequence from back to front to find the right position, and inserts it there, keeping the sorted sequence in order.
Start with i = 1, take the current element (curr), and compare it with the elements before it.
If a preceding element is greater than the current element, shift it back one position, and continue until the correct position is found. Repeat until the unsorted sequence is empty and the sort is complete.
const insertSort = function(arr) {
  const len = arr.length
  let curr, prev
  for (let i = 1; i < len; i++) {
    curr = arr[i]
    prev = i - 1
    while (prev >= 0 && arr[prev] > curr) {
      arr[prev + 1] = arr[prev]
      prev--
    }
    arr[prev + 1] = curr
  }
  return arr
}
- Time complexity: O(n^2)
- Space complexity: O(1)
- stable
03 Selection Sort
Selection sort visual video 👈
Selection sort is similar to insertion sort in that it also maintains a sorted and an unsorted sequence.
But selection sort first puts the smallest element at the beginning of the array, then repeatedly finds the smallest element in the remaining unsorted sequence and places it at the end of the sorted sequence, and so on until the sorting is done.
const selectSort = function(arr) {
  const len = arr.length
  let temp, minIndex
  for (let i = 0; i < len - 1; i++) {
    minIndex = i
    for (let j = i + 1; j < len; j++) {
      if (arr[j] <= arr[minIndex]) {
        minIndex = j
      }
    }
    temp = arr[i]
    arr[i] = arr[minIndex]
    arr[minIndex] = temp
  }
  return arr
}
- Time complexity: O(n^2)
- Space complexity: O(1)
- unstable
04 Merge Sort
Merge sort is a typical application of the divide-and-conquer method. The divide-and-conquer idea is largely implemented with recursion, so merge sort is also well suited to recursion.
The process is bottom-up: solve the subproblems first, then merge them.
If you feel you don’t know recursion very well, take a look at my column "Do you really know recursion?".
Divide and conquer, as the name suggests, generally involves the following three processes:
- Decompose: break the original problem down into a series of subproblems.
- Solve: recursively solve each subproblem; if a subproblem is small enough, solve it directly.
- Merge: combine the results of the subproblems into the solution of the original problem.
Merge sort splits the array to be sorted into smaller subarrays, then merges the sorted subarrays back together so that the entire array becomes ordered.
const mergeSort = function(arr) {
  const merge = (left, right) => {
    const result = []
    let i = 0, j = 0
    while (i < left.length && j < right.length) {
      // <= keeps equal elements from the left half first, so the sort stays stable
      if (left[i] <= right[j]) {
        result.push(left[i++])
      } else {
        result.push(right[j++])
      }
    }
    while (i < left.length) {
      result.push(left[i++])
    }
    while (j < right.length) {
      result.push(right[j++])
    }
    return result
  }
  const sort = (arr) => {
    if (arr.length < 2) { return arr }
    const mid = Math.floor(arr.length / 2)
    const left = arr.slice(0, mid)
    const right = arr.slice(mid, arr.length)
    return merge(sort(left), sort(right))
  }
  return sort(arr)
}
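A quick usage note: this implementation builds and returns a new sorted array instead of sorting in place (the values below are just for illustration).

const nums = [8, 4, 5, 7, 1, 3, 6, 2]
const sorted = mergeSort(nums)
// sorted => [1, 2, 3, 4, 5, 6, 7, 8]; nums itself is not modified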
- Time complexity: O(nlogn)
- Space complexity: O(n)
- stable
05 Quick Sort
Quicksort visual video 👈
Quicksort is also a divide-and-conquer application, where the process is top-down, partitioning first, and then dealing with subproblems.
Quicksort partitions the elements to be sorted into two parts in a single pass over the array, such that the elements in one part are all smaller than the elements in the other part, and then sorts each part in the same way until the whole array is sorted.
To do this, pick an element from the array as the pivot, then reorder the array so that elements smaller than the pivot are placed in front of it and elements larger than the pivot are placed behind it.
The quick function is then called recursively on the subarray of elements less than or equal to the pivot (left) and the subarray of elements greater than the pivot (right), until the sorting is complete.
const quickSort = function(arr) {
  const quick = function(arr) {
    if (arr.length <= 1) return arr
    const index = arr.length >> 1
    const pivot = arr.splice(index, 1)[0] // Take the pivot out of the array
    const left = []
    const right = []
    for (let i = 0; i < arr.length; i++) {
      if (arr[i] > pivot) {
        right.push(arr[i])
      } else {
        left.push(arr[i])
      }
    }
    return quick(left).concat([pivot], quick(right))
  }
  const result = quick(arr)
  return result
}
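One caveat worth knowing: because splice is used to take out the pivot, this implementation modifies the input array as a side effect, so use the returned array (the values below are just for illustration).

const nums = [3, 7, 1, 5, 2]
const sorted = quickSort(nums)
// sorted => [1, 2, 3, 5, 7]; nums is left as [3, 7, 5, 2] because the first pivot was spliced out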
- Time complexity: O(nlogn)
- Space complexity: O(nlogn)
- unstable
06 Heap Sort
Heapsort is a little bit more complicated than the other sorts, but that’s okay. Let’s look at some of the things that we need to know about heapsort.
Heap sort, as its name implies, uses the heap data structure to sort. A heap is a special kind of tree that satisfies two conditions:
- The heap is a complete binary tree
- The value of each node in the heap must be greater than or equal to (or less than or equal to) the value of every node in its subtree
A heap in which each node's value is greater than or equal to the values in its subtree is called a big top heap (max-heap); one in which each node's value is less than or equal to the values in its subtree is called a small top heap (min-heap).
That is, in a big top heap the root node is the largest element in the heap, and in a small top heap the root node is the smallest element.
If you’re not familiar with the tree data structure, you can take a look at my tree column.
If the heap is stored in an array, then for a node with index i (where i starts at 1), its parent is A[i / 2] (rounded down), its left child is A[2i], and its right child is A[2i + 1].
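In code, that index arithmetic (with i starting at 1) looks like this, purely as an illustration:

const parent = (i) => Math.floor(i / 2)
const leftChild = (i) => 2 * i
const rightChild = (i) => 2 * i + 1
// For i = 3: parent(3) === 1, leftChild(3) === 6, rightChild(3) === 7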
Heap sort consists of two processes, build heap and sort.
Start by building a big top heap, which puts the maximum value at the root node (i = 1).
Each time, swap the root node of the big top heap with the last node of the valid sequence. The maximum value is now at the end of the valid sequence, and the valid sequence length shrinks by 1. The remaining valid heap still keeps the complete binary tree structure, so heapify it into a new big top heap.
Repeat this until the valid sequence is exhausted and the sorting is complete.
// Note: this implementation stores the heap from index 1, so arr[0] is a placeholder that is never sorted
const heapSort = function(arr) {
  buildHeap(arr, arr.length - 1)
  let heapSize = arr.length - 1 // Initialize the valid sequence length of the heap
  for (let i = arr.length - 1; i > 1; i--) {
    swap(arr, 1, i) // Swap the heap-top element with the last valid element
    heapSize-- // The valid sequence length is reduced by 1
    heapify(arr, heapSize, 1) // Heapify the valid sequence
  }
  return arr
}
// Build the big top heap
const buildHeap = function(items, heapSize) {
  // Heapify from back to front, starting not from the last element but from the last non-leaf node,
  // because leaf nodes have no children and do not need top-down heapifying.
  // The parent of the last node is at n/2, so heapifying starts at position n/2
  for (let i = Math.floor(heapSize / 2); i >= 1; i--) {
    heapify(items, heapSize, i)
  }
}
// Heapify (top-down)
const heapify = function(arr, heapSize, i) {
  while (true) {
    let maxIndex = i
    if (2 * i <= heapSize && arr[i] < arr[i * 2]) {
      maxIndex = i * 2
    }
    if (2 * i + 1 <= heapSize && arr[maxIndex] < arr[i * 2 + 1]) {
      maxIndex = i * 2 + 1
    }
    if (maxIndex === i) break
    swap(arr, i, maxIndex)
    i = maxIndex
  }
}
// Swap utility function
const swap = function(arr, i, j) {
  let temp = arr[i]
  arr[i] = arr[j]
  arr[j] = temp
}
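Since the index math starts at 1, this implementation only sorts the elements from index 1 onward; for example (values for illustration only):

const nums = [null, 4, 6, 8, 5, 9, 1] // Index 0 is a placeholder and is never touched
heapSort(nums)
// nums => [null, 1, 4, 5, 6, 8, 9]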
- Time complexity: O(nlogn)
- Space complexity: O(1)
- unstable
To make it easier for you to understand and remember, I have compiled the complexity and stability of these six sorting algorithms into the following table:
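| Sorting algorithm | Time complexity | Space complexity | Stability |
| --- | --- | --- | --- |
| Bubble sort | O(n^2) | O(1) | Stable |
| Insertion sort | O(n^2) | O(1) | Stable |
| Selection sort | O(n^2) | O(1) | Unstable |
| Merge sort | O(nlogn) | O(n) | Stable |
| Quick sort | O(nlogn) | O(nlogn) | Unstable |
| Heap sort | O(nlogn) | O(1) | Unstable |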
This article has covered 6 of the ten classic sorting algorithms. These 6 are the ones most commonly encountered in development, and we must be familiar with them.
The rest, Shell sort, counting sort, bucket sort, and radix sort, can be learned through the links below if you are interested.
Standing on the shoulders of giants
- Ten classic sorting algorithms
- Front-end advanced algorithm 9: after reading this article, no longer afraid of heap sort, Top K, median question interview
- The core Principles of JavaScript
- The Beauty of Data Structures and Algorithms
2021 team problem-grinding plan
- The front-end canteen's LeetCode solutions repository
At the beginning of the year, I set a goal that the repository above would cover 100 high-frequency front-end interview questions in 2021, and progress is already one-third complete.
If you are planning to grind LeetCode, or are already doing so, join the front-end canteen and have a good time.
❤️ Love triple punch
1. If you think the food and drinks in the canteen suit you, please give me a thumbs-up. Your thumbs-up is my biggest motivation.
2. Follow the official account Front-End Canteen and never miss a meal!
3. Like, comment, share === more updates!