Preface

This article summarizes several commonly used sorting algorithms and analyzes each of them in detail. Taking a concrete problem as the driver, it examines their respective application scenarios and lays a foundation for practicing sorting problems, so that you can quickly pick the appropriate sorting method based on its characteristics and the scenario of the problem. Once again: the merge sort and quicksort at the end are the most important parts.

1. Start with a sorting problem

Description: Given an integer array nums, please sort the array in ascending order.

Example 1: input: nums = [5,2,3,1], output: [1,2,3,5]. Example 2: input: nums = [5,1,1,2,0,0], output: [0,0,1,1,2,5].

1 <= nums.length <= 50000, -50000 <= nums[i] <= 50000

Chapter 3 will solve this problem with each of the sorting methods and compare them.

2. A brief introduction to the analysis metrics

A. Time complexity analysis

The time complexity of an algorithm reflects how long a program takes to run from start to finish. It is measured by the number of times (the frequency with which) the basic operation is repeated. Simply put, code with no loop is O(1), also called constant order. If there is a single loop (whose iteration count depends on the input), the number of basic operations grows linearly with the problem size n, written O(n), also called linear order. Common time complexities are listed below. O(1): constant time complexity

// Example
let i = 1;
let j = 2;
let sum = i + j; // Even with three lines of code, this is still O(1).

O(log n): logarithmic time complexity; O(n log n): linear-logarithmic time complexity

i = 1;
while (i <= n) {
    i = i * 2; // If a^x = N (a > 0 and a ≠ 1), then x is called the logarithm of N to base a, written x = log_a N
    // Whatever the base is, the time complexity is O(log N).
}

If a piece of code takes O(log n) and we run it n times, the total time is O(n log n). O(n log n) is also a very common time complexity; for example, merge sort and quicksort both run in O(n log n).

O(n): linear time complexity; O(n^2): quadratic time complexity; O(n^3): cubic time complexity; O(n^k): k-th order time complexity. Simply put, a single loop is O(n), a double (nested) loop is O(n^2), and so on.
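As a minimal sketch of this rule of thumb (the function names below are just illustrative):

// A single loop runs n times, so it is O(n)
function sumAll(nums) {
    let sum = 0;
    for (let i = 0; i < nums.length; i++) {
        sum += nums[i];
    }
    return sum;
}
// A nested double loop runs the inner body n times for each of the n outer iterations, so it is O(n^2)
function countPairs(nums) {
    let count = 0;
    for (let i = 0; i < nums.length; i++) {
        for (let j = 0; j < nums.length; j++) {
            count++;
        }
    }
    return count;
}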

O(2^n): exponential time complexity; O(n!): factorial time complexity. You don't need to go much deeper than this; the complexities we encounter most often are O(1), O(n), and O(n^2).

B. Space complexity analysis

The space complexity of a program refers to the amount of memory required to run it. With it, you can estimate in advance how much memory a program will need.

let arr = new Array(n); // The space complexity is O(n)

Besides the space for the program's instructions, constants, variables, and input data, a program also needs some working storage to manipulate the data and some auxiliary space to hold the information required by the actual computation.
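A minimal sketch of the difference between O(1) and O(n) auxiliary space (the function names are just illustrative):

// Reverses the array in place: only a few temporary variables are used, so the auxiliary space is O(1)
function reverseInPlace(nums) {
    let l = 0;
    let r = nums.length - 1;
    while (l < r) {
        [nums[l], nums[r]] = [nums[r], nums[l]];
        l++;
        r--;
    }
    return nums;
}
// Builds a reversed copy: an extra array of length n is allocated, so the auxiliary space is O(n)
function reverseCopy(nums) {
    const result = [];
    for (let i = nums.length - 1; i >= 0; i--) {
        result.push(nums[i]);
    }
    return result;
}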

C. Stability analysis

Suppose the sequence to be sorted contains multiple records with equal keys. If the relative order of these records is unchanged after sorting, that is, if r[i] = r[j] and r[i] comes before r[j] in the original sequence, and r[i] still comes before r[j] in the sorted sequence, the sorting algorithm is said to be stable; otherwise it is unstable. A simple example: in bubble sort, the strict condition nums[j] > nums[j+1] never swaps equal elements, so the sort stays stable; using >= would make it unstable.

In general we sort by a single numeric attribute, so whether two equal values keep their original order seems meaningless and does not noticeably affect the time complexity. So why care? (Soul-searching question.) The main significance of stability is this: suppose a class of students has already been sorted by student number, and you are now asked to sort them by age, with students of the same age kept in student-number order. If the age sort you choose is not stable, students of the same age end up out of order after sorting, and you have to sort each group of same-age students by student number again. If it is a stable sorting algorithm, sorting once by age is enough. In other words, sort by one key first and then by another, and the result of the first sort can be reused by the second. When the data to be sorted is a complex object with multiple numeric attributes and its original order is meaningful, a second sort must preserve the result of the first, and that is when a stable algorithm is needed.
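A minimal sketch of this idea (the student data below is made up; Array.prototype.sort is required to be stable in modern engines, so it plays the role of the stable sort here):

// Students already sorted by student number (id)
const students = [
    { id: 1, age: 18 },
    { id: 2, age: 17 },
    { id: 3, age: 18 },
    { id: 4, age: 17 },
];
// A stable sort by age keeps students of the same age in student-number order
students.sort((a, b) => a.age - b.age);
// Result: id 2, id 4 (age 17), then id 1, id 3 (age 18) -- no second sort by id is needed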

Scenario: sorting goods on Taobao by sales, price, and other criteria involves a huge amount of data on the server, so an unstable sorting algorithm gives poor results here; algorithms such as heap sort and Shell sort can also degrade badly in the worst case, seriously hurting server performance and the user experience.

3. Solving the problem with each sorting method

A. Insertion sort

The code:

// Insertion sort, double loop, O(n^2)
var sortArray = function(nums) {
    for (let i = 1; i < nums.length; i++) { // Starting from the second card, compare toward the left
        let target = i;
        for (let j = i - 1; j >= 0; j--) { // j walks over the sorted left part of the current element
            if (nums[target] < nums[j]) { // The current element still needs to move forward
                [nums[target], nums[j]] = [nums[j], nums[target]];
                // Destructuring assignment swaps quickly
                target = j;
            } else {
                // Since the left part is already sorted, if the element to the left is not larger than the current one,
                // everything further left is not larger either; the current element has found its position,
                // so we can stop and move on to inserting the next element
                break;
            }
        }
    }
    return nums;
};

Sorting process: insertion sort is like playing poker. Spread out a hand of cards and sort it: starting from the second card, compare it step by step from right to left until you find the position that satisfies the ordering, then insert it there. Insertion sort works the same way: as in the code above, each value is compared with the elements to its left, swapping whenever it is smaller, and keeps moving left. Since everything to the left of the current element is already sorted, as soon as the element to the left is no larger than the current one, we can stop comparing and perform the insertion for the next value. In short: by repeatedly placing the current element in its correct position within the sorted prefix, the sorted range keeps growing until the whole array is sorted.

Leetcode submission results:



Sorting analysis: Time complexity: O(n^2) Space complexity: O(1) Stability: stable

B. Bubble sort

The code:

var sortArray = function(nums) {
    let len = nums.length;
    // Cache the array length to avoid recomputing it on every iteration
    for (let i = 0; i < len; i++) { // Bubble len times, each pass bubbling from the first element toward the right
        for (let j = 0; j < len - 1; j++) {
            if (nums[j] > nums[j + 1]) {
                [nums[j], nums[j + 1]] = [nums[j + 1], nums[j]];
            }
        }
    }
    return nums;
};

Sorting process: the bubble algorithm is very easy to understand, and so is the code. For an array with n elements we do n bubbling passes; each pass starts from the left and keeps comparing toward the right. If an element is larger than the one to its right, they swap positions, which is like the bubble rising one step. If the element on the right is larger, it takes over the baton and keeps bubbling to the right, until the right end is reached.

Leetcode Submission results analysis:

Think about it: if an array of length 5 is already sorted after three bubbling passes, and we still bubble two more times, isn't that wasted work? It's redundant. Let's optimize.

// Optimization: stop bubbling once the array is already sorted
var sortArray = function(nums) {
    let len = nums.length;
    // Cache the array length to avoid recomputing it on every iteration
    for (let i = 0; i < len; i++) {
        let complete = true;
        // Optimization: records whether any swap happened in this pass, i.e. whether the array is already sorted
        // Bubble len times, each pass bubbling from the first element toward the right
        for (let j = 0; j < len - 1; j++) {
            if (nums[j] > nums[j + 1]) {
                [nums[j], nums[j + 1]] = [nums[j + 1], nums[j]];
                complete = false;
            }
        }
        // If no swap happened, the array is sorted early; exit the loop
        if (complete) {
            break;
        }
    }
    return nums;
};

Think again: after the first pass the largest element is at the far right; after the second pass the second largest sits in the second position from the right; the last i elements are already in order, so do the later passes need to touch them? No. Let's optimize again!

// Optimization: stop bubbling once the array is sorted, and skip the last i elements that are already in place
var sortArray = function(nums) {
    let len = nums.length;
    // Cache the array length to avoid recomputing it on every iteration
    for (let i = 0; i < len; i++) {
        let complete = true;
        // Optimization: records whether any swap happened in this pass, i.e. whether the array is already sorted
        // Bubble len times, each pass bubbling from the first element toward the right
        for (let j = 0; j < len - 1 - i; j++) { // The inner loop condition changes: the last i elements are ignored and no longer compared
            if (nums[j] > nums[j + 1]) {
                [nums[j], nums[j + 1]] = [nums[j + 1], nums[j]];
                complete = false;
            }
        }
        // If no swap happened, the array is sorted early; exit the loop
        if (complete) {
            break;
        }
    }
    return nums;
};

After two optimizations, you can find that it takes much less time.



Sorting analysis:

Time complexity: O(n^2)

Space complexity: O(1)

Stability: stable

C. Selection sort

The code:

// Selection sort
var sortArray = function(nums) {
    let len = nums.length;
    for (let i = 0; i < len - 1; i++) {
        // The outer loop shrinks the range from [0, len) to [1, len) to ... to [len - 1, len)
        // Each time, find the minimum of the current range and put it at the head of that range
        let minIndex = i;
        for (let j = i + 1; j < len; j++) { // Find the minimum value
            if (nums[minIndex] > nums[j]) {
                minIndex = j;
            }
        }
        [nums[i], nums[minIndex]] = [nums[minIndex], nums[i]];
    }
    return nums;
};

Sorting process: "selection" here means selecting the minimum value. Loop over the array; each time, find the minimum of the current range (the range shrinks from [0, array.length) to [1, array.length) to [2, array.length) ... down to [array.length - 1, array.length)) and put it at the head of that range; then shrink the unsorted range and repeat until the array is completely sorted.

Leetcode submission results:



Sorting analysis: Time complexity: O(n^2) Space complexity: O(1) Stability: unstable (for example, in the sequence 6, 8, 6, 2, 9, the first pass selects 2 and swaps it with the first 6, so the relative order of the two 6s in the original sequence is broken, which is why selection sort is unstable)

The three sorts above (A, B, C) are not very efficient, but don't be discouraged and think they are useless: they are the basic sorts and must be mastered. Next come merge sort and quicksort, which are a step up; both are applications of the idea of "divide and conquer".

The following two sorting algorithms are both applications of the idea of divide and conquer (divide and conquer means decomposing a big problem into several subproblems, solving the subproblems separately, and then combining the subproblem solutions into the solution of the big problem). Divide and conquer takes three steps, as shown in the sketch after this list:

  • Decompose the problem into subproblems
  • Solve each subproblem
  • Combine the solutions of the subproblems to get the solution of the larger problem
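As a minimal sketch of these three steps (just an illustration, separate from the sorting code below), here is a divide-and-conquer way to find the maximum of a non-empty array:

// Find the maximum of a non-empty array with divide and conquer
function maxOf(nums) {
    if (nums.length === 1) {
        return nums[0]; // A single element is its own maximum
    }
    const mid = Math.floor(nums.length / 2);
    const leftMax = maxOf(nums.slice(0, mid)); // 1. decompose, 2. solve the left half
    const rightMax = maxOf(nums.slice(mid));   // 1. decompose, 2. solve the right half
    return Math.max(leftMax, rightMax);        // 3. combine the two sub-solutions
}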

Now let's learn divide and conquer through merge sort and quicksort!

D. Merge sort

The code:

// Merge sort uses the recursive idea
var sortArray = function(nums) {
    const len = nums.length;
    if (len <= 1) {
        // No need to sort an empty array or a single value
        // This is also the base case of the recursion, i.e. the split has reached a single element
        return nums;
    }
    // Compute the split point
    const mid = Math.floor(len / 2);
    // Recursively split the left subarray and merge it into an ordered array
    const numsLeft = sortArray(nums.slice(0, mid));
    // Recursively split the right subarray and merge it into an ordered array
    const numsRight = sortArray(nums.slice(mid));
    // Merge the two ordered arrays
    nums = mergeArr(numsLeft, numsRight);
    return nums;
};
// Merge two ordered arrays
var mergeArr = function(arr1, arr2) {
    let i = 0;
    let j = 0;
    let arr = []; // Initialize an array to hold the values from the two ordered arrays
    const len1 = arr1.length;
    const len2 = arr2.length;
    while (i < len1 && j < len2) {
        // <= keeps the merge stable: when values are equal, the element from the left subarray goes first
        if (arr1[i] <= arr2[j]) {
            arr.push(arr1[i]);
            i++;
        } else {
            arr.push(arr2[j]);
            j++;
        }
    }
    // Handle the leftover elements:
    // the two arrays may have different lengths, so after one of them has been fully added to arr,
    // the longer one still has ordered elements left; concatenate the remainder onto the container
    if (i < len1) {
        return arr.concat(arr1.slice(i));
    } else {
        return arr.concat(arr2.slice(j));
    }
};

Sorting process: merge sort applies the idea of divide and conquer: split the big problem into small problems, solve the small problems, and then combine the solutions of the small problems to obtain the solution of the big problem. The sorting process:

  • Using recursion, split the array step by step: halve the array into two smaller arrays, and recurse until each small array has only one element.
  • An array with a single element is trivially ordered, so an algorithm that merges two ordered arrays can combine them into one ordered array; keep merging step by step until the result is the size of the original array, which is then fully sorted.

Leetcode submission results: this performs much better than the basic sorts. The drawback is a slightly higher space complexity, since multiple arrays have to be copied.



Sorting analysis: Time complexity: O(n log n) Space complexity: O(n) Stability: stable

A related exercise on the merge step: given two ordered integer arrays nums1 and nums2, merge nums2 into nums1 so that nums1 becomes one ordered array. The numbers of initialized elements in nums1 and nums2 are m and n respectively. You can assume that nums1 has a size of m + n, so it has enough space to hold the elements of nums2.

Use the double pointer method.

  • Because nums1's free space is at the back, it is better to process the data from back to front, filling values in toward the tail as we traverse, which avoids using extra space
  • Set pointers i and j to the last meaningful elements of nums1 and nums2 respectively and compare from the tail; also set a pointer k to the very end of nums1, and after each comparison write the larger value at k
  • When i < 0 the traversal of nums1 is finished; any data still remaining in nums2 is copied directly into the front of nums1, which gives the final result array

Here’s the code:

var merge = function(nums1, m, nums2, n) {
    // Initialize the two pointers i and j, and a pointer k to the tail of nums1
    let i = m - 1;
    let j = n - 1;
    let k = m + n - 1;
    // While neither array is exhausted, move the pointers in step
    while (i >= 0 && j >= 0) {
        // Compare the two arrays from back to front
        // and put the larger value at the end
        if (nums1[i] >= nums2[j]) {
            nums1[k] = nums1[i];
            i--;
            k--;
            // The i pointer marks where the next value from nums1 would go
            // The elements after it still hold their original values, but can safely be overwritten by values from nums2
        } else {
            nums1[k] = nums2[j];
            j--;
            k--;
        }
    }
    // nums1 and nums2 are not necessarily the same length,
    // so we also need to handle the case where the loop above exits early:
    // there may be nums2 elements left that have not been added yet
    while (j >= 0) {
        nums1[k] = nums2[j];
        j--;
        k--;
    }
};

E. Quicksort (Important)

Sorting process: quicksort shares the same basic idea as merge sort and still follows divide and conquer. The difference is that quicksort does not actually split the array and merge the pieces into a new array; it sorts directly inside the original array. It partitions the data into two independent parts, where every element of one part is smaller than every element of the other, then sorts each of the two parts the same way. The whole process is carried out recursively, so that the entire array becomes an ordered sequence. Steps:

  • Select a pivot element target (usually the first element; here we always use the left-most element as the pivot)
  • The goal is to move elements smaller than target to the left of the array and elements larger than target to the right (this step is done with two pointers, one starting at the far left and one at the far right)
  • Because the left pointer starts one position after the pivot target, remember the procedure: the left pointer moves right looking for something greater than target and stops when it finds it; then the right pointer moves left looking for something less than target, stops when it finds it, and swaps with the value found by the left pointer. The pointers keep moving like this until the right pointer's index is no longer greater than the left pointer's. At that point everything on the left is less than target and everything on the right is greater than target.
  • The pivot target is then swapped with the element at the right pointer, and the array is split into a left part and a right part at the pivot's position.
  • Quicksort the elements to the left and to the right of target
  • Repeat (implemented with recursion) until each part has at most one element



Code implementation:

function sortArray(array, start = 0, end = array.length - 1) {
    // Default parameters let the function be called as sortArray(nums) on Leetcode
    if (end - start < 1) {
        return array; // Base case of the recursion: a range with fewer than two elements is already sorted
    }
    const target = array[start]; // The pivot value is the left-most element
    // Initialize the left (l) and right (r) pointers
    let l = start;
    let r = end;
    while (l < r) {
        // While the pointers have not met and the right pointer's value is not smaller than the pivot, keep moving left
        while (l < r && array[r] >= target) {
            r--; // move
        }
        // Leaving the loop means a value smaller than the pivot was found; fill it into the hole on the left
        array[l] = array[r];
        // While the pointers have not met and the left pointer's value is smaller than the pivot, keep moving right
        while (l < r && array[l] < target) {
            l++; // move
        }
        // Leaving the loop means a value not smaller than the pivot was found; fill it into the hole on the right
        array[r] = array[l];
    }
    // The pointers have met: put the pivot into its final position
    array[r] = target;
    sortArray(array, start, r - 1);
    sortArray(array, r + 1, end);
    return array;
}

If you find that hard to follow, you can also learn to write the version below, which matches most people's intuition better (but wastes a lot of extra space). Create two arrays, left and right, to hold the elements smaller and larger than target; each recursive call simply returns the sorted left, target, and the sorted right joined together.

// Quicksort, easy-to-understand version
function sortArray(array) {
    if (array.length <= 1) {
        // Fewer than two elements need no sorting
        return array;
    }
    const target = array[0]; // The pivot value is the left-most element
    const left = [];
    const right = [];
    for (let i = 1; i < array.length; i++) {
        if (array[i] < target) {
            left.push(array[i]); // Elements smaller than the pivot go to the left
        } else {
            right.push(array[i]); // Elements not smaller than the pivot go to the right
        }
    }
    // Recurse on both halves and join the results
    return sortArray(left).concat([target], sortArray(right));
}

There are also different ways to write quicksort depending on where the pivot is taken from. Let's write a version that takes it from the middle. Here it is:

function sortArray(arr, left = 0, right = arr.length - 1) {
    // Define the recursion boundary: an array with only one element needs no sorting
    if (arr.length > 1) {
        // lineIndex is the index where the next subarrays split
        const lineIndex = partition(arr, left, right)
        // If the left subarray has at least one element, quicksort it recursively
        if (left < lineIndex - 1) {
            // The left subarray is bounded by lineIndex - 1
            sortArray(arr, left, lineIndex - 1)
        }
        // If the right subarray has at least one element, quicksort it recursively
        if (lineIndex < right) {
            // The right subarray starts at lineIndex
            sortArray(arr, lineIndex, right)
        }
    }
    return arr
}
// Partition the array into left and right subarrays around the pivot value
function partition(arr, left, right) {
    // The pivot value defaults to the middle element
    let pivotValue = arr[Math.floor(left + (right - left) / 2)]
    // Initialize the left and right pointers
    let i = left
    let j = right
    // While the pointers have not crossed, keep executing the logic below
    while (i <= j) {
        // If the left pointer points at an element smaller than the pivot, move it to the right
        while (arr[i] < pivotValue) {
            i++
        }
        // If the right pointer points at an element larger than the pivot, move it to the left
        while (arr[j] > pivotValue) {
            j--
        }
        // If i <= j, there is a larger element on the left or a smaller element on the right of the pivot; swap them
        if (i <= j) {
            [arr[i], arr[j]] = [arr[j], arr[i]];
            i++
            j--
        }
    }
    // Return the left pointer's index as the split point for the next subarrays
    return i
}

Leetcode submission results (to see how efficient it is):



Sorting analysis: Time complexity: O(n log n) on average (and in practice quicksort is usually faster than other O(n log n) sorts) Space complexity: O(log n) (consumed by the recursive calls) Stability: unstable

F. Finally, the built-in sort

Fortunately, JavaScript has a built-in method for this, and the code is below. Using sort is very simple.

// The built-in sort
// What sort does under the hood is discussed below
var sortArray = function(nums) {
    return nums.sort((a, b) => {
        return a - b;
    });
};

Leetcode Submission results: Sort is excellent

Want to know why sort performs so well? Because of its pedigree 😜

The sort implementation in the V8 engine used to rely on only two algorithms, InsertionSort and QuickSort: insertion sort for arrays of length 22 or less, and quicksort for longer arrays. (Newer versions of V8 have since switched to TimSort.)

The source code reads:

 // In-place QuickSort algorithm. 
   // For short (length <= 22) arrays, insertion sort is used for efficiency.

In addition, other engines implement sort as follows:

Webkit: The underlying implementation uses the C++ library qsort() method (jsarray.cpp source code).

Thank you for reading. I hope you'll offer suggestions for my learning. I'm also preparing to look for an internship in April and May, and I hope a senior engineer will reach out. Friends who want to study with me can also add me on WeChat 😜 WeChat ID: XiehB-frontend -178
