Chapter 4 – Quicksort

Divide and conquer

Divide-and-conquer is not a specific algorithm but a general idea for solving problems. Its core steps are:

  • Identify the simplest base cases (the termination conditions);
  • Figure out how to shrink the problem until it reaches a base case.

Case 1

How do you divide a field evenly into identical squares while making the squares as large as possible?

Termination condition: the length of one side of the remaining land is an integer multiple of the other side.

For example, in the following case, the largest square has a side length of 25m.

Recursive condition: carve out the largest square that fits, then apply the same algorithm to the remaining land.

Eventually we find the largest square that divides the whole field evenly!
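The procedure above is the Euclidean algorithm in disguise. A minimal sketch in JavaScript, using a hypothetical 1680 m × 640 m field (the dimensions are my example; the illustration this text refers to is not reproduced here):

```javascript
// Largest square that evenly tiles a width × height field.
// Base case: one side divides the other evenly → the shorter side is the answer.
// Recursive case: tile off squares of the shorter side and recurse on the rest.
const largestSquare = (width, height) => {
  const long = Math.max(width, height)
  const short = Math.min(width, height)
  if (long % short === 0) {
    return short // base case: the field divides evenly into short × short squares
  }
  return largestSquare(short, long % short) // shrink the problem
}

console.log(largestSquare(1680, 640)) // 80 — this is just gcd(1680, 640)
```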

Case 2

Given an array of numbers, add the numbers and return the result.

[2, 4, 6]

Using a loop:

const getSum = (arr) => {
  let sum = 0

  for (let i = 0, len = arr.length; i < len; i++) {
    sum += arr[i]
  }
  return sum
}

Use divide and conquer:

Terminating condition: The array is empty or has only one entry.

[]  // an empty array: the sum is 0
[5] // a single-entry array: the sum is 5

Recursive condition: Reduce the size of the array (reduce the problem size).

sum([2, 4, 6])

2 + sum([4, 6])

2 + 4 + sum([6])

The implementation code is as follows:

const getSum = (arr) => {
  if (arr.length === 0) {
    return 0 // base case: an empty array sums to 0
  } else {
    const firstNum = arr.shift()
    return firstNum + getSum(arr)
  }
}
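As an aside, `shift` mutates the caller's array. A non-mutating variant (my sketch, not the book's code) uses `slice` instead:

```javascript
const getSum = (arr) => {
  if (arr.length === 0) {
    return 0 // base case: an empty array sums to 0
  }
  // First element plus the sum of the rest; slice leaves the input untouched
  return arr[0] + getSum(arr.slice(1))
}

const nums = [2, 4, 6]
console.log(getSum(nums)) // 12
console.log(nums) // [2, 4, 6] — the original array is unchanged
```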

Practice after class

The answer

4.2

Terminating condition: the array is empty, so its length is 0; or the array has only one entry, so its length is 1.

Recursive condition: Reduce the size of the array (reduce the problem size).

const getLength = (arr) => {
  if (JSON.stringify(arr) === '[]') {
    // only an empty array stringifies to '[]'
    return 0
  } else {
    arr.shift()
    return 1 + getLength(arr)
  }
}

This is the same pattern as example 2 above, though it always feels a little odd to check for an empty array without using length, hh.

4.3

Terminating condition: the array has only two items; compare them, and the larger one is the maximum.

Recursive condition: Reduce the size of the array (reduce the problem size).

const getMax = (arr) => {
  if (arr.length === 2) {
    return arr[0] > arr[1] ? arr[0] : arr[1]
  } else {
    const max = arr.shift()
    return Math.max(max, getMax(arr))
  }
}
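The version above assumes at least two elements: with a single-element array, `shift` would leave an empty array for the recursive call, which never reaches the base case. A variant (my tweak, not from the answer) whose base case is one element:

```javascript
const getMax = (arr) => {
  if (arr.length === 1) {
    return arr[0] // base case: a single element is the maximum
  }
  const [first, ...rest] = arr // non-mutating split into head and tail
  return Math.max(first, getMax(rest))
}

console.log(getMax([7])) // 7
console.log(getMax([3, 9, 1])) // 9
```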

4.4

Base condition: the search range has only one element. If that element equals the target, it is found; otherwise it is not.

Recursion condition: Shrink the array by half each time.

const binarySearch = (list, item) => {
  let low = 0
  let high = list.length - 1

  while (low <= high) { // <=, so a one-element range is still checked
    const mid = Math.floor((low + high) / 2)
    if (list[mid] === item) {
      return mid
    }
    if (list[mid] < item) {
      low = mid + 1
    }
    if (list[mid] > item) {
      high = mid - 1
    }
  }

  return null
}

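For comparison, the same search can also be written recursively, which matches the divide-and-conquer framing more directly (a sketch of mine, not from the text):

```javascript
const binarySearchRec = (list, item, low = 0, high = list.length - 1) => {
  if (low > high) {
    return null // base case: the range is empty, item not found
  }
  const mid = Math.floor((low + high) / 2)
  if (list[mid] === item) return mid
  return list[mid] < item
    ? binarySearchRec(list, item, mid + 1, high) // search the right half
    : binarySearchRec(list, item, low, mid - 1) // search the left half
}

console.log(binarySearchRec([1, 3, 5, 7, 9], 7)) // 3
```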

Quick sort

Quicksort is also divide-and-conquer

Quicksort is a commonly used sorting algorithm and is much faster than selection sort. It, too, is built on divide-and-conquer.

Baseline condition: If the array is empty or has only one element, the sort is done.

Recursive condition: Reduce the length of the array.

Pick an element of the array as the pivot. Elements smaller than the pivot go to its left, and elements larger go to its right. Then recursively sort the left and right subarrays until everything is sorted.

The code implementation is as follows:

const quickSort = (arr) => {
  if (arr.length < 2) {
    // Terminating condition: 0 or 1 element is already sorted
    return arr
  } else {
    const flag = arr[0] // the pivot
    const less = []
    const greater = []
    for (let i = 1, len = arr.length; i < len; i++) {
      if (arr[i] < flag) {
        less.push(arr[i]) // smaller than the pivot
      } else {
        greater.push(arr[i]) // greater than or equal to the pivot
      }
    }
    return [...quickSort(less), flag, ...quickSort(greater)] // merge
  }
}

Optimizing spatial complexity

Above, we defined two extra arrays, less and greater, to hold the elements smaller and larger than the pivot.

We can avoid that extra space by swapping elements within the array itself:

const swap = (arr, i, j) => {
  [arr[i], arr[j]] = [arr[j], arr[i]] // swap two elements in place
}

const findCenter = (arr, left, right) => {
  const flag = arr[left] // pivot: the first element of this range
  let idx = left + 1 // next slot for an element smaller than the pivot
  for (let i = idx; i <= right; i++) {
    if (arr[i] < flag) {
      swap(arr, i, idx) // move smaller elements into the left section
      idx++
    }
  }
  swap(arr, left, idx - 1) // put the pivot just right of the smaller elements
  return idx - 1 // the pivot's final index
}

const quickSort = (arr, left = 0, right = arr.length - 1) => {
  if (left < right) {
    const center = findCenter(arr, left, right)
    quickSort(arr, left, center - 1) // sort the part left of the pivot
    quickSort(arr, center + 1, right) // sort the part right of the pivot
  }
  return arr // left >= right: this range is sorted
}

Worst case and best case

In the code above, if the element chosen as the pivot happens to be the minimum or maximum value of the array, the worst case occurs.

Let's analyze the time complexity. Each level of partitioning has to touch every element, which is O(n), and with such a pivot the recursive call stack is n levels deep.

So the worst-case time is O(n) work × n levels = O(n^2).

Things are much better if the pivot happens to be the median:

The call stack is then only log n levels deep, each level still doing O(n) work, so the time complexity is O(n log n).

So quicksort's worst-case time is O(n^2) and its average time is O(n log n); the average case is also the best case.
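A common way to dodge the worst case (standard practice, though not shown in the text above) is to pick the pivot at random, so that already-sorted input no longer triggers O(n^2):

```javascript
const quickSort = (arr) => {
  if (arr.length < 2) return arr // base case: already sorted
  // Choose a random pivot index instead of always taking the first element
  const idx = Math.floor(Math.random() * arr.length)
  const flag = arr[idx]
  const less = []
  const greater = []
  for (let i = 0; i < arr.length; i++) {
    if (i === idx) continue // skip the pivot itself
    if (arr[i] < flag) less.push(arr[i])
    else greater.push(arr[i]) // >= pivot
  }
  return [...quickSort(less), flag, ...quickSort(greater)]
}

console.log(quickSort([10, 5, 2, 3])) // [2, 3, 5, 10]
```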

Quicksort and merge sort

Quicksort and merge sort are both O(n log n), and merge sort holds that bound even in the worst case. Why, then, use quicksort instead of merge sort?

In practice, each step of an algorithm takes some fixed amount of time, called the constant.

We ignore constants when reasoning about time complexity.

For example, take simple search and binary search, and suppose simple search's constant is 10 milliseconds per step while binary search's is 1 second per step.

Now search a list of 4 billion elements:

  • Simple search: 10 ms × 4 billion ≈ 463 days
  • Binary search: 1 s × log₂(4 billion) ≈ 32 seconds
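The arithmetic behind those figures can be checked directly (the per-step constants are the hypothetical ones from the text):

```javascript
// Hypothetical constants from the example: 10 ms per simple-search step,
// 1 s per binary-search step, searching a list of 4 billion elements.
const n = 4e9
const simpleMs = 10 * n // total simple-search time in milliseconds
const simpleDays = simpleMs / 1000 / 60 / 60 / 24
const binarySeconds = Math.ceil(Math.log2(n)) // ~32 steps at 1 s each

console.log(Math.round(simpleDays)) // 463 days
console.log(binarySeconds) // 32 seconds
```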

As you can see, binary search is still far faster; the constants barely matter here.

But for quicksort and merge sort, constants matter.

Quicksort has a smaller constant than merge sort, so when both run in O(n log n) time, quicksort is faster.

But you might say: merge sort is consistent, its worst case is still O(n log n), while quicksort's worst case is O(n^2). Why not worry about the worst case?

In fact, in the vast majority of runs quicksort hits the average case, which is also its best case; the worst case is rare, so it is usually not a concern.

Practice after class

Using big O notation, state how long each of the following operations takes:

4.5 Prints the value of each element in the array.

4.6 Multiply the value of each element in the array by 2.

4.7 Only multiply the value of the first element in the array by 2.

4.8 Create a multiplication table from the elements of the array: if the array is [2, 3, 7, 8, 10], first multiply each element by 2, then by 3, then by 7, and so on.

The answer

4.5

O(n)

4.6

O(n)

4.7

O(1)

4.8

O(n^2)
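For 4.8, a sketch of the nested loops makes the n × n operation count visible (the helper name is mine):

```javascript
const multiplicationTable = (arr) => {
  const table = []
  // For each of the n elements, multiply all n elements by it:
  // n × n = n^2 multiplications in total, hence O(n^2).
  for (const a of arr) {
    table.push(arr.map((b) => a * b))
  }
  return table
}

console.log(multiplicationTable([2, 3, 7, 8, 10])[0]) // row for 2: [4, 6, 14, 16, 20]
```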

Summary

  • Divide-and-conquer breaks a problem down step by step. When applying it to a list, the base case is usually an empty array or an array with a single element.
  • In quicksort, choosing the pivot at random gives the average-case time complexity, which is also the best case.
  • Constant factors also affect an algorithm's speed. When complexities differ (simple search vs. binary search), the constant hardly matters; when complexities are the same (quicksort vs. merge sort), the constant matters a great deal.