Make writing a habit together! This is my first day participating in the "Juejin Daily New Plan · April Writing Challenge".

Preface

Hello, everyone.

Learning, especially learning algorithms, is not something you can do by reading alone. Much of the time you feel you understand an idea, yet writing the code is another matter, so you have to both understand the idea and implement it yourself. And even after implementing it, you may forget it over time, so regular summary and review is also necessary.

This article is a personal review of six common sorting algorithms. In it you will learn about

  • Bubble sort
  • Selection sort
  • Insertion sort
  • Quick sort
  • Merge sort
  • Counting sort

covering the basic idea, implementation approach, and code implementation of each of these six common sorting algorithms.

Terminology

Time complexity:

Time complexity describes the running time of an algorithm. Write the number of basic operations the algorithm performs as a function of the input size n, denoted T(n). Time complexity is usually written as O(f(n)), where T(n) = O(f(n)) and f(n) describes how many times the basic operations are repeated.
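
As a small illustration (the countOperations helper below is made up for this example), we can count the basic operations of a nested double loop: the inner operation runs n * n times, so T(n) = n², which is O(n²):

// A made-up helper that counts the basic operations of a nested double loop
const countOperations = (n) => {
	let operations = 0;
	for (let i = 0; i < n; i++) {
		for (let j = 0; j < n; j++) {
			// one basic operation, e.g. a comparison
			operations++;
		}
	}
	return operations;
};

console.log(countOperations(10)); // 100, i.e. T(10) = 10^2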

Space complexity:

Space complexity is a measure of the storage an algorithm requires while it executes, denoted S(n) = O(f(n)). The storage required during execution consists of three parts:

  • Space occupied by algorithmic programs;
  • The storage space occupied by the initial data input;
  • The extra space required during algorithm execution.
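
As a rough illustration (the two reverse helpers below are invented for this example and are not part of any sorting algorithm): an in-place operation that only keeps a few temporary variables needs O(1) extra space, while an operation that builds a new array of length n needs O(n) extra space:

// In-place reverse: only a few temporary variables, so the extra space is O(1)
const reverseInPlace = (arr) => {
	let i = 0;
	let j = arr.length - 1;
	while (i < j) {
		const temp = arr[i];
		arr[i] = arr[j];
		arr[j] = temp;
		i++;
		j--;
	}
	return arr;
};

// Copying reverse: allocates a new array of length n, so the extra space is O(n)
const reverseCopy = (arr) => {
	const result = [];
	for (let i = arr.length - 1; i >= 0; i--) {
		result.push(arr[i]);
	}
	return result;
};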

Sorting algorithm stability:

Suppose the sequence to be sorted contains multiple records with equal keys. If the relative order of these records is unchanged after sorting, that is, if A1 = A2 and A1 comes before A2 in the original sequence, then A1 still comes before A2 in the sorted sequence, the sorting algorithm is said to be stable; otherwise it is unstable.
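
As a quick illustration (the record objects below are invented for this example), stability matters when equal keys belong to records that carry other information. Array.prototype.sort is required to be stable in modern JavaScript engines, so equal scores keep their original relative order:

const records = [
	{ name: 'Alice', score: 4 },
	{ name: 'Carol', score: 2 },
	{ name: 'Bob', score: 4 },
];

// A stable sort keeps 'Alice' before 'Bob', because both have score 4
const sorted = [...records].sort((a, b) => a.score - b.score);
console.log(sorted.map((r) => r.name)); // ['Carol', 'Alice', 'Bob']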

1. Bubble sort

1.1 Basic Ideas

We know that bubbles in water grow bigger as they rise, and bubble sort, as the name suggests, is a sorting process that lets the big "bubbles" float up. A complete traversal from the first position of the unordered sequence to the last is called one bubble pass. Adjacent elements (a, b) are compared: if a on the left is larger than b on the right, a and b are swapped. Each pass fixes the position of one element and shrinks the range of the unordered sequence; the unordered sequence is bubbled again and again until every element's position is fixed, and the sort is complete.

1.2 Implementation Roadmap

  1. If the left element is greater than the right element, swap them;
  2. After one full pass over the unordered sequence, the largest number has been pushed to the right end of the unordered sequence, so the right boundary of the unordered sequence can be reduced by one;
  3. Repeat steps 1 and 2 until only one element remains in the unordered sequence and the sorting is complete.

1.3 Sort animation

1.4 Code Implementation

1.4.1 Method One

Conventional one-way bubbling

/**
 * Regular one-way bubble sort
 * @param {*} arr
 */
const bubbleSort = (arr) => {
	console.time('Regular one-way traversal bubbling');

	const len = arr.length;

	for (let i = 0; i < len; i++) {
		let j = 0;
		// The right end of the unordered sequence
		const end = len - 1 - i;

		while (j < end) {
			if (arr[j] > arr[j + 1]) {
				let temp = arr[j];
				arr[j] = arr[j + 1];
				arr[j + 1] = temp;
			}
			j++;
		}
	}

	console.timeEnd('Regular one-way traversal bubbling');

	return arr;
};

// test case:
const arr = Array.from({ length: 50 }).map(() =>
	Math.floor(Math.random() * 100));
console.log(bubbleSort(arr));


1.4.2 Method Two

Double-ended bubbling.

Method one bubbles in one direction only, from left to right, once per pass, so each pass only shrinks the right boundary of the unordered sequence. Double-ended bubbling differs in that, within one pass over the unordered sequence, it bubbles once from left to right (pushing a large value to the right) and once from right to left (pushing a small value to the left), so both boundaries of the unordered sequence shrink at the same time.

/**
 * Double-ended bubble sort implementation
 */
const bubbleSort = (arr) => {
	console.time('Two-ended bubble sort time');

	let low = 0;
	let high = arr.length - 1;
	let temp;

	// Example: 6, 8, 35, 28, 32, 38, 42
	//          ^low                  ^high
	// Each pass of the while loop bubbles once in each direction, shrinking the unordered range [low, high] by 2
	while (low < high) {
		// Iterate from low to high; each for loop fixes one large number at the right end
		for (let i = low; i < high; i++) {
			// If the left side is larger than the right, swap
			if (arr[i] > arr[i + 1]) {
				temp = arr[i];
				arr[i] = arr[i + 1];
				arr[i + 1] = temp;
			}
		}
		high--;

		// Iterate from high to low; each for loop fixes one small number at the left end
		for (let j = high; j > low; j--) {
			if (arr[j] < arr[j - 1]) {
				temp = arr[j];
				arr[j] = arr[j - 1];
				arr[j - 1] = temp;
			}
		}
		low++;
	}

	console.timeEnd('Two-ended bubble sort time');

	return arr;
};

// test case:
const arr = Array.from({ length: 50 }).map(() =>
	Math.floor(Math.random() * 100));
console.log(bubbleSort(arr));

1.4.3 Differences

In terms of time complexity and space complexity the two methods are the same. In my own tests, when the array is short the actual running times of the two methods are roughly the same, while for longer arrays (I tried lengths of 100 and 100,000) double-ended bubbling takes less actual time than regular one-way bubbling.
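
If you want to reproduce the comparison yourself, the sketch below is one possible way to do it (the benchmark helper and the bubbleSortOneWay / bubbleSortTwoWay names are my own; they are assumed to be the two implementations above placed in the same file under different names):

// A rough timing sketch; sortFn is assumed to be one of the bubble sort implementations above
const benchmark = (sortFn, length) => {
	const data = Array.from({ length }).map(() =>
		Math.floor(Math.random() * length));
	const start = Date.now();
	sortFn(data);
	return `${Date.now() - start}ms`;
};

// Hypothetical usage:
// console.log(benchmark(bubbleSortOneWay, 100000));
// console.log(benchmark(bubbleSortTwoWay, 100000));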

1.5 Algorithm Complexity

Time complexity: O(n²);

Space complexity: only a constant number of temporary variables are used, so the space complexity is O(1).

1.6 Stability of sorting algorithm

We only swap when the left element is strictly larger than the right one, so the relative order of equal elements never changes after sorting; bubble sort is stable.

2. Selection sort

2.1 Basic Ideas

Find the smallest number from the unordered sequence, place it in the ordered sequence, and repeat the process until there is only one (the largest number) left in the unordered sequence.

2.2 Implementation Roadmap

  1. The initial ordered sequence is empty, and the unordered sequence is the entire array to be sorted.
  2. Traverse the unordered sequence to find the smallest number, swap it into the first position of the unordered sequence (appending it to the ordered sequence), and then move the left boundary of the unordered sequence right by one;
  3. Repeat step 2 until there is only one number left in the unordered sequence. This number is the largest and automatically ends up in the last position. At this point the sort is complete.

2.3 Sort animation

2.4 Code Implementation

/**
 * Selection sort
 * Each pass finds the smallest number in the unordered sequence and appends it to the ordered sequence
 */
const selectionSort = (arr) => {
	console.time('Selection sort time');

	const len = arr.length;
	let minIndex;
	// The starting index of an unordered sequence is 0
	let i = 0;
	let temp;

	while (i < len - 1) {
		minIndex = i;
		let j = i;
		// Find the smallest index from the unordered sequence
		while (j < len) {
			if (arr[j] < arr[minIndex]) {
				minIndex = j;
			}
			j++;
		}
		// Place the smallest number found above from the unordered sequence into the ordered sequence on the left
		temp = arr[i];
		arr[i] = arr[minIndex];
		arr[minIndex] = temp;
		// Move the starting index of an unordered sequence right one bit
		i++;
	}
	console.timeEnd('Selection sort time');

	return arr;
};

// test case:
const arr = Array.from({ length: 50 }).map(() =>
	Math.floor(Math.random() * 100));
console.log(selectionSort(arr));

2.5 Algorithm Complexity

Time complexity: O(n²);

Space complexity: O(1).

2.6 Stability of sorting algorithm

If an element A is smaller than the current element B1, and A appears after another element B2 that is equal to B1, then the swap breaks stability. For example, in the sequence 4, 9, 4, 2, 7, the first pass swaps the first 4 with 2, so the relative order of the two 4s in the original sequence changes; therefore selection sort is an unstable sorting algorithm.
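
To make the instability visible, one option is to tag the two equal elements and run the same selection logic on objects (the value / tag fields below are invented purely for this demonstration):

const items = [
	{ value: 4, tag: 'first 4' },
	{ value: 9, tag: '' },
	{ value: 4, tag: 'second 4' },
	{ value: 2, tag: '' },
	{ value: 7, tag: '' },
];

// The same selection-sort logic as in section 2.4 above, but comparing the value field
for (let i = 0; i < items.length - 1; i++) {
	let minIndex = i;
	for (let j = i + 1; j < items.length; j++) {
		if (items[j].value < items[minIndex].value) minIndex = j;
	}
	const temp = items[i];
	items[i] = items[minIndex];
	items[minIndex] = temp;
}

console.log(items.map((item) => item.tag || item.value));
// [2, 'second 4', 'first 4', 7, 9]: the two 4s have swapped their relative order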

3. Insertion sort

3.1 Basic Ideas

The process of insertion sort is similar to playing poker: each card we pick up is inserted into the position it belongs to among the cards already in our hand, according to its value. For example, if the cards in our hand are 3, 4, 6, 9, 10, J and we pick up an 8, we insert it between the 6 and the 9 and shift 9, 10, J back one position in turn. In insertion sort, we take a number target from the unordered sequence, scan the ordered sequence from back to front to find the first number that is less than or equal to target, and insert target right after that number; then we take the next number from the unordered sequence and repeat.

3.2 Implementation Roadmap

  1. The starting index i of the unordered sequence begins at 1, because the element at index 0 can be regarded as already belonging to the ordered sequence;
  2. Take the first number of the unordered sequence on the right as the number to insert, target;
  3. Scan the ordered sequence on the left from back to front to find the first number that is less than or equal to target; shift every number after that position back by one, insert target right after it, and move the left boundary of the unordered sequence right by one;
  4. Repeat steps 2 and 3 until the sort is complete.

3.3 Sort animation

3.4 Code Implementation

3.4.1 Method One

/**
 * Insertion sort
 */
const insertionSort = (arr) => {
	console.time('Insert sort time');

	const len = arr.length;

	// i is the starting index of the unordered sequence, starting at 1, since index 0 can be considered already in the ordered sequence
	for (let i = 1; i < len; i++) {
		// The number to insert
		const target = arr[i];
		// scanIndex is a scan pointer. The initial value is the last index of the ordered sequence
		let scanIndex = i - 1;

		// Find the first number less than target or equal to target or scan the left end of the ordered sequence
		while (scanIndex >= 0 && arr[scanIndex] > target) {
			// During the scan, the number larger than target is moved one bit later
			arr[scanIndex + 1] = arr[scanIndex];
			// Move the scan pointer forward one bit
			scanIndex--;
		}
		// Insert target at the corresponding position
		arr[scanIndex + 1] = target;
	}

	console.timeEnd('Insert sort time');

	return arr;
};

// test case:
const arr = Array.from({ length: 100 }).map(() =>
	Math.floor(Math.random() * 1000));
console.log(insertionSort(arr));

3.4.2 Method Two

The idea is basically the same as method one, except that binary search is used to find the insertion position (the spot just after the last number that is less than or equal to target).

/**
 * Insertion sort using binary search
 */
const insertionSort = (arr) => {
	console.time('Insert sort time using binary lookup');

	const len = arr.length;

	// i is the starting index of the unordered sequence, starting at 1, since index 0 can be considered already in the ordered sequence
	for (let i = 1; i < len; i++) {
		// The number to insert
		const target = arr[i];
		// binary lookup pointer
		let left = 0;
		let right = i - 1;
		let mid;

		// When the while loop ends, left is the insertion position: the index of the first number greater than target (or i if every ordered element is less than or equal to target)
		while (left <= right) {
			// Binary search
			mid = left + Math.floor((right - left) / 2);
			if (arr[mid] <= target) {
				left = mid + 1;
			} else {
				right = mid - 1;
			}
		}

		for (let j = i - 1; j >= left; j--) {
			// Move the number larger than target one bit further back
			arr[j + 1] = arr[j];
		}

		// Insert target at the corresponding position
		arr[left] = target;
	}

	console.timeEnd('Insert sort time using binary lookup');

	return arr;
};

// test case:
const arr = Array.from({ length: 100 }).map(() =>
	Math.floor(Math.random() * 1000));
console.log(insertionSort(arr));

3.4.3 Differences

In terms of time complexity and space complexity the two methods are the same. In my own tests, when the array is short the actual running times are roughly the same, while for long arrays the insertion sort that uses binary search takes less actual time than the regular version. The original test screenshots are no longer available, but you can try it yourself if you are interested; the code address is provided at the end of this article.

3.5 Algorithm Complexity

Time complexity: O(n²);

Space complexity: O(1).

3.6 Stability of sorting algorithm

We always take the first element of the unordered sequence and insert it after the last element that is less than or equal to it, so the relative order of equal elements does not change after sorting; insertion sort is stable.

4. Quicksort

4.1 Basic Ideas

Select a base number from the array to be sorted, put all the numbers smaller than it to its left, and put the numbers larger than it to its right. At this time, it divides an unordered sequence into two sub-sequences, and then performs the same processing on the left and right sub-sequences using recursion. The end point of recursion is that if there are only 1 or 0 numbers in the subsequence to be processed, the subsequence is considered to be ordered, and eventually the entire sequence will be ordered. This is a classic divide-and-conquer idea.

4.2 Implementation Roadmap

  1. Select a base number (pivot) from the sequence; any number works, and for convenience we choose the first one;
  2. Traverse the unordered sequence: numbers smaller than the pivot go to its left, numbers larger than it go to its right, and numbers equal to it may stay on either side;
  3. In this traversal we first scan from the right end of the sequence toward the left with index j: while arr[j] is greater than or equal to the pivot, execute j-- until the first value smaller than the pivot is found. Then we scan from the left toward the right with index i: while arr[i] is less than or equal to the pivot, execute i++ until the first value larger than the pivot is found, and then swap arr[i] and arr[j]. This right-then-left search continues until i and j meet. The pivot is still in the first position, so finally we swap arr[i] with the pivot; now every number to the left of the pivot is less than or equal to it and every number to the right is greater than or equal to it;
  4. The left and right subsequences are then handled the same way by recursion;
  5. When a subsequence to be processed has only 1 or 0 numbers it is considered ordered, and eventually the whole sequence becomes ordered.

4.3 Sort Animation

4.4 Code Implementation

/**
 * Swap two items in the array
 */
const swap = (arr, i, j) => {
	const temp = arr[j];
	arr[j] = arr[i];
	arr[i] = temp;
};

const quickSort = (arr, left, right) => {
	// If left >= right, there is no element or only one element in the sequence
	if (left >= right) return;

	// take the first number in the sequence as the base number
	const pivot = arr[left];
	let i = left;
	let j = right;

	// while terminates when I equals j
	while (i < j) {
		// Scan from right to left; this while stops at the first value smaller than the pivot
		while (i < j && arr[j] >= pivot) {
			j--;
		}
		// Then scan from left to right to find the first value greater than the pivot
		while (i < j && arr[i] <= pivot) {
			i++;
		}
		// Swap the two numbers
		swap(arr, i, j);
	}

	// Move the pivot into its final position in the middle
	swap(arr, i, left);
	// Perform the same sort on the left and right subsequences
	quickSort(arr, left, i - 1);
	quickSort(arr, i + 1, right);
};

// test case:
const arr = Array.from({ length: 100 }).map(() =>
	Math.floor(Math.random() * 1000));
quickSort(arr, 0, arr.length - 1);
console.log(arr);

4.5 Algorithm Complexity

Time complexity: O(n log n) on average; in the worst case, when the pivot always ends up at one end, it degrades to O(n²);

Space complexity: recursion is used and each recursive call only needs constant space, so the space complexity is the recursion depth, O(log n).

4.6 Stability of sorting algorithm

Quicksort is not stable. For example, sorting 2a, 2b, 1 (two equal 2s distinguished by letters) with the code above, the pivot 2a is finally swapped with the 1, giving 1, 2b, 2a, so the relative order of the two equal elements changes.

5. Merge sort

5.1 Basic Ideas

Divide a sequence to be sorted into two sub-sequences (divide), then use merge sort to sort the two sub-sequences respectively (conquer), and merge the two sorted sub-sequences into an ordered sequence (combine), which also uses the divide-and-conquer idea.

5.2 Sort Animation

5.3 Implementation Roadmap

5.3.1 Recursion (Top-down)

  1. The sequence to be sorted is divided into two sub-sequences from the middle position;
  2. Then the two subsequences are each sorted by merge sort recursively. The recursion bottoms out when a subsequence contains only one number, at which point it is considered ordered.
  3. A larger ordered sequence is obtained by merging the ordered subsequences in pairs; the merging process is as follows:
    • Create a result array to hold the merge result;
    • Set two pointers, starting at the heads of the two sorted sequences;
    • Compare the elements the two pointers point to, place the smaller one in the result array, and move that pointer to the next position;
    • Repeat the previous step until a pointer reaches the end of the sequence;
    • Then all the remaining elements of the other sequence are copied directly to the end of the merged sequence.

5.3.2 Code Implementation One

This implementation consumes the left and right subarrays (shift empties them as it goes), and the sorted result is a new array rather than the original one.

/**
 * Merge two ordered subsequences
 */
const merge = (left, right) => {
	const result = [];

	while (left.length && right.length) {
		// Since both left and right are ordered, compare their head elements and push the smaller one into the result array first
		if (left[0] < right[0]) {
			result.push(left.shift());
		} else {
			result.push(right.shift());
		}
	}

	// When one array is empty, push the rest of the other array into result
	while (left.length) {
		result.push(left.shift());
	}

	while (right.length) {
		result.push(right.shift());
	}

	return result;
};

const mergeSort = (arr) => {
	const len = arr.length;
	// If there are only 1 or 0 items, there is no need to sort
	if (len < 2) return arr;

	// Split the sequence into two subsequences at the middle position
	const mid = Math.floor(len / 2);
	const left = arr.slice(0, mid);
	const right = arr.slice(mid);

	// Sort the two subarrays recursively,
	// then merge the two ordered subarrays into one ordered array
	return merge(mergeSort(left), mergeSort(right));
};

// test case:
const arr = Array.from({ length: 30 }).map(() =>
	Math.floor(Math.random() * 1000));
console.log(mergeSort(arr));

5.3.3 Code Implementation Two

This implementation uses index pointers instead of consuming the left and right subarrays, and it sorts the original array in place.

/**
 * Merge the two ordered subsequences of arr within [left, right], split at mid
 */
const merge = (arr, left, mid, right) => {
	let i = left;
	let j = mid + 1;
	// Temporarily store the merged result
	const result = [];

	// Compare the head elements of the two subsequences and push the smaller one into the result array first
	while (i <= mid && j <= right) {
		if (arr[i] <= arr[j]) {
			result.push(arr[i++]);
		} else {
			result.push(arr[j++]);
		}
	}

	// Push the remaining elements of whichever subsequence is left into the result array
	while (i <= mid) {
		result.push(arr[i++]);
	}
	while (j <= right) {
		result.push(arr[j++]);
	}

	// Copy the elements saved in result back to their positions in arr
	// Here we copy from back to front
	while (right >= left) {
		arr[right] = result[right - left];
		right--;
	}
};

const sort = (arr, left, right) => {
	// If the subsequence has only 1 or 0 items, no sorting is required
	if (right <= left) return;

	const mid = left + Math.floor((right - left) / 2);

	// Sort the two subsequences recursively,
	sort(arr, left, mid);
	sort(arr, mid + 1, right);
	// then merge the two ordered subsequences into one larger ordered sequence
	merge(arr, left, mid, right);
};

const mergeSort = (arr) => {
	sort(arr, 0, arr.length - 1);
};

// test case:
const arr = Array.from({ length: 30 }).map(() =>
	Math.floor(Math.random() * 1000));
mergeSort(arr);
console.log(arr);

5.3.4 Iterative Method (Bottom-up)

  1. Merge each pair of adjacent elements of the sequence to form ceil(n/2) sequences; after this pass each sequence contains two (or one) sorted elements;
  2. If the number of sequences is greater than 1, merge the above sequences again to form ceil(n/4) sequences, each containing four (or fewer) elements;
  3. Repeat step 2 until all elements are sorted, that is, until only one sequence remains.

5.3.5 Code Implementation

The merge procedure in this implementation is the same as in the second recursive implementation above, and it also sorts the original array in place.

/**
 * Merge the two ordered subsequences of arr within [left, right], split at mid
 */
const merge = (arr, left, mid, right) => {
	let i = left;
	let j = mid + 1;
	// Temporarily store the merged result
	const result = [];

	// Compare the head elements of the two subsequences and push the smaller one into the result array first
	while (i <= mid && j <= right) {
		if (arr[i] <= arr[j]) {
			result.push(arr[i++]);
		} else {
			result.push(arr[j++]);
		}
	}

	// Push the remaining elements of whichever subsequence is left into the result array
	while (i <= mid) {
		result.push(arr[i++]);
	}
	while (j <= right) {
		result.push(arr[j++]);
	}

	// Copy the elements saved in result back to their positions in arr
	// Here we copy from back to front
	while (right >= left) {
		arr[right] = result[right - left];
		right--;
	}
};

const mergeSort = (arr) => {
	const len = arr.length;
	if (len < 2) return;

	// The number of elements in each subsequence; when it is 1 each subsequence has a single element, e.g. merging [3] and [5], and so on
	let k = 1;

	while (k < len) {
		// The number of elements after merging two complete subsequences (complete: the subsequence has exactly k elements)
		const count = 2 * k;
		let i;
		// i < len - count ensures that the merges inside this for loop are merges of two complete subsequences
		for (i = 0; i < len - count; i += count) {
			merge(arr, i, i + k - 1, i + count - 1);
		}

		// At the end there may be a single subsequence left that needs no merging, because every subsequence is already ordered;
		// if two subsequences remain (i + k < len), the second one may be incomplete, such as [3, 5, 8, 16] and [2, 7];
		// passing i + count - 1 as right, as in the for loop above, would exceed the length of arr, so this case is handled separately
		if (i + k < len) {
			merge(arr, i, i + k - 1, len - 1);
		}
		k *= 2;
	}
};

// test case:
const arr = Array.from({ length: 90 }).map(() =>
	Math.floor(Math.random() * 1000));
mergeSort(arr);
console.log(arr);

5.4 Algorithm complexity

Time complexity: regardless of the initial state of the array, the same split and merge steps are always performed, so the best-case and worst-case time complexity are both O(n log n);

Space complexity: in the recursive method, the temporary result array used during merging plus the recursion stack take n + log n space; in the iterative method only the temporary result array of the merge step takes space, n. So the space complexity of merge sort is O(n).

5.5 Algorithm Stability

The key is that in the merge step we use arr[i] <= arr[j], so when two elements are equal we put the left one, arr[i], into the result array first. Their relative order does not change, so merge sort is a stable sorting algorithm.

6. Counting sort

6.1 Basic Ideas

Counting sort exploits the fact that array indices are ordered. We use a count array count: each element arr[i] of the array to be sorted is used as an index (bucket) into count, and count[arr[i]] records how many times that value appears in the array to be sorted. Reading the count array in index order then produces the sorted result. Because values are used directly as array indices, this algorithm suits scenarios where the elements to be sorted fall within a small range.

6.2 Implementation Roadmap

  1. Create a count array count;
  2. Traverse the array to be sorted, storing the number of occurrences of each element a in count[a];
  3. Create a result array and traverse the count array: while count[i] is greater than 0, push i into the result array and execute count[i]--; when count[i] has no count or equals 0, execute i++ and repeat the same process;
  4. When the traversal of the count array ends, return the result array.

6.3 Sort Animation

6.4 Code Implementation

const countSort = (arr) => {
	let len = arr.length;
	const count = [];

	// Count occurrences
	for (let i = 0; i < len; i++) {
		count[arr[i]] = (count[arr[i]] || 0) + 1;
	}

	len = count.length;
	const result = [];
	for (let i = 0; i < len; i++) {
		while (count[i]) {
			result.push(i);
			count[i]--;
		}
	}

	return result;
};

// test case:
const arr = Array.from({ length: 300 }).map(() =>
	Math.floor(Math.random() * 100));
console.log(countSort(arr));

6.5 Algorithm Complexity

Time complexity: O(n + k), where n is the array length and k depends on the maximum value of the array elements.

Space complexity: the length of count depends on the maximum value of the array elements, so the space complexity is O(k).

6.6 Stability of sorting Algorithm

Counting sort is stable.
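
The implementation above rebuilds plain numbers from the counts, so there are no distinguishable equal records to observe. The version of counting sort that is usually described as stable distributes whole records using prefix sums and walks the input from back to front; a minimal sketch of that idea (the countSortStable name and keyOf accessor are my own) could look like this:

// Stable counting sort for records whose sort key is a small non-negative integer
const countSortStable = (arr, keyOf) => {
	const count = [];
	for (const item of arr) {
		count[keyOf(item)] = (count[keyOf(item)] || 0) + 1;
	}

	// Prefix sums: count[k] becomes the number of records with key <= k
	for (let k = 1; k < count.length; k++) {
		count[k] = (count[k] || 0) + (count[k - 1] || 0);
	}

	// Walk the input from back to front so that equal keys keep their relative order
	const result = new Array(arr.length);
	for (let i = arr.length - 1; i >= 0; i--) {
		const k = keyOf(arr[i]);
		result[--count[k]] = arr[i];
	}
	return result;
};

// Hypothetical usage:
// countSortStable([{ k: 2, id: 'a' }, { k: 1, id: 'b' }, { k: 2, id: 'c' }], (x) => x.k);
// keeps { k: 2, id: 'a' } before { k: 2, id: 'c' }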

The code that appears in this article can be found here.