Preface

This post collects Python implementations of the common sorting algorithms, together with GIFs and dance videos that visualize how each algorithm runs.


Bubble sort

How it works

  1. Compare each pair of adjacent elements. If the first is larger than the second, swap them.
  2. Do this for every pair of adjacent elements, from the first pair at the start to the last pair at the end. After one pass, the last element is the largest.
  3. Repeat the above steps for all elements except the last one.
  4. Keep repeating with one fewer element each time, until there are no more pairs to compare.

Complexity

Worst time complexity: O(N^2)
Best time complexity: O(N)
Average time complexity: O(N^2)
Extra space complexity: O(1)
Stability: stable

Implementation

def bubble_sort(alist):
    # Each pass bubbles the largest remaining element up to position passnum
    for passnum in range(len(alist) - 1, 0, -1):
        for i in range(passnum):
            if alist[i] > alist[i + 1]:
                alist[i], alist[i + 1] = alist[i + 1], alist[i]
    return alist
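A common refinement is to stop early when a full pass performs no swaps, since the list must already be sorted; on already-sorted input this brings the best case down to O(N). A minimal sketch (the function name is mine):

```python
def bubble_sort_early_exit(alist):
    for passnum in range(len(alist) - 1, 0, -1):
        swapped = False
        for i in range(passnum):
            if alist[i] > alist[i + 1]:
                alist[i], alist[i + 1] = alist[i + 1], alist[i]
                swapped = True
        # A pass with no swaps means the list is sorted; stop early.
        if not swapped:
            break
    return alist
```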

visualization


The dance of sorting algorithms: bubble sort

Selection sort

How it works

First, find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence. Then repeatedly find the smallest (or largest) element among the remaining unsorted elements and append it to the end of the sorted sequence, until all elements are sorted. The steps are as follows:

  1. Initially, the sorted region is empty and the unsorted region is the whole sequence;
  2. Scan the unsorted region and record the index of its smallest element;
  3. Swap that smallest element with the first element of the unsorted region, which extends the sorted region by one;
  4. Repeat steps 2 and 3 until the unsorted region is empty.

Complexity

Worst time complexity: O(N^2)
Best time complexity: O(N^2)
Average time complexity: O(N^2)
Extra space complexity: O(1)
Stability: unstable

Implementation

def select_sort(alist):
    for i in range(len(alist) - 1):
        min_index = i
        for j in range(i + 1, len(alist)):
            if alist[j] < alist[min_index]:
                min_index = j

        alist[i], alist[min_index] = alist[min_index], alist[i]
    return alist
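Note that the swap can change the relative order of equal keys, which is why selection sort is generally classed as unstable. A small sketch illustrating this (the key-based variant and sample data are mine):

```python
def select_sort_by_key(pairs):
    # Selection sort that compares only the first item of each pair.
    for i in range(len(pairs) - 1):
        min_index = i
        for j in range(i + 1, len(pairs)):
            if pairs[j][0] < pairs[min_index][0]:
                min_index = j
        pairs[i], pairs[min_index] = pairs[min_index], pairs[i]
    return pairs

# ("b", 1) originally precedes ("b", 2); the swap in the first pass
# moves ("b", 1) to the back, reversing the order of the equal keys.
print(select_sort_by_key([("b", 1), ("b", 2), ("a", 3)]))
```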

Visualization


The dance of sorting algorithms: selection sort

Insertion sort

How it works

Insertion sort builds an ordered sequence incrementally: for each unsorted element, scan backward through the already-sorted part to find its position, then insert it there.

Complexity

Worst time complexity: O(N^2)
Best time complexity: O(N)
Average time complexity: O(N^2)
Extra space complexity: O(1)
Stability: stable

Implementation

def insertion_sort(alist):
    # Scan from the second element; alist[:i] is always sorted
    for i in range(1, len(alist)):
        current_value = alist[i]
        position = i
        # Shift larger sorted elements one place to the right
        while position > 0 and alist[position - 1] > current_value:
            alist[position] = alist[position - 1]
            position -= 1
        alist[position] = current_value
    return alist
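Because the left part is always sorted, the insertion point can also be found by binary search. Here is a sketch using the standard-library bisect module (the function name is mine); shifting still costs O(N) per element, so only the comparisons get cheaper:

```python
import bisect

def binary_insertion_sort(alist):
    for i in range(1, len(alist)):
        current_value = alist[i]
        # Binary-search the sorted prefix alist[:i] for the insertion point;
        # bisect_right keeps equal elements in order, so the sort stays stable.
        position = bisect.bisect_right(alist, current_value, 0, i)
        # Shift the tail of the sorted prefix right by one and insert.
        alist[position + 1:i + 1] = alist[position:i]
        alist[position] = current_value
    return alist
```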

Visualization


The dance of sorting algorithms: insertion sort

Shell sort

How it works

Shell sort, also known as diminishing increment sort, is a more efficient, improved variant of insertion sort. It is an unstable sorting algorithm.

Shell sort improves on insertion sort by exploiting two of its properties:

  1. Insertion sort is efficient on data that is almost sorted, approaching linear time;
  2. Plain insertion sort is generally inefficient, because it moves elements only one position at a time.

Complexity

Worst time complexity: O(N^2) (with the gap-halving sequence used below)
Best time complexity: O(N log N)
Average time complexity: depends on the gap sequence
Extra space complexity: O(1)
Stability: unstable

Implementation

def shell_sort(alist):
    sub_list_count = len(alist) // 2
    while sub_list_count > 0:
        for start_position in range(sub_list_count):
           alist = gap_insertion_sort(alist, start_position, sub_list_count)

        print('After increments of size', sub_list_count, 'the list is', alist)
        sub_list_count = sub_list_count // 2
    return alist


def gap_insertion_sort(alist, start, gap):
    for i in range(start + gap, len(alist), gap):
        current_value = alist[i]
        position = i

        while position >= gap and alist[position - gap] > current_value:
            alist[position] = alist[position - gap]
            position = position - gap

        alist[position] = current_value
    return alist
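The gap sequence strongly affects Shell sort's running time; halving as above is the simplest choice. Here is a sketch using Knuth's 3x+1 sequence (1, 4, 13, 40, ...), a common alternative; the function name is mine:

```python
def shell_sort_knuth(alist):
    # Build Knuth's gap sequence 1, 4, 13, 40, ... capped by len(alist) // 3
    gap = 1
    while gap < len(alist) // 3:
        gap = gap * 3 + 1
    while gap >= 1:
        # Gapped insertion sort for the current increment
        for i in range(gap, len(alist)):
            current_value = alist[i]
            position = i
            while position >= gap and alist[position - gap] > current_value:
                alist[position] = alist[position - gap]
                position -= gap
            alist[position] = current_value
        gap //= 3  # 40 -> 13 -> 4 -> 1 -> done
    return alist
```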

Visualization


The dance of sorting algorithms: Shell sort

Quick sort

How it works

Quicksort uses a divide-and-conquer strategy to split a list into two sub-lists. The steps are as follows:

  1. Pick an element from the sequence, called the pivot;
  2. Reorder the sequence so that every element smaller than the pivot comes before it and every element larger comes after it (elements equal to the pivot can go to either side). When this partitioning finishes, the pivot is in its final position. This is called the partition operation;
  3. Recursively apply the same steps to the sub-list of elements smaller than the pivot and the sub-list of elements larger than the pivot.

At the bottom of the recursion, a sub-list has size zero or one and is therefore already sorted. The algorithm is guaranteed to terminate, because each iteration puts at least one element into its final position.

Complexity

Worst time complexity: O(N^2)
Best time complexity: O(N log N)
Average time complexity: O(N log N)
Extra space complexity: O(log N)
Stability: unstable

Implementation

Quicksort first selects a value, called the pivot. Although there are many ways to choose the pivot, here we use the first item in the list.

def quick_sort(alist):
    quick_sort_helper(alist, 0, len(alist) - 1)
    return alist


def quick_sort_helper(alist, first, last):
    # Sorts alist[first:last + 1] in place
    if first < last:
        splitpoint = partition(alist, first, last)
        quick_sort_helper(alist, first, splitpoint - 1)
        quick_sort_helper(alist, splitpoint + 1, last)


def partition(alist, first, last):
    pivotvalue = alist[first]

    leftmark = first + 1
    rightmark = last

    done = False
    while not done:
        while leftmark <= rightmark and alist[leftmark] <= pivotvalue:
            leftmark += 1

        while alist[rightmark] >= pivotvalue and rightmark >= leftmark:
            rightmark -= 1

        if rightmark < leftmark:
            done = True
        else:
            temp = alist[leftmark]
            alist[leftmark] = alist[rightmark]
            alist[rightmark] = temp

    temp = alist[first]
    alist[first] = alist[rightmark]
    alist[rightmark] = temp

    return rightmark
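For contrast with the in-place version, quicksort can also be written non-destructively in a few lines (clear but memory-hungry; the three-way split also handles duplicates of the pivot). The function name is mine:

```python
def quick_sort_simple(alist):
    # Non-in-place quicksort: partition into <, ==, > sub-lists and recurse.
    if len(alist) <= 1:
        return alist
    pivot = alist[0]
    smaller = [x for x in alist[1:] if x < pivot]
    larger = [x for x in alist[1:] if x > pivot]
    equal = [x for x in alist if x == pivot]
    return quick_sort_simple(smaller) + equal + quick_sort_simple(larger)
```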

Visualization

The animation below uses the last item as the pivot:

The video uses the first item as the pivot:

The dance of sorting algorithms: quicksort

Merge sort

How it works

Merging is the operation of combining two already-sorted sequences into a single sorted sequence, and merge sort is built on this operation. The steps are as follows:

  1. Allocate a space whose size is the sum of the two sorted sequences, to hold the merged sequence;
  2. Set two pointers, one at the start of each sorted sequence;
  3. Compare the elements the two pointers point to, copy the smaller one into the merged space, and advance that pointer;
  4. Repeat step 3 until one pointer reaches the end of its sequence;
  5. Copy all remaining elements of the other sequence directly to the end of the merged sequence.
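The merge operation described above can be sketched on its own as a standalone function (the name is mine); the merge_sort implementation below inlines the same logic:

```python
def merge(left, right):
    # Merge two already-sorted lists into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps the merge stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One list is exhausted; append the remainder of the other.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```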

Complexity

Worst time complexity: O(N log N)
Best time complexity: O(N log N)
Average time complexity: O(N log N)
Extra space complexity: O(N)
Stability: stable

Implementation

def merge_sort(alist):
    if len(alist) > 1:
        mid = len(alist) // 2
        left_half = alist[:mid]
        right_half = alist[mid:]

        merge_sort(left_half)
        merge_sort(right_half)

        i = 0
        j = 0
        k = 0
        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                alist[k] = left_half[i]
                i += 1

            else:
                alist[k] = right_half[j]
                j += 1
            k += 1

        while i < len(left_half):
            alist[k] = left_half[i]
            i += 1
            k += 1

        while j < len(right_half):
            alist[k] = right_half[j]
            j += 1
            k += 1
    return alist

Visualization


The dance of sorting algorithms: merge sort

Bonus

What different sorting algorithms sound like

References

  1. Chinese Wikipedia
  2. AlgoRythmics – YouTube
  3. What different sorting algorithms sound like