This article is participating in the "Java Theme Month – Java practice-question check-in"; see the activity link for details.

Preface

An algorithm is a set of steps we use to manipulate data and solve programming problems. Different algorithms applied to the same problem can produce the same result while differing greatly in the time and resources they consume. To compare the efficiency of different algorithms, we usually analyze them along two dimensions, time and space, and then choose the solution that suits us best.

This article starts from the definitions of time complexity and space complexity, then introduces the common time and space complexities, and finally summarizes the complexities of the common sorting algorithms.

Time complexity

Definition

If there exists a function f(n) such that the limit of T(n)/f(n) is a constant not equal to 0 as n approaches infinity, then f(n) is of the same order of magnitude as T(n), written T(n) = O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short; because it is written with a capital O, this is known as big O notation.
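As a worked example of this definition (the concrete T(n) here is my own, chosen for illustration):

```latex
% For T(n) = 3n^2 + 2n, take f(n) = n^2:
\lim_{n \to \infty} \frac{T(n)}{f(n)}
  = \lim_{n \to \infty} \frac{3n^2 + 2n}{n^2}
  = 3 \neq 0
% The limit is a nonzero constant, so T(n) and n^2 are of the
% same order of magnitude, and we write T(n) = O(n^2).
```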

Rules for deriving time complexity

  1. If the running time is a constant, it is expressed by the constant 1, i.e. O(1).
  2. Keep only the highest-order term of the time function: for example, O(n² + 4n) becomes O(n²) after only the highest-order term is retained.
  3. If the highest-order term exists, omit its coefficient: for example, O(4n²) becomes O(n²) after the coefficient is dropped.

Methods for analyzing time complexity

To sum up, there are three practical methods for analyzing the time complexity of a piece of code:

  1. Focus only on the line of code that executes the most times;
  2. Addition rule: the total complexity equals the complexity of the code segment with the highest order of magnitude;
  3. Multiplication rule: the complexity of nested code equals the product of the complexities of the outer and inner code.
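The three rules can be checked against a small sketch (the class and method names here are my own, chosen for illustration):

```java
public class ComplexityRules {
    // Addition rule: the O(n) summing loop is dominated by the O(n^2)
    // nested loop below it, so the total complexity is O(n^2).
    static int pairSum(int[] arr) {
        int total = 0;
        for (int v : arr) {          // O(n): dominated segment
            total += v;
        }
        int pairs = 0;
        for (int i = 0; i < arr.length; i++) {         // multiplication rule:
            for (int j = i + 1; j < arr.length; j++) { // outer O(n) * inner O(n)
                pairs++;             // the line executed the most times
            }
        }
        return total + pairs;
    }

    public static void main(String[] args) {
        // sum 1+2+3 = 6, pairs of 3 elements = 3, so prints 9
        System.out.println(pairSum(new int[]{1, 2, 3}));
    }
}
```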

Common time complexity curves

(Figure omitted: growth curves of the common time complexities.)

Common time complexity


O(1)

In this case, the complexity of the code is O(1). In the following code, assuming the execution time of each line is the same, denoted t, the two print statements take t + t = 2t in total. The execution time is a constant that does not grow with any input size, so the time complexity is O(1).

void sayHello(String name){
    System.out.println("Hello, " + name);
    System.out.println("Welcome to follow my official account: [Village Yuyao]");
}


O(log n)

As in the binary search code below, each pass of the while loop halves the search range. Suppose the loop runs x times before exiting; then n / 2^x ≈ 1, i.e. x = log₂n, so in big O notation the time complexity is O(log n).

int binarySearch(int[] arr, int target){
    int left = 0;
    int right = arr.length - 1;
    while(left <= right){
        int middle = left + (right - left) / 2;
        if(arr[middle] == target){
            return middle;
        }else if(arr[middle] > target){
            right = middle - 1;
        }else{
            left = middle + 1;
        }
    }
    return -1;
}


O(n)

As in the following code, the body of the for loop executes arr.length times, so the time required is proportional to the length of the array and can be expressed as O(n). Applying the derivation rules above: the loop condition and the loop body execute the most times, the total execution time is on the order of 2n, and after dropping the coefficient the final time complexity is O(n).

int sum(int[] arr){
    int total = 0;

    for(int i = 0; i < arr.length; i++){
        total += arr[i];
    }

    return total;
}


O(n log n)

If code of complexity O(log n) is executed n times, the total becomes n · O(log n), i.e. O(n log n), as in the nested loop below.

void hello(int n){
    for(int i = 1; i < n; i++){
        int m = 1;
        while(m < n){
            m *= 2;
        }
    }
}


O(n²)

Assuming code with time complexity O(n) is executed n times, the time complexity becomes n · O(n), which can be expressed as O(n²); in code, this takes the form of a double loop, as in the selection sort below.

void selectionSort(int[] arr, int n){
    for(int i = 0; i < n; i++){
        int min = i;
        for(int j = i + 1; j < n; j++){
            if(arr[j] < arr[min]){
                min = j;
            }
        }
        // swap the smallest remaining element into position i
        int tmp = arr[i]; arr[i] = arr[min]; arr[min] = tmp;
    }
}


O(n³)

Similarly to O(n²): nesting one more loop around O(n²) code gives a time complexity of O(n³), expressed as a triple nested loop.

void demo(int n){
    for(int i = 0; i < n; i++){
        for(int j = 0; j < n; j++){
            for(int k = 0; k < n; k++){
                System.out.print("Hello, World");
            }
            System.out.print("------");
        }
        System.out.print("******");
    }
}


O(n!)

Although O(n!) time complexity exists in theory, you will rarely encounter it in practice, so it is not expanded on here.
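For completeness, a sketch of where O(n!) does come from (the class and method names are my own): generating every permutation of n elements produces a recursion tree with n! leaves, so the running time grows factorially.

```java
import java.util.ArrayList;
import java.util.List;

public class Permutations {
    // Counts the permutations of the remaining elements by generating
    // each one; the recursion visits all n! orderings, hence O(n!) time.
    static int countPermutations(List<Integer> remaining, List<Integer> current) {
        if (remaining.isEmpty()) {
            return 1; // one complete permutation built in `current`
        }
        int count = 0;
        for (int i = 0; i < remaining.size(); i++) {
            Integer picked = remaining.remove(i); // choose element i
            current.add(picked);
            count += countPermutations(remaining, current);
            current.remove(current.size() - 1);   // undo the choice
            remaining.add(i, picked);
        }
        return count;
    }

    public static void main(String[] args) {
        List<Integer> elems = new ArrayList<>(List.of(0, 1, 2, 3));
        // 4 elements have 4! = 24 permutations, so this prints 24
        System.out.println(countPermutations(elems, new ArrayList<>()));
    }
}
```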

Space complexity

Definition

Space complexity is a measure of the temporary storage an algorithm occupies while running, i.e. the extra storage used during execution beyond the memory that holds the input itself. It reflects the growth trend of memory usage rather than a specific amount of memory, and is also called asymptotic space complexity; it describes the relationship between the storage an algorithm needs and the growth of the data size, denoted S(n).

Common space complexity


O(1)

The temporary space required by the algorithm does not change with the size of any variable n, so the space complexity is a constant, expressed as S(n) = O(1).

int num1 = 1;
int num2 = 2;
int total = num1 + num2;


O(n)

The memory occupied by the array is proportional to n, and no further space is allocated afterwards, so the space complexity of the algorithm is S(n) = O(n).

int[] arr = new int[n];


O(n²)

Similarly, an n × n two-dimensional array occupies space proportional to n², so S(n) = O(n²):

int[][] arr = new int[n][n];

Time complexity and space complexity of common sorting algorithms

For the sorting algorithms commonly seen in interviews, the following table summarizes their time complexity, space complexity, and stability.

| Sorting algorithm | Average time | Best time | Worst time | Space complexity | Stability |
| --- | --- | --- | --- | --- | --- |
| Insertion sort | O(n²) | O(n) | O(n²) | O(1) | stable |
| Shell sort | O(n^1.3) | O(n) | O(n²) | O(1) | unstable |
| Selection sort | O(n²) | O(n²) | O(n²) | O(1) | unstable |
| Heap sort | O(n log₂n) | O(n log₂n) | O(n log₂n) | O(1) | unstable |
| Bubble sort | O(n²) | O(n) | O(n²) | O(1) | stable |
| Quick sort | O(n log₂n) | O(n log₂n) | O(n²) | O(log₂n) | unstable |
| Merge sort | O(n log₂n) | O(n log₂n) | O(n log₂n) | O(n) | stable |
| Counting sort | O(n+k) | O(n+k) | O(n+k) | O(n+k) | stable |
| Bucket sort | O(n+k) | O(n) | O(n²) | O(n+k) | stable |
| Radix sort | O(n·k) | O(n·k) | O(n·k) | O(n+k) | stable |
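As a worked example of one row of the table, here is a sketch of merge sort (the class and method names are my own): it runs in O(n log₂n) time in every case, the temporary array in merge is the O(n) auxiliary space, and the `<=` comparison is what makes it stable.

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Merge sort: O(n log2 n) time in all cases, O(n) extra space, stable.
    static void mergeSort(int[] arr, int left, int right) {
        if (left >= right) return;
        int mid = left + (right - left) / 2;
        mergeSort(arr, left, mid);       // sort left half
        mergeSort(arr, mid + 1, right);  // sort right half
        merge(arr, left, mid, right);    // merge the two sorted halves
    }

    static void merge(int[] arr, int left, int mid, int right) {
        int[] tmp = new int[right - left + 1]; // the O(n) auxiliary space
        int i = left, j = mid + 1, k = 0;
        while (i <= mid && j <= right) {
            // "<=" takes the left element on ties, preserving the original
            // order of equal elements (this is what stability means)
            tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
        }
        while (i <= mid) tmp[k++] = arr[i++];
        while (j <= right) tmp[k++] = arr[j++];
        System.arraycopy(tmp, 0, arr, left, tmp.length);
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 5, 6};
        mergeSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 5, 5, 6, 9]
    }
}
```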

Conclusion

Well, that's all for today's article. It introduced the definition of time complexity, the rules for deriving it, and the common time complexities; it also covered the definition of space complexity and the common space complexities; and it closed with a summary of the time and space complexities of the common sorting algorithms. If you found the article helpful, give it a like before you go!