The concept of complexity

An algorithm is a piece of executable code, which can be as small as a few lines or a single method. The time complexity of an algorithm refers to the time resources the code consumes as it runs; the space complexity of an algorithm refers to the space resources (usually memory) that the code consumes.

Big O complexity notation

Usually, when we talk about an algorithm, we say that it has a time complexity of O(n), O(n^2), and so on. The O() here is the big O notation. Here's a simple example:

int cal(int n) {
  int sum = 0;                    // (1)
  for (int i = 1; i <= n; ++i) {  // (2)
    sum = sum + i;                // (3)
  }                               // (4)
  return sum;
}

The code above calculates the sum 1 + 2 + 3 + … + n. We assume that each line of code takes the same CPU time, run_time. What is the total execution time T(n) of this code? Line (1) runs once, taking run_time; lines (2) and (3) each run n times, taking 2 * n * run_time. The total time is:

T(n) = (2 * n + 1) * run_time

Generalizing this pattern, it can be summarized into a formula:

T(n) = O(f(n))

The O() above is this big O notation. In the example, T(n) = (2 * n + 1) * run_time, so f(n) = 2 * n + 1, and run_time is a constant. When n is large enough, the lower-order terms, constants, and coefficients in the formula can be ignored, so the time complexity of the summation code above is O(n).

Common time complexities, roughly in order of increasing growth, are as follows: O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), and O(n!).
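As a rough illustration (these snippets are sketches added here, not code from the original article), here is what a few of these growth rates look like:

// O(1): constant time, independent of n (assumes arr is non-empty)
int first(int[] arr) {
  return arr[0];
}

// O(log n): the loop variable doubles on every iteration
int countDoublings(int n) {
  int count = 0;
  for (int i = 1; i < n; i *= 2) {
    ++count;
  }
  return count;
}

// O(n^2): two nested loops, each running n times
int pairSum(int n) {
  int sum = 0;
  for (int i = 0; i < n; ++i) {
    for (int j = 0; j < n; ++j) {
      sum = sum + i + j;
    }
  }
  return sum;
}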

Here is an example of space complexity:

void print(int n) {
  int i = 0;                // (1)
  int[] arr = new int[n];   // (2)
  int j = 0;                // (3)
}

Lines (1) and (3) each request space for a single variable; that space is independent of the data size n and can be ignored. Line (2) requests space for an array of n integers, so the space complexity of the entire code is O(n).

Common space complexities are O(1), O(n), and O(n^2).
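Again as an assumed illustration rather than code from the original article:

// O(1) space: only a fixed number of variables, no matter how large n is
int sumTo(int n) {
  int sum = 0;
  for (int i = 1; i <= n; ++i) {
    sum = sum + i;
  }
  return sum;
}

// O(n^2) space: an n-by-n matrix is allocated
int[][] makeMatrix(int n) {
  return new int[n][n];
}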

What is the use of knowing the time and space complexity of an algorithm

First of all, comparing the quality of several algorithms requires taking the application scenario into account: the data volume, the available memory, the characteristics of the data, and so on. The same piece of code also runs at different speeds on different machines. So to compare the performance of several algorithms by measurement, you would have to run them on the same machine, against the same batch of data, in the same environment, and compare their running times.

However, when developers choose an algorithm, it is usually impossible to compare candidates this precisely, and the time cost of doing so is unacceptable. We need a way to roughly estimate the execution efficiency of an algorithm without running it against specific data, and that is exactly what the time complexity and space complexity of an algorithm give us.

Best- and worst-case time complexity

Under what circumstances does a piece of code have a best case and a worst case? For example, suppose you look for a number in an integer array of length n by scanning it in a loop and returning the index of the array as soon as the number is found. In the best case, the number is the first element of the array, and the best-case time complexity is O(1); if the number to find is the last element of the array, that is the worst case, and the worst-case time complexity is O(n).
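As a small sketch of this lookup (the function name indexOf and its details are assumed here, not taken from the original text):

int indexOf(int[] arr, int target) {
  for (int i = 0; i < arr.length; ++i) {
    if (arr[i] == target) {
      return i;   // best case: target is arr[0], only 1 comparison, O(1)
    }
  }
  return -1;      // worst case: target is last or absent, n comparisons, O(n)
}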

Average case time complexity

As in the example above, when looking for a number in an integer array of length n, there are best and worst cases, but the best- and worst-case time complexities alone do not measure how efficient a piece of code is; for that you need the average-case time complexity. In its simplest form: average time = (sum of the running times of all cases) / (number of cases).

So how do we figure out the average-case time complexity? In the search example, the number we look for is either in the array or not, and we assume each of these two cases has probability 1/2. If it is in the array, it is equally likely to appear at any of the positions 0 to n-1, so the probability of finding it at any particular position is 1/(2n). The expected number of elements examined is therefore:

1 * 1/(2n) + 2 * 1/(2n) + … + n * 1/(2n) + n * 1/2 = (3n + 1) / 4

Dropping the coefficient and the constant, the average-case time complexity is O(n).

Amortized time complexity

Amortized time complexity is a special kind of average time complexity that applies only in particular scenarios. For example, if you know Java, you are familiar with the ArrayList container. An ArrayList is backed by an array with a default capacity, and in add, if the array is not long enough, it needs to be expanded. Here is pseudocode for an add method (for simplicity, it adds ints):

int arr[] = new int[10]; // The initial length is 10
int len = 10;
int index = 0;

void add(int element){
  if(index >= len){  // If the length is exceeded, the array needs to be expanded and copied to a new array
    final int newLen = len * 2;
    int new_arr[] = new int[newLen]; // Apply for a double array
    for (int i = 0; i < len; ++i){
      new_arr[i] = arr[i];
    }
    arr = new_arr;
    len = newLen;
  }
  arr[index] = element;
  ++index;
}

In the case of no capacity expansion, the time complexity of the add method is O(1). In the case of capacity expansion, the copy into the new array inside the if has to traverse the existing array once; if the current array length is n, the expansion costs O(n). So what is the average time complexity? You could work it out with probabilities, as in the previous example, but that is a hassle, and there is a simpler way to look at it. Say the initial capacity is 10 and we add 11 numbers: the first 10 adds each cost O(1), and only the 11th add triggers an expansion, which loops 10 times to copy the first 10 numbers into the new array before the 11th is written. If those 10 copy operations are spread over the 10 cheap add operations that preceded them, each add is charged only one extra constant-level operation, so the time complexity of add is still O(1) on average. Analyzing time complexity by spreading the cost of the occasional expensive operation over the many cheap ones in this way is called amortized time complexity.
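To make the amortization argument concrete, here is a rough counting sketch (not from the original article; the class name AmortizedDemo and the operations counter are illustrative assumptions). It performs m add operations on a doubling array and counts element writes plus copies, showing that the total work grows linearly in m, i.e. the cost per add is constant:

public class AmortizedDemo {
  static int[] arr = new int[10];   // initial capacity 10, as in the pseudocode above
  static int len = 10;
  static int index = 0;
  static long operations = 0;       // counts element writes and element copies

  static void add(int element) {
    if (index >= len) {             // not enough room: double the capacity
      int newLen = len * 2;
      int[] newArr = new int[newLen];
      for (int i = 0; i < len; ++i) {
        newArr[i] = arr[i];
        ++operations;               // one copy per existing element
      }
      arr = newArr;
      len = newLen;
    }
    arr[index] = element;
    ++operations;                   // one write per add
    ++index;
  }

  public static void main(String[] args) {
    int m = 1_000_000;
    for (int i = 0; i < m; ++i) {
      add(i);
    }
    // The copies across all expansions form a geometric series (10 + 20 + 40 + ...)
    // that sums to less than 2 * m, so the total stays under about 3 * m:
    // amortized O(1) per add.
    System.out.println("adds = " + m + ", operations = " + operations);
  }
}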