Why is complexity analysis needed?

Some people think that by actually running an algorithm and collecting monitoring statistics, we can know its running time and memory footprint very accurately. This measurement-based approach is valid, but it has two big limitations.

The results depend on the machine

Running the same algorithm on a high-spec machine and on a low-spec machine will produce different results.

The size of the test data affects the test results

The size of the data is very likely to affect the execution time and results of the algorithm.

A general pattern

The total execution time T(n) of a piece of code is proportional to the number of times f(n) that each line of code is executed.

This can be summed up in a formula: T(n) = O(f(n))

  • T(n) represents the total execution time of the code.
  • n indicates the data size.
  • f(n) represents the total number of times each line of code is executed.
  • The O in the formula means that the execution time T(n) is proportional to the expression f(n).

The origin of time complexity

Time complexity example 1: T(n) = O(2n+2); example 2: T(n) = O(2n²+2n+3). Big O complexity does not represent the actual execution time of the code, but the trend of how execution time grows as the data size increases. So it is also called asymptotic time complexity, or time complexity for short.

When n is large — think 10,000 or 100,000 — the low-order terms, constants, and coefficients in the formula do not determine the growth trend, so they can be ignored. We only need to record the largest order of magnitude. Writing the time complexity of the two examples above in big O notation gives T(n) = O(n) and T(n) = O(n²).

How to analyze time complexity?

1. Focus only on the piece of code that is executed the most times.

Big O complexity represents the trend of execution time as the data size grows. We usually ignore the constants, low-order terms, and coefficients in the formula and just record the magnitude of the largest order.

 int cal(int n) {
   int sum = 0;
   int i = 1;
   for (; i <= n; ++i) {
     sum = sum + i;
   }
   return sum;
 }

The second and third lines of code each execute a constant number of times, independent of the size of n, and have no effect on the complexity. The loop body is executed the most times, n times, so the total time complexity is O(n).

2. The addition rule: the total complexity is equal to the complexity of the largest piece of code.

Important point: constant execution time is independent of complexity. Even if a loop executes a thousand or ten thousand times, as long as the count is a known constant, it is constant execution time. Time complexity represents the trend of an algorithm's execution efficiency as the data size grows, so no matter how large a constant is, it has no effect on the growth trend and can be ignored.
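The addition rule can be sketched as follows (a minimal, illustrative example; the class and method names are my own, not from the original):

```java
public class AdditionRule {
    // The first loop runs a constant 100 times: O(1).
    // The second loop runs n times: O(n).
    // Total complexity = max(O(1), O(n)) = O(n).
    static int cal(int n) {
        int sum1 = 0;
        for (int i = 1; i <= 100; ++i) { // constant part, ignored
            sum1 = sum1 + i;
        }
        int sum2 = 0;
        for (int j = 1; j <= n; ++j) {   // dominant part: O(n)
            sum2 = sum2 + j;
        }
        return sum1 + sum2;
    }

    public static void main(String[] args) {
        System.out.println(cal(10)); // 5050 + 55 = 5105
    }
}
```

No matter how large the constant loop is, only the n-dependent loop shapes the growth trend.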

3. The multiplication rule: the complexity of nested code is equal to the product of the complexities of the inner and outer code.

Actually, this is easy to understand if we think of it as a nested loop: when one loop calls another loop inside it, the complexity of the code is the product of the two complexities. If T1(n) = O(f(n)) and T2(n) = O(g(n)), then T(n) = T1(n) * T2(n) = O(f(n)) * O(g(n)) = O(f(n) * g(n)).
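As a sketch of the multiplication rule (names here are illustrative, not from the original): `cal` calls `f(n)` n times, and `f` is itself O(n), so the total is O(n) * O(n) = O(n²).

```java
public class MultiplicationRule {
    // f(n) is O(n): a single loop over n.
    static int f(int n) {
        int sum = 0;
        for (int j = 1; j <= n; ++j) {
            sum = sum + j;
        }
        return sum;
    }

    // cal(n) calls the O(n) function f inside an O(n) loop,
    // so its total complexity is O(n) * O(n) = O(n^2).
    static int cal(int n) {
        int ret = 0;
        for (int i = 1; i <= n; ++i) {
            ret = ret + f(n);
        }
        return ret;
    }

    public static void main(String[] args) {
        System.out.println(cal(3)); // f(3) = 6, called 3 times -> 18
    }
}
```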

Several common time complexities
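In increasing order of growth, the common time complexities are O(1), O(log n), O(n), O(n log n), O(n²), O(2ⁿ), and O(n!). Of these, O(log n) is perhaps the least obvious, so here is a classic example (an illustrative sketch of my own):

```java
public class LogTime {
    // Classic O(log n): i doubles on each iteration, so the loop
    // runs about log2(n) times before i exceeds n.
    static int countDoublings(int n) {
        int count = 0;
        int i = 1;
        while (i <= n) {
            i = i * 2;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countDoublings(16)); // i = 1,2,4,8,16 -> 5
    }
}
```

Because i is multiplied by 2 each time rather than incremented, the iteration count grows logarithmically, not linearly, with n.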

The origin of spatial complexity

The full name of time complexity is asymptotic time complexity, which represents the growth relationship between an algorithm's execution time and the data size. By analogy, the full name of space complexity is asymptotic space complexity, which represents the growth relationship between an algorithm's storage space and the data size.

void print(int n) {
  int i = 0;
  int[] a = new int[n];
  for (; i < n; ++i) {
    a[i] = i * i;
  }

  for (i = n - 1; i >= 0; --i) {
    System.out.println(a[i]);
  }
}

We define a variable int i = 0, which takes constant space, and an array of length n, which dominates. So the space complexity of the whole code is O(n).

Space complexity analysis is much simpler than time complexity analysis. The common space complexities are O(1), O(n), and O(n²); logarithmic complexities like O(log n) and O(n log n) rarely come up in space analysis.
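For contrast with the O(n) example above, here is an O(1)-space sketch (an illustrative example of my own): it reverses an array in place using only a fixed number of extra variables, regardless of n.

```java
public class ConstantSpace {
    // Reverses the array in place. Only a fixed number of extra
    // variables (i, j, tmp) are used no matter how large the array
    // is, so the extra space is O(1).
    static void reverse(int[] a) {
        int i = 0;
        int j = a.length - 1;
        while (i < j) {
            int tmp = a[i];
            a[i] = a[j];
            a[j] = tmp;
            i++;
            j--;
        }
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4};
        reverse(a);
        System.out.println(java.util.Arrays.toString(a)); // [4, 3, 2, 1]
    }
}
```

Note that space complexity counts only the extra storage the algorithm allocates, not the input itself.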

Conclusion

Complexity analysis covers both time complexity and space complexity; the common time complexities are those shown above. Mastering complexity analysis equips us with an essential skill for studying algorithms.