Time complexity of an algorithm

Two ways to measure the execution time of a program (algorithm)

  1. The post-execution (empirical) method: run the program and time it. This is feasible, but it has two problems: first, to evaluate the performance of a designed algorithm you must actually implement and run it; second, the measured time depends on the computer's hardware, software, and other environmental factors, so algorithms can only be compared fairly by running them on the same machine in the same state. A timing sketch follows this list.

  2. The pre-estimation method: judge which algorithm is better by analyzing its time complexity, without running the program.
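For the first method, here is a minimal post-hoc timing sketch in Python; the workload `sum_to_n` is a placeholder of my own choosing:

    import time

    def sum_to_n(n):
        # Placeholder workload chosen for illustration: sum the integers 1..n.
        total = 0
        for i in range(1, n + 1):
            total += i
        return total

    start = time.perf_counter()
    sum_to_n(10_000_000)
    elapsed = time.perf_counter() - start
    # The measured value depends on this machine's hardware and current load.
    print(f"elapsed: {elapsed:.3f} s")

Running this twice on the same machine in the same state gives comparable numbers; running it on different machines does not, which is exactly the second problem noted above.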

Time frequency

Time frequency: the time an algorithm takes is directly proportional to the number of statements it executes; an algorithm that executes more statements takes more time. The number of statements executed in an algorithm is called its statement frequency or time frequency, denoted T(n).

Ignore constant term

  1. 2n+20 and 2n: as n gets larger, their execution curves become infinitely close, so the 20 can be ignored.
  2. 3n+10 and 3n: as n gets larger, their execution curves become infinitely close, so the 10 can be ignored.

Ignore the lower degree term

  1. 2n²+3n+10 and 2n²: as n gets larger, their execution curves become infinitely close, so 3n+10 can be ignored.
  2. n²+5n+20 and n²: as n gets larger, their execution curves become infinitely close, so 5n+20 can be ignored.

Ignore the coefficient

  1. As n increases, the execution curves of 5n²+7n and 3n²+2n coincide in shape, showing that in this case the coefficients 5 and 3 can be ignored.
  2. By contrast, the curves of n³+5n and 6n³+4n separate from the quadratic curves above, showing that it is the exponent, not the coefficient, that determines the growth rate.

Time complexity

  1. In general, the number of times the basic operation statement in an algorithm is repeated is a function of the problem size n, denoted T(n). If there is some auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be a function of the same order of magnitude as T(n), written T(n)=O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.

  2. Different T(n) can have the same time complexity. For example, T(n)=n²+7n+6 and T(n)=3n²+2n+2 are different functions, but both have time complexity O(n²).

  3. Method for calculating time complexity:

  • Replace all additive constants in the running time with the constant 1: T(n)=n²+7n+6 => T(n)=n²+7n+1
  • In the modified running-time function, keep only the highest-order term: T(n)=n²+7n+1 => T(n)=n²
  • Remove the coefficient of the highest-order term: T(n)=n² => T(n)=n² => O(n²)

Common time complexity

  1. Constant order O(1)
  2. Logarithmic order O(log2n)
  3. Linear order O(n)
  4. Linear-logarithmic order O(nlog2n)
  5. Quadratic order O(n²)
  6. Cubic order O(n³)
  7. k-th power order O(n^k)
  8. Exponential order O(2^n)

Description:

  • The common time complexities in increasing order: O(1) < O(log2n) < O(n) < O(nlog2n) < O(n²) < O(n³) < O(n^k) < O(2^n). As the problem size n grows, the time complexity grows and the algorithm's execution efficiency drops, so exponential-order algorithms should be avoided.

Constant order O(1)

No matter how many lines of code execute, as long as there are no loops or other structures whose repetition grows with n, the code's time complexity is O(1).
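A minimal sketch (the function name is illustrative):

    def constant_order(n):
        # The same fixed number of statements runs no matter how large n is: O(1).
        i = 1
        j = 2
        total = i + j
        i += 1
        j += 1
        return total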

Logarithmic order O(log2n)

Explanation: in the standard example, a variable i starts at 1 and doubles on every iteration; the loop stops once i reaches n, after about log2n iterations, so the complexity is O(log2n). A sketch follows.
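A minimal sketch of that doubling loop (names are illustrative):

    def logarithmic_order(n):
        # i doubles on every pass, so the loop exits after about log2(n) passes.
        i = 1
        steps = 0
        while i < n:
            i *= 2
            steps += 1
        return steps  # roughly log2(n)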

Linear order O(n)
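Explanation: a single loop whose body executes n times makes the running time grow linearly with n. A minimal sketch (names illustrative):

    def linear_order(n):
        # The loop body executes exactly n times, so T(n) grows linearly with n.
        total = 0
        for i in range(n):
            total += i
        return total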

Linear-logarithmic order O(nlog2n)

Explanation: nesting the O(log2n) doubling loop inside a loop that itself runs n times gives roughly n·log2n operations, i.e. O(nlog2n). A sketch follows.
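A minimal sketch (names illustrative):

    def linear_log_order(n):
        # The O(log2 n) doubling loop runs once for each of the n outer passes,
        # giving roughly n * log2(n) operations in total.
        steps = 0
        for _ in range(n):
            i = 1
            while i < n:
                i *= 2
                steps += 1
        return steps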

Quadratic order O(n²)

Explanation: two loops over n nested one inside the other execute the inner body n×n times, so the complexity is O(n²). A sketch follows.
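A minimal sketch (names illustrative):

    def quadratic_order(n):
        # Two nested loops over n execute the body n * n times: O(n^2).
        steps = 0
        for i in range(n):
            for j in range(n):
                steps += 1
        return steps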

Cubic order O(n³), k-th power order O(n^k)

Note: by analogy with O(n²) above, O(n³) corresponds to three nested loops over n, O(n^k) to k nested loops, and so on.
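A minimal sketch of the cubic case (names illustrative):

    def cubic_order(n):
        # Three nested loops over n execute the body n * n * n times: O(n^3);
        # each extra level of nesting raises the power by one, up to O(n^k).
        steps = 0
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    steps += 1
        return steps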

Average time complexity and worst time complexity

  1. The average time complexity is the expected running time of the algorithm when all possible input instances occur with equal probability.
  2. The time complexity in the worst case is called the worst-case time complexity. The time complexity generally discussed is the worst-case one, because it is an upper bound on the running time over every input instance: it guarantees that the algorithm never runs longer than the worst case.
  3. Whether the average time complexity agrees with the worst-case time complexity depends on the algorithm.
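As an illustration of the difference, here is a minimal linear-search sketch (my own example, not taken from this text), where the average and worst cases happen to share the same order:

    def linear_search(items, target):
        # Worst case: target is absent (or last), so all n elements are checked -> O(n).
        # Average case, assuming the target is equally likely at every position:
        # about n/2 comparisons, which is still O(n) once the coefficient is dropped.
        for index, value in enumerate(items):
            if value == target:
                return index
        return -1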

Spatial complexity

  1. Similar to the discussion of time complexity, the space complexity of an algorithm is defined as the amount of storage it consumes as a function of the problem size n.
  2. Space complexity is a measure of the amount of storage space an algorithm temporarily occupies while running. For some algorithms, the number of temporary working units required depends on the problem size n and grows as n grows; when n is large, more storage units are occupied. Quicksort and merge sort are examples of this.
  3. When doing algorithm analysis, we mainly discuss time complexity, because from the perspective of user experience the speed of program execution matters more. Some caching products (Redis, memcached) and algorithms (radix sort) essentially trade space for time.
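A minimal sketch contrasting O(1) and O(n) space (function names are my own):

    def sum_in_constant_space(nums):
        # Uses a fixed number of variables regardless of len(nums): O(1) space.
        total = 0
        for x in nums:
            total += x
        return total

    def squares_in_linear_space(nums):
        # Builds a new list whose length grows with the input: O(n) space.
        return [x * x for x in nums]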