About

If programming languages are a programmer's techniques, then data structures and algorithms are a programmer's foundation.

Time complexity


In the time frequency just mentioned, n is called the size of the problem; as n changes, the time frequency T(n) changes with it. But sometimes we want to know *how* it changes, and for this we introduce the concept of time complexity. In general, the number of times the basic operations of an algorithm are repeated is a function of the problem size n, written T(n). If there is some auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.

Different time frequencies may have the same time complexity. For example, T(n) = n² + 3n + 4 and T(n) = 4n² + 2n + 1 are different frequencies, but both have time complexity O(n²).

In increasing order, the common time complexities are: constant order O(1), logarithmic order O(log₂n), linear order O(n), linear-logarithmic order O(nlog₂n), square order O(n²), cubic order O(n³), ..., k-th order O(nᵏ), and exponential order O(2ⁿ). As the problem size n grows, the time complexity grows and the execution efficiency of the algorithm falls.
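As a sketch of the definition above (the function names here are illustrative, not from the original), the two frequencies T(n) = n² + 3n + 4 and T(n) = 4n² + 2n + 1 can be checked against f(n) = n²: as n grows, each ratio T(n)/f(n) settles toward a nonzero constant (1 and 4 respectively), which is exactly the condition T(n) = O(f(n)):

```c
/* Two frequency functions with the same asymptotic order O(n^2):
 *   T1(n) = n^2 + 3n + 4
 *   T2(n) = 4n^2 + 2n + 1
 */
double t1(double n) { return n * n + 3 * n + 4; }
double t2(double n) { return 4 * n * n + 2 * n + 1; }

/* Candidate auxiliary function f(n) = n^2. */
double f(double n) { return n * n; }

/* Ratio T(n)/f(n); as n grows it approaches a nonzero constant,
 * which is the condition T(n) = O(f(n)) from the text. */
double ratio(double (*t)(double), double n) { return t(n) / f(n); }
```

For large n (say n = 1 000 000) the ratios are approximately 1 and 4, so both frequencies are of the same order of magnitude as n².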

Finding time complexity

If the execution time of an algorithm does not grow with the problem size n, then even if the algorithm contains thousands of statements, its execution time is bounded by some large constant. The time complexity of such an algorithm is O(1).

```c
x = 91; y = 100;
while (y > 0)
    if (x > 100) { x = x - 10; y--; }
    else x++;
```

Answer: T(n) = O(1). This program looks a little scary: the loop body runs 1100 times. But do we see n anywhere? No. The running time is independent of n; even if it looped for ten thousand years, it would still be a constant-order function.
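A minimal, compilable version of the loop above (wrapped in a hypothetical helper function so the count can be checked) confirms the 1100 iterations:

```c
/* Hypothetical wrapper around the loop above: returns how many
 * times the loop body executes. The count never depends on any n. */
int loop_iterations(void) {
    int x = 91, y = 100, count = 0;
    while (y > 0) {
        count++;
        if (x > 100) { x = x - 10; y--; }
        else x++;
    }
    /* 11 iterations per decrement of y: 100 * 11 = 1100 */
    return count;
}
```

Each cycle increments x ten times (91 → 101), then the eleventh iteration resets x and decrements y, so the body runs 11 × 100 = 1100 times, a constant.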

  • When there are several loop statements, the time complexity of the algorithm is determined by the frequency f(n) of the statement in the innermost level of the most deeply nested loop.
```c
x = 1;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        for (k = 1; k <= j; k++)
            x++;
```

The statement with the highest frequency in this program segment is the innermost x++. Although the execution counts of the inner loops are not directly related to the problem size n, they depend on the loop variables of the enclosing loops, and the count of the outermost loop is directly related to n. The execution count of x++ can therefore be analyzed from the innermost loop outward: f(n) = Σᵢ₌₁ⁿ Σⱼ₌₁ⁱ j = Σᵢ₌₁ⁿ i(i+1)/2 = n(n+1)(n+2)/6, so the time complexity of the segment is T(n) = O(n³/6 + lower-order terms) = O(n³).
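The counting argument can be verified directly; a sketch (the function name is illustrative) that runs the triple loop and returns the number of times x++ executes:

```c
/* Runs the triple loop from the text and returns the execution
 * count of the innermost x++, which equals n(n+1)(n+2)/6. */
long long triple_loop_count(int n) {
    long long x = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            for (int k = 1; k <= j; k++)
                x++;
    return x;
}
```

For n = 10 this yields 10·11·12/6 = 220, matching the closed form, and the cubic leading term n³/6 gives O(n³).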

  • The time complexity of an algorithm depends not only on the problem size but also on the initial state of the input instance. An algorithm that searches the array A[0..n-1] for a given value k is roughly as follows:
```c
i = n - 1;
while (i >= 0 && A[i] != k)
    i--;
return i;
```

The frequency of the statement i-- in this algorithm depends not only on the problem size n but also on the values of the elements of A and of k in the input instance:

  • ① If no element of A equals k, the frequency of i-- is f(n) = n;

  • ② If the last element of A equals k, the frequency of i-- is the constant 0.
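Both cases can be seen in a small compilable sketch of the search (the decrement counter is an illustrative addition, not part of the original algorithm):

```c
/* Searches A[0..n-1] from the back for k; returns the index of k,
 * or -1 if k is absent. *decrements counts executions of i--. */
int find_from_back(const int A[], int n, int k, int *decrements) {
    int i = n - 1, d = 0;
    while (i >= 0 && A[i] != k) {
        i--;
        d++;
    }
    if (decrements) *decrements = d;
    return i;
}
```

With A = {1, 2, 3, 4, 5}, searching for 9 returns -1 after n = 5 decrements (case ①), while searching for 5 returns index 4 after 0 decrements (case ②).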

  • Evaluating performance with time complexity: suppose two algorithms A1 and A2 solve the same problem, with T1(n) = 100n² and T2(n) = 5n³. (1) When the input size n < 20, T1(n) > T2(n), and A2 takes less time. (2) As the problem size n grows, the ratio of the two costs, 5n³/100n² = n/20, also grows. That is, when the problem size is large, algorithm A1 is much more efficient than algorithm A2. Their asymptotic time complexities O(n²) and O(n³) give a macroscopic evaluation of the two algorithms' quality in terms of time. In algorithm analysis, time complexity and asymptotic time complexity are usually not distinguished: the asymptotic time complexity T(n) = O(f(n)) is simply called the time complexity, where f(n) is generally the frequency of the most frequently executed statement in the algorithm.
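The crossover at n = 20 can be checked numerically (the function names below are illustrative, not from the original):

```c
/* Exact cost models from the text: T1(n) = 100n^2 and T2(n) = 5n^3.
 * They intersect at n = 20: below it A2 is cheaper, above it A1 is. */
double cost_a1(double n) { return 100.0 * n * n; }
double cost_a2(double n) { return 5.0 * n * n * n; }
```

At n = 10, A1 costs 10 000 versus 5 000 for A2; at n = 20 both cost 40 000; at n = 100, A1 costs 1 000 000 versus 5 000 000 for A2, illustrating why the lower-order O(n²) wins for large inputs.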

Fixed part

The size of this part of the space is independent of the number and values of the input/output data. It mainly includes the instruction space (code space) and the data space (constants, simple variables). This part of the space is static.

Learn together

  1. Fork my project
  2. Pull Request
  3. Merge