I. Definition of time complexity
In algorithm analysis, the total number of statement executions T(n) is treated as a function of the problem size n; we then analyze how T(n) changes as n grows and determine the order of magnitude of T(n).
The time complexity of an algorithm, that is, its time measure, is denoted T(n) = O(f(n)). It means that as the problem size n increases, the growth rate of the algorithm's execution time is the same as that of f(n), where f(n) is some function of n. This is called the asymptotic time complexity of the algorithm, or time complexity for short. The capital O() used to express it is known as big O notation.
In general, as n increases, the algorithm whose T(n) grows the slowest is the optimal algorithm.
From this definition we can informally name the common time complexities: O(1) constant order, O(n) linear order, O(n²) square order, O(log n) logarithmic order, and of course others, which we'll get to later.
II. Derivation of big O order
We can deduce the time complexity of an algorithm with the following method, called the derivation of big O order:
1) Replace all additive constants in the running-time function with the constant 1.
2) In the modified function, keep only the highest order term.
3) If the highest order term exists and its coefficient is not 1, remove that coefficient.
The result is the big O order.
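For example (a worked illustration, not part of the original steps), suppose an algorithm's total statement count is T(n) = 3n² + 2n + 5. Applying the three rules in turn:

```
3n² + 2n + 5  →  3n² + 2n + 1   (rule 1: the additive constant 5 becomes 1)
              →  3n²            (rule 2: keep only the highest order term)
              →  n²             (rule 3: drop the constant coefficient 3)
```

So the time complexity is O(n²).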
2.1 Constant order
```c
int sum = 0, n = 100;   /* executes once */
sum = (1 + n) * n / 2;  /* executes once */
printf("%d", sum);      /* executes once */
```
Suppose we have the algorithm above. Readers who have not studied algorithm analysis might think its time complexity is O(3), but it is actually O(1). Following our big O derivation: in the first step, the constant 3 is replaced with 1; in the second step, the function has no higher order term at all to keep; so the time complexity of the algorithm is O(1).
Now suppose the assignment statement is repeated ten times:

```c
int sum = 0, n = 100;   /* executes once */
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;
sum = (1 + n) * n / 2;  /* the assignment executes ten times */
printf("%d", sum);      /* executes once */
```
In the algorithm above, 12 statements execute, which is really no different from the original 3: the count is independent of the problem size n, so the time complexity is still O(1).
Both of the examples above have O(1) time complexity, known as constant order.
2.2 Linear order
```c
int i;
for (i = 0; i < n; i++) {
    /* a sequence of steps with O(1) time complexity */
}
```
As shown in the code above, the code inside the loop has complexity O(1). The key is n, which determines the number of iterations, so the time complexity of the whole algorithm is O(n).
2.3 Logarithmic order
```c
int count = 1;
while (count < n) {
    count = count * 2;
    /* a sequence of O(1) steps */
}
```
As shown in the code above, count is multiplied by two on each pass. Suppose for the sake of illustration that count started at 10 and n = 100; then the loop would compute:
```
10 × 2 = 20, 20 × 2 = 40, 40 × 2 = 80
```
So the loop ends at 10 × 2³ = 80 (the next doubling would give 160, which is not less than n). In general, the loop stops once count × 2ˣ reaches n; replacing count with the code's actual initial value of 1 gives 2ˣ = n, that is, x = log₂ n. So the running time of this loop is O(log n).
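As a quick sanity check, here is a minimal sketch (not from the original article; the value n = 100 is assumed for illustration) that counts the loop's iterations and compares the count with log₂ n:

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    int n = 100;
    int count = 1;
    int steps = 0;

    /* Same loop as above: count doubles until it reaches n. */
    while (count < n) {
        count = count * 2;
        steps++;
    }

    /* For n = 100 this prints steps = 7, and ceil(log2(100)) is also 7. */
    printf("steps = %d, ceil(log2(n)) = %d\n", steps, (int)ceil(log2(n)));
    return 0;
}
```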
2.4 Square order
```c
int i, j;
for (i = 0; i < n; i++) {
    for (j = 0; j < n; j++) {
        /* a sequence of O(1) steps */
    }
}
```
In the nested loop above, the inner loop is linear order, so its time complexity is O(n). The outer loop is also O(n), which amounts to running the inner O(n) loop n times, so the time complexity of this code is O(n²).
If there is the following algorithm:
```c
int i, j;
for (i = 0; i < m; i++) {
    for (j = 0; j < n; j++) {
        /* a sequence of O(1) steps */
    }
}
```
The time complexity of the code above is O(m × n).
Here’s how to derive time complexity for a complex case:
```c
int i, j;
for (i = 0; i < n; i++) {
    /* note that j starts at i instead of 0 */
    for (j = i; j < n; j++) {
        /* a sequence of O(1) steps */
    }
}
```
When i = 0, the inner loop executes n times; when i = 1, it executes n − 1 times, and so on:
```
n + (n-1) + (n-2) + ... + 1
```
Transform the sum by pairing: the first term plus the last term is n + 1, and writing the series forward and backward gives n such pairs, so twice the sum is n(n + 1). Halving yields the following formula:
```
n × (n + 1) / 2 = n²/2 + n/2
```
Applying the big O derivation rules:
1) There is no additive constant, so there is nothing to replace.
2) Keep only the highest order term, which is n²/2.
3) Remove the constant coefficient 1/2, and the final result is O(n²).
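To see the series in action, here is a small verification sketch (an illustration, not part of the original article; n = 10 is an assumed value) that counts how many times the O(1) body of the nested loop above actually runs and compares it with n(n + 1)/2:

```c
#include <stdio.h>

int main(void) {
    int n = 10;
    int i, j;
    long runs = 0;

    /* The nested loop from above: the inner loop starts at j = i. */
    for (i = 0; i < n; i++) {
        for (j = i; j < n; j++) {
            runs++;  /* count each execution of the O(1) body */
        }
    }

    /* Both print 55 for n = 10: the body runs n + (n-1) + ... + 1 times. */
    printf("measured: %ld, formula n(n+1)/2: %d\n", runs, n * (n + 1) / 2);
    return 0;
}
```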
As the example above shows, the derivation process itself is not difficult; the hard part is transforming the series, such as summing an arithmetic progression.
2.5 Other time complexities
In addition to those described earlier, there are other time complexities.
Cubic order: O(n³)
Exponential order: O(2ⁿ)
n log n order: O(n log n)
Factorial order: O(n!)
n-to-the-n order: O(nⁿ)
Ranked by time consumed, these complexities compare as follows: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!) < O(nⁿ)
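To get a feel for this ordering, the following sketch (the sample values of n are assumptions for illustration) prints a few of these functions side by side; even at n = 20 the gaps are dramatic:

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    int ns[] = {1, 5, 10, 20};  /* assumed sample sizes */
    int k, n;

    printf("%4s %10s %12s %8s %10s\n", "n", "log2(n)", "n*log2(n)", "n^2", "2^n");
    for (k = 0; k < 4; k++) {
        n = ns[k];
        printf("%4d %10.2f %12.2f %8.0f %10.0f\n",
               n, log2(n), n * log2(n), pow(n, 2), pow(2, n));
    }
    return 0;
}
```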
III. Algorithm space complexity
A common trade-off in algorithm design is to exchange space for time.
The space complexity of an algorithm is written S(n) = O(f(n)), where n is the problem size and f(n) is a function of n describing the storage space the algorithm's statements occupy.
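For example, here is a minimal sketch (not from the original article) showing two ways to reverse an array: one uses O(n) auxiliary space, the other O(1):

```c
#include <stdio.h>

/* O(n) extra space: copies into an auxiliary array of size n. */
void reverse_copy(const int *src, int *dst, int n) {
    int i;
    for (i = 0; i < n; i++) {
        dst[i] = src[n - 1 - i];
    }
}

/* O(1) extra space: swaps pairs in place, using only one temporary. */
void reverse_in_place(int *a, int n) {
    int i, j, tmp;
    for (i = 0, j = n - 1; i < j; i++, j--) {
        tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }
}

int main(void) {
    int a[] = {1, 2, 3, 4, 5};
    reverse_in_place(a, 5);
    printf("%d %d %d %d %d\n", a[0], a[1], a[2], a[3], a[4]);  /* 5 4 3 2 1 */
    return 0;
}
```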
We use "time complexity" to refer to running-time requirements and "space complexity" to refer to space requirements. When "complexity" is used without a qualifier, it usually means time complexity.
This article focuses on time complexity, not space complexity.