Algorithm complexity
When learning algorithms, or when reading about the Diff algorithm in the React documentation, the keyword "complexity" keeps coming up, and algorithms are often evaluated by their complexity. What exactly is the complexity of an algorithm, and how is it calculated? Keep these questions in mind as you read.
Big O notation
The complexity of an algorithm is usually expressed in big O notation, which describes how the time or space consumed by a program grows as the size of its input grows. Big O notation takes the total number of execution steps of a program, expresses it as a formula in terms of the input size n, and then discards the low-order terms, coefficients, and constants that do not meaningfully affect the overall trend. Depending on whether we measure time or storage, complexity is divided into time complexity and space complexity.
Time complexity
Also known as asymptotic time complexity, it describes the growth relationship between the time an algorithm consumes and the size of its input: T(n) = O(f(n)), where n is the input size, f(n) is the total number of execution steps of the algorithm, and T(n) is the total time the algorithm takes:
```javascript
// pseudo code
let n = 10, sum = 0;
for (let i = 0; i < n; i++) {
  sum = sum + i;
}
```
Assume that every line of code takes the same amount of time to execute, one unit of time per line:
- `let n = 10, sum = 0;` takes 1 unit of time
- the loop control (`i < n` and `i++`) runs n times, taking n units of time
- `sum = sum + i` runs n times, taking n units of time
So the whole program consumes 2n + 1 units of time. As n grows toward infinity, the low-order coefficients and constant terms can be ignored, so f(n) = n, which is O(n) in big O notation.
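To make the 2n + 1 step count concrete, here is a small sketch (the function name `countSteps` is ours, added for illustration) that runs the loop above while counting one "unit of time" for the declaration and two per iteration:

```javascript
// Count the "unit time" steps of the loop above for a given n:
// one step for the declaration, then two steps per iteration
// (one for the loop control, one for the addition in the body).
function countSteps(n) {
  let steps = 1; // let n = ..., sum = 0;
  let sum = 0;
  for (let i = 0; i < n; i++) {
    steps++;     // loop control: i < n and i++
    sum = sum + i;
    steps++;     // sum = sum + i
  }
  return steps;
}
```

For n = 10 this returns 21, matching the 2n + 1 formula.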
Common orders of time complexity
- Constant order O(1)
```javascript
let i = 10;
let j = 0;
let k = i + j;
```
Again, assume that every line of code takes one unit of time to execute. If a program has n lines of code and therefore takes n units of time, is its complexity O(n)? Actually, no. The time complexity in big O notation does not represent the actual time the code takes to run; it represents the trend of the running time as the data size grows, which is why it is also called asymptotic time complexity, or time complexity for short. If a program's execution time does not grow with the problem size n, then even if it contains thousands of statements, its running time is bounded by a large constant, and its time complexity is O(1). A simple rule of thumb: as long as an algorithm contains no loop or recursive statements, its time complexity is O(1), no matter how many statements it has.
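As a quick illustration of that rule (the function name `first` is ours, added for this example), reading one element of an array takes the same number of steps no matter how long the array is:

```javascript
// Reading the first element of an array is a fixed number of
// steps regardless of the array's length, so it is O(1).
function first(arr) {
  return arr[0];
}
```

Whether the array has 3 elements or 3 million, `first` performs the same single lookup.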
- Logarithmic order O(log n)
```javascript
let i = 1;
while (i < n) {
  i = i * 2;
}
```
I never used to understand why time complexity could have a logarithmic order, but this example finally made it click. With a loop body like this, we only need to focus on the statement that executes the most times. Here i starts at 1, each iteration doubles it, and the loop ends once i reaches n. The successive values of i are 2⁰, 2¹, 2², 2³, …, 2ˣ, where x is the number of iterations. The loop terminates at the critical condition 2ˣ = n, and solving with the logarithm formula gives x = log₂n. So the time complexity is O(log₂n), usually written O(log n), since the base only contributes a constant factor.
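The x = log₂n claim can be checked by counting the iterations directly (the function name `doublingSteps` is ours, added for this sketch):

```javascript
// Count how many times the doubling loop runs before i reaches n.
// For n a power of two, the count equals log2(n).
function doublingSteps(n) {
  let count = 0;
  let i = 1;
  while (i < n) {
    i = i * 2;
    count++;
  }
  return count;
}
```

For n = 1024 the loop runs 10 times, and log₂1024 = 10, as the formula predicts.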
- Linear order O(n)
```javascript
for (let i = 0; i < n; i++) {
  sum = sum + i;
}
```
This was covered above, so I won't repeat it. For loop statements, when calculating the time complexity you only need to focus on the statement in the loop body that executes the most times, because as n grows toward infinity everything else becomes negligible.
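Wrapped as a function (the name `sumUpTo` is ours, for illustration), the loop visits each of the n values exactly once, which is what makes it linear:

```javascript
// Summing the integers 0 .. n-1 touches each value exactly once,
// so the work grows in direct proportion to n: O(n).
function sumUpTo(n) {
  let sum = 0;
  for (let i = 0; i < n; i++) {
    sum = sum + i;
  }
  return sum;
}
```

Doubling n doubles the number of iterations, the hallmark of O(n).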
- Square order O(n²)
```javascript
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) {
    sum = sum + i + j;
  }
}
```
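Counting how many times the innermost statement runs shows where the n² comes from (the function name `nestedIterations` is ours, added for this sketch):

```javascript
// The inner statement runs n times for each of the n outer
// iterations, so it executes n * n times in total: O(n²).
function nestedIterations(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      count++;
    }
  }
  return count;
}
```

For n = 5 the body runs 25 times; doubling n quadruples the work.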
Space complexity
Also known as asymptotic space complexity, it describes the growth relationship between the storage an algorithm requires and the size of its input. Like time complexity, space complexity represents a growth trend rather than a specific amount of storage: S(n) = O(f(n)), where n is the input size, f(n) is the amount of storage the algorithm uses, and S(n) is the total storage required to run the algorithm. Space complexity is much simpler than time complexity, because the two most common orders are:
- Constant order O(1)
```javascript
let i = 10;
let j = 0;
let k = i + j;
```
The storage space required by this program does not change with the size of any variable n; the algorithm uses a constant amount of space, so its space complexity is O(1).
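A loop can also have O(1) space even when its time is O(n): summing an array (the function name `sumArray` is ours, for illustration) reuses the same two variables no matter how long the input is:

```javascript
// Summing an array uses only two extra variables (sum and i)
// no matter how long the array is, so its space cost is O(1),
// even though its running time is O(n).
function sumArray(arr) {
  let sum = 0;
  for (let i = 0; i < arr.length; i++) {
    sum = sum + arr[i];
  }
  return sum;
}
```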
- Linear order O(n)
```javascript
// pseudo code
let sum = {};
for (let i = 0; i < n; i++) {
  sum[i] = i;
}
```
Each iteration of the loop allocates space in memory to store another value of i. The space the program needs grows linearly as n grows, so the space complexity is O(n).
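Wrapping that loop in a function makes the growth visible: the object it returns holds exactly n entries, one per iteration (the function name `buildTable` is ours, added for this sketch):

```javascript
// Each iteration stores one more entry in the object, so the
// memory used grows in step with n: O(n) space.
function buildTable(n) {
  const table = {};
  for (let i = 0; i < n; i++) {
    table[i] = i;
  }
  return table;
}
```

Calling `buildTable(100)` yields an object with 100 keys; doubling n doubles the storage.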
Space complexity also has other orders, such as logarithmic O(log n), square O(n²), and cubic O(n³), but if you think about it, they rarely come up in everyday programs, so they are not covered here.