This series of articles is written in the form of study notes. The main content is derived from The Beauty of Data Structures and Algorithms by Professor Wang Zheng.
Time complexity
What is a data structure? What is an algorithm?
- Data structure: A data structure is a storage structure for a group of data
- Algorithm: An algorithm is a set of methods for manipulating data
Data structures serve algorithms, and algorithms operate on specific data structures.
What are time complexity and space complexity?
Time complexity and space complexity are indicators to measure the efficiency of algorithm code execution.
Common complexities: O(1), O(logn), O(n), O(nlogn), O(n^2), O(n^k), O(2^n)
What is time complexity
Consider the following code:
```c
int demo(int n) {
    int sum = 0;
    int i = 1;
    for (; i <= n; ++i) {
        sum = sum + i;
    }
    return sum;
}
```
Assume that each line of code takes the same time to execute, one unit_time.
Lines 2 and 3 take 1 unit_time each; lines 4 and 5 each run n times, so together they take 2n · unit_time. The total execution time of this code is therefore (2n+2) · unit_time. As you can see, the execution time T(n) of a piece of code is proportional to how many times each line executes: T(n) = (2n+2) · unit_time
Big O notation absorbs unit_time and the other constant factors. So the formula above simplifies to: T(n) = O(2n+2)
We assume that n tends to infinity, so the constant 2 and the coefficient can be ignored. The formula finally simplifies to: T(n) = O(n)
Let’s look at another example:
```c
int demo(int n) {
    int sum = 0;
    int i = 1;
    int j = 1;
    for (; i <= n; ++i) {
        j = 1;
        for (; j <= n; ++j) {
            sum = sum + i * j;
        }
    }
    return sum;
}
```
For the example above, try to work out the time complexity yourself.
Thinking time…
Thinking time…
Thinking time…
Announcing the answer. Step 1: T(n) = (2n^2 + 2n + 3) · unit_time. Step 2: T(n) = O(2n^2 + 2n + 3). Step 3: T(n) = O(n^2).
Extending this with two more examples: T(n) = O(n) + O(n^2) = O(n^2) (the addition rule), and T(n) = O(n) · O(n^2) = O(n^3) (the multiplication rule).
Several common examples of time complexity
- Constant order O(1)
```c
int i = 8;
int j = 6;
int sum = i + j;
```
Generally, as long as an algorithm contains no loop or recursive statements, its time complexity is O(1), even if the code is thousands of lines long.
- Logarithmic order O(logn), linear-logarithmic order O(nlogn)
```c
int i = 1;
while (i <= n) {
    i = i * 2;
}
```
Trace the value of i: when n = 1, 2^0 = 1; when n = 2, 2^1 = 2; when n = 4, 2^2 = 4. From this we can see that the loop stops when 2^(number of runs) = n; after a logarithmic transformation, T(n) = O(log_2 n). If the code is changed to i = i * 3, then T(n) = O(log_3 n). By the change-of-base principle for logarithms, log_3 n = log_3 2 · log_2 n, where log_3 2 is a constant and can be ignored, so O(log_3 n) = O(log_2 n). Therefore, when expressing logarithmic time complexity, we ignore the "base" of the logarithm and simply write O(logn).
For those of you who don't know the logarithm change-of-base formula: log_a n = log_a b · log_b n for any valid bases a and b.
- Conventional order O(m+n), O(m*n)
```c
int demo(int m, int n) {
    int sum_1 = 0;
    int i = 1;
    for (; i < m; ++i) {
        sum_1 = sum_1 + i;
    }
    int sum_2 = 0;
    int j = 1;
    for (; j < n; ++j) {
        sum_2 = sum_2 + j;
    }
    return sum_1 + sum_2;
}
```
As you can see from the code, m and n represent two data sizes. We cannot know in advance which of m or n is larger, so when writing the complexity we cannot simply apply the addition rule and omit one of them. Therefore, the time complexity of the code above is O(m+n).
That concludes the notes on time complexity. Next we'll look at space complexity. If you found this helpful, you can follow me; I update one article every week.
Other articles in this series: Table of contents for Data Structures and Algorithms