This is the 25th day of my participation in the August Challenge.
Time complexity
Time complexity: we discard the algorithm's lower-order terms, remove all constant coefficients, and put an O in front of what remains. This is big-O notation.
int n = 100;
int a = 10;
System.out.println(a);
// Execute three times
There are no lower-order terms; the whole expression is the constant 3. Dropping the constant leaves O(1).
int n = 100;
int a = 10;
for (int i = 0; i < n; i++) { // n times
System.out.println(a); // n times
}
System.out.println(a);
// Execute 2n + 3 times
Remove the lower-order term 3 and the coefficient 2, so its time complexity is O(n).
int n = 100;
int a = 10;
for (int i = 0; i < n; i++) {
for (int j = 0; j < n; j++) {
System.out.println(a);
System.out.println(a);
}
}
System.out.println(a);
// Execute 2n^2 + 3 times
Get rid of the lower-order term 3 and the coefficient 2, so its time complexity is O(n²).
Common time complexity
Even a large coefficient doesn't matter much: once n is large enough, the highest-order term overwhelms the lower-order terms and coefficients, so the highest-order term is the main factor in how fast the function grows.
The growth rate
General rule
The for loop
Assuming the loop body's time complexity is O(n) and the loop runs m times, the loop's overall time complexity is O(m*n).
With an O(1) body run n times, the time complexity is O(n*1), that is, O(n).
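A minimal sketch of this rule (the class name `LoopCount` is my own, hypothetical): it counts how many times an O(1) body executes when the loop runs n times, giving O(n*1) = O(n).

```java
public class LoopCount {
    // Counts how often the O(1) body runs in a single loop of n iterations.
    static long run(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            ops++; // O(1) body, executed n times
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(run(1000)); // the body runs exactly n = 1000 times
    }
}
```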
Nested for loops
For nested loops, assuming the innermost body's time complexity is O(n) and the loop counts are a, b, c, …, the overall time complexity is O(n*a*b*c…).
Analyze the loops from the inside out.
Time complexity is O(1*m*n), that is, O(m*n).
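A small sketch of the nested-loop rule (hypothetical class name `NestedCount`): an O(1) body inside loops of m and n iterations executes exactly m*n times, so the complexity is O(1*m*n) = O(m*n).

```java
public class NestedCount {
    // Counts executions of the inner O(1) body for loop bounds m and n.
    static long run(int m, int n) {
        long ops = 0;
        for (int i = 0; i < m; i++) {       // outer loop: m iterations
            for (int j = 0; j < n; j++) {   // inner loop: n iterations each
                ops++;                      // O(1) body -> m * n total
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(run(20, 30)); // 600 = 20 * 30
    }
}
```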
Sequence of statements
The running times of consecutive statements are summed; equivalently, take the maximum order among them.
The time complexity is O(2 + n + n² + 1), that is, O(n²).
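A sketch of a statement sequence (hypothetical class `SequenceCount`): two O(1) statements, an O(n) loop, and an O(n²) nested loop in sequence; the counts add up, and the n² term dominates.

```java
public class SequenceCount {
    // Counts operations for a sequence of statements: O(1) + O(1) + O(n) + O(n^2).
    static long run(int n) {
        long ops = 0;
        ops++;                                  // an O(1) statement
        ops++;                                  // another O(1) statement
        for (int i = 0; i < n; i++) ops++;      // O(n)
        for (int i = 0; i < n; i++)             // O(n^2), the dominant term
            for (int j = 0; j < n; j++) ops++;
        return ops;                             // 2 + n + n^2 in total
    }

    public static void main(String[] args) {
        System.out.println(run(10)); // 2 + 10 + 100 = 112
    }
}
```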
Branch statements
The total time complexity equals the time complexity of the branch with the highest cost.
If we are lucky, execution takes the first if branch and the time is O(n).
But time complexity is measured in the worst case, so we look at the branch with the highest complexity, and that is the final answer.
So the time complexity of the figure above is O(n²).
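A sketch of the branch rule (hypothetical class `BranchWorstCase`): one branch costs O(n), the other O(n²); since big-O describes the worst case, the statement as a whole is O(n²).

```java
public class BranchWorstCase {
    // The lucky branch does n operations; the unlucky branch does n^2.
    // Worst-case analysis takes the costlier branch: O(n^2).
    static long run(int n, boolean lucky) {
        long ops = 0;
        if (lucky) {
            for (int i = 0; i < n; i++) ops++;         // O(n) path
        } else {
            for (int i = 0; i < n; i++)                // O(n^2) path
                for (int j = 0; j < n; j++) ops++;
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(run(10, true));  // 10 operations on the lucky path
        System.out.println(run(10, false)); // 100 operations on the worst path
    }
}
```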
A function call
for (int i = 0; i < n; i++) { //O(n)
list.insert(0,i); //O(n)
}
The insert call above is itself O(n), not O(1), so the loop's overall time complexity is O(n²).
Function calls depend on the time complexity in the function body.
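A runnable version of the snippet above, assuming the list is Java's `ArrayList` (the original `list.insert(0, i)` is pseudocode; `ArrayList.add(0, v)` is the equivalent call). Each head insert shifts all existing elements right, so the call is O(n) and the whole loop is O(n²).

```java
import java.util.ArrayList;
import java.util.List;

public class CallCost {
    // Builds a list by inserting at the head n times.
    // Each add(0, i) on an ArrayList is O(size), not O(1),
    // so the loop as a whole is O(n^2).
    static List<Integer> build(int n) {
        List<Integer> list = new ArrayList<>();
        for (int i = 0; i < n; i++) {  // n iterations
            list.add(0, i);            // O(n) per call: existing elements shift right
        }
        return list;
    }

    public static void main(String[] args) {
        System.out.println(build(5)); // [4, 3, 2, 1, 0]
    }
}
```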
Note:
- The speed of an algorithm cannot simply be measured by execution time
Big O notation
Definition in "Introduction to Algorithms":
For a given function g(n), O(g(n)) is used to represent the set of the following functions:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0 }
We use O-notation to give an upper bound on a function, to within a constant factor.
The big O notation tends to represent the worst complexity.
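As a worked check of the definition (my own example, not from the book), take the earlier count of 2n + 3 operations with g(n) = n, c = 3, and n0 = 3:

```latex
f(n) = 2n + 3 \le 2n + n = 3n = c \cdot g(n) \quad \text{for all } n \ge n_0 = 3
```

Since such constants exist, 2n + 3 is in O(n), matching the earlier conclusion.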
Common sorting algorithms and their corresponding time complexity and space complexity
Time complexity of common data structures
Lookup
- An array can be accessed directly by index, so lookup is O(1).
- A linked list must be traversed node by node, so lookup is O(n).
Header insert/delete
- An array has to shift every existing element back by one, so it's O(n).
- A linked list just creates the node and repoints the head, so it's O(1).
Tail insert/delete
- An array can write directly at the end (given spare capacity), so it's O(1).
- A singly linked list must be traversed to reach the last node before inserting, so it's O(n).
Intermediate insert/delete
- For an array, the time goes into moving and copying the elements after the position, so it's O(n).
- For a linked list, the time goes into traversing to the position, so it's O(n).
Array and linked list selection
- If inserts and deletes are rare, queries are frequent, and memory is not tight, use an array.
- If inserts and deletes are frequent and indexed queries are rare, use a linked list.
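The trade-off above can be sketched with Java's standard collections (a hedged example; `ArrayList` is array-backed, `LinkedList` is a doubly linked list): the array shines at indexed access, the linked list at head inserts.

```java
import java.util.ArrayList;
import java.util.LinkedList;

public class ArrayVsList {
    public static void main(String[] args) {
        ArrayList<Integer> array = new ArrayList<>();   // array-backed
        LinkedList<Integer> linked = new LinkedList<>(); // doubly linked list

        for (int i = 0; i < 3; i++) {
            array.add(i);        // tail insert into the array: O(1) amortized
            linked.addFirst(i);  // head insert into the linked list: O(1)
        }

        // Indexed access: O(1) for the array; linked.get(i) would walk nodes, O(n).
        System.out.println(array.get(1)); // 1
        System.out.println(linked);       // [2, 1, 0]
    }
}
```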