The previous article covered Big O notation and the common orders of time complexity. This article covers several more topics: the time complexity of recursive algorithms, best-case time complexity, worst-case time complexity, average-case time complexity, and amortized time complexity.
The time complexity of recursive algorithms
If a recursive function makes at most one recursive call, with recursion depth depth,
and the work done in each invocation is T,
then the total time complexity is O(T * depth).
In the previous article, merge sort and quicksort were both recursive with time complexity O(nlogn), but a recursive function is not necessarily O(nlogn). We analyze the following two situations.
(1) Complexity analysis of a recursion with a single recursive call
Binary search method
```cpp
int binarySearch(int arr[], int l, int r, int target){
    if (l > r) return -1;
    int mid = l + (r - l) / 2;
    if (arr[mid] == target)
        return mid;
    else if (arr[mid] > target)
        return binarySearch(arr, l, mid - 1, target);   // search the left half
    else
        return binarySearch(arr, mid + 1, r, target);   // search the right half
}
```
If arr[mid] is not the target, then arr[mid] is either greater than or smaller than target, and binarySearch is called again on the corresponding half.
In this recursive function, each time the target is not found, either the binarySearch call on the left half or the one on the right half is made; that is, at most one recursive call is made per invocation. Since each call halves the search range, the recursion bottoms out after at most log₂ n steps. Therefore, the time complexity of binary search is O(logn).
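As a quick sanity check, here is a minimal usage sketch (the array contents are made up for illustration; it assumes the binarySearch definition above):

```cpp
#include <cstdio>

int main(){
    int arr[] = {1, 3, 5, 7, 9, 11, 13, 15};      // sorted input, n = 8
    printf("%d\n", binarySearch(arr, 0, 7, 7));   // prints 3: found immediately at mid
    printf("%d\n", binarySearch(arr, 0, 7, 8));   // prints -1 after ~log2(8) = 3 halvings
    return 0;
}
```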
sum
```cpp
int sum(int n) {
    if (n == 0) return 0;
    return n + sum(n - 1);
}
```
It is easy to see from this code that the recursion depth grows linearly with the input n, so the time complexity is O(n).
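Expanding a small call makes the linear depth visible; for example, sum(3) unfolds as:

```
sum(3)
= 3 + sum(2)
= 3 + 2 + sum(1)
= 3 + 2 + 1 + sum(0)
= 3 + 2 + 1 + 0 = 6
```

That is n + 1 = 4 nested calls, each doing O(1) work.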
exponentiation
```cpp
// time complexity: O(logn)
double pow(double x, int n){
    if (n == 0) return 1.0;
    double t = pow(x, n / 2);
    if (n % 2) return x * t * t;
    return t * t;
}
```
The depth of the recursion is logn, because each call halves n, and the question is how many times n can be divided by 2 before reaching 0.
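A hand trace for pow(x, 10) shows the halving (this trace is mine, not part of the original code):

```
pow(x, 10)
└─ pow(x, 5)
   └─ pow(x, 2)
      └─ pow(x, 1)
         └─ pow(x, 0)   // base case, after about log2(10) ≈ 4 halvings
```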
(2) Complexity analysis of a recursion with multiple recursive calls
The harder case to analyze in recursive algorithms is a function that makes multiple recursive calls. Consider the code below, which makes two recursive calls.
```cpp
int f(int n){
    if (n == 0) return 1;
    return f(n - 1) + f(n - 1);
}
```
The number of times the function is called equals the number of nodes in its recursion tree.
For example, when n = 3, the number of calls can be calculated by
1 + 2 + 4 + 8 = 15
In general, the number of calls is
2^0 + 2^1 + 2^2 + … + 2^n = 2^(n+1) - 1 = O(2^n)
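To verify the 2^(n+1) - 1 count empirically, here is a small sketch that instruments the function above with a call counter (the global counter is added purely for illustration):

```cpp
#include <cstdio>

long long calls = 0;   // counts how many times f is entered

int f(int n){
    calls++;
    if (n == 0) return 1;
    return f(n - 1) + f(n - 1);
}

int main(){
    f(3);
    printf("%lld\n", calls);   // prints 15 = 2^(3+1) - 1
    return 0;
}
```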
The recursion tree of merge sort looks similar, but with two differences:

- The depth of the tree in the example above is n, whereas the recursion tree of merge sort has depth logn.
- In the example above, every node processes the same amount of data, whereas in merge sort the data size handled by each node shrinks level by level.
Therefore, in sorting algorithms such as merge sort, each level of the tree processes O(n) data in total, and there are logn levels, so the time complexity is O(nlogn).
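The same conclusion follows from the standard recurrence for merge sort (stated here for completeness):

T(n) = 2T(n/2) + cn = 4T(n/4) + 2cn = … = 2^k · T(n/2^k) + k · cn

With k = log₂ n the recursion bottoms out, giving T(n) = n · T(1) + cn · log₂ n = O(nlogn).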
Best- and worst-case time complexity
The function below returns the position of the first occurrence of the value x in an array; if no occurrence is found, it returns -1.
```cpp
int find(int array[], int n, int x) {
    for (int i = 0; i < n; i++) {
        if (array[i] == x) {
            return i;
        }
    }
    return -1;
}
```
In this case, when the first element of the array is x, the running time is O(1); when the last element is x (or x is absent), the running time is O(n).
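A couple of concrete calls illustrate the two extremes (the array contents are made up; this assumes the find definition above):

```cpp
int main(){
    int a[] = {4, 8, 15, 16, 23, 42};
    find(a, 6, 4);    // best case: x sits at index 0, one comparison, O(1)
    find(a, 6, 42);   // worst case: x sits at the last index, n comparisons, O(n)
    find(a, 6, 7);    // also O(n): x is absent, all n elements are checked
    return 0;
}
```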
Best-case time complexity is the complexity of the code under the most favorable input, i.e., the shortest running time; worst-case time complexity is the complexity under the least favorable input, i.e., the longest running time.
Average case time complexity
Best- and worst-case complexity describe extreme situations that occur with low probability, so they do not represent the typical behavior. To better describe the complexity of an algorithm in the average case, we introduce the average-case time complexity.
Average case time complexity can be expressed as a weighted average of the number of times the code executes under all possible scenarios.
Take the find function again. Suppose x is in the array with probability 1/2, and when it is present it is equally likely to be at each of the n positions, so each position has probability 1/(2n). Finding x at position i costs i comparisons, and a failed search costs n. The weighted average is then:

((1 + 2 + … + n) / n + n) / 2 = (3n + 1) / 4
The average time complexity of find is O(n).
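A quick numeric check with n = 3: each position has probability 1/(2·3) = 1/6 and costs 1, 2, or 3 comparisons, while a failed search has probability 1/2 and costs 3:

(1 + 2 + 3) / 6 + 3 / 2 = 1 + 1.5 = 2.5 = (3 · 3 + 1) / 4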
Amortized complexity analysis
We can understand amortized complexity through the push_back operation of a dynamic array.
```cpp
template <typename T>
class MyVector{
private:
    T* data;
    int size;       // number of elements currently stored in the array
    int capacity;   // capacity of the allocated array
    // O(n): copy every element into a newly allocated, larger array
    void resize(int newCapacity){
        T* newData = new T[newCapacity];
        for (int i = 0; i < size; i++){
            newData[i] = data[i];
        }
        delete[] data;   // release the old array to avoid a leak
        data = newData;
        capacity = newCapacity;
    }
public:
    MyVector(){
        data = new T[100];
        size = 0;
        capacity = 100;
    }
    // amortized complexity is O(1)
    void push_back(T e){
        if (size == capacity) resize(2 * capacity);
        data[size++] = e;
    }
    // amortized complexity is O(1)
    T pop_back(){
        size--;
        return data[size];
    }
};
```
push_back appends an element at the end of the array. If the array is not full, the element is written directly; if the array is full (size == capacity), the internal array is first doubled via resize and then the element is inserted.
For example, if the array capacity is n, each of the first n calls to push_back costs O(1). The (n + 1)-th call first needs n copy operations to move the existing elements, plus one insertion, so it costs O(n).
Spreading this cost evenly: for a dynamic array of capacity n, inserting the first n elements takes n units of time and the expansion takes another n, about 2n in total. The amortized cost per operation is therefore O(2n / n) = O(2), which is O(1).
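Concretely, with capacity n = 8:

8 × 1 + (8 + 1) = 17 operations over 9 push_back calls → 17 / 9 ≈ 2 = O(1) per call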
This suggests an interesting conclusion: as long as a relatively expensive operation is guaranteed not to be triggered on every call, its cost can be amortized over the other, cheaper operations.