Last time we covered time complexity and space complexity, along with some analysis techniques and common complexity classes such as O(1), O(log n), O(n), and O(n log n). Today we continue with four more refined notions of time complexity:
1. Best-case time complexity
2. Worst-case time complexity
3. Average-case time complexity
4. Amortized time complexity
Complexity analysis
```java
public int findGirl(int[] girlArray, int number) {
    int pos = -1;
    int n = girlArray.length; // length is a field on Java arrays, not a method
    for (int i = 0; i < n; ++i) {
        if (girlArray[i] == number) {
            pos = i;
            break;
        }
    }
    return pos;
}
```
The logic should be easy to read: the code searches an unsorted array for a given number and returns -1 if it is not found. Picture the film Flirting Scholar (Tang Bohu Dian Qiuxiang): Tang Bohu wants to find Qiuxiang among a row of girls, and since we have not learned any clever algorithms yet, he can only check from start to end, that is, traverse the array. girlArray holds Qiuxiang, Dongxiang, Chunxiang, and the rest, and Tang Bohu checks each girl's code number to see whether it is Qiuxiang's.
The time complexity of this code differs from case to case, so to describe the cost in each situation we introduce ==best-case, worst-case, and average-case time complexity==. Let n be the length of girlArray.
- If Qiuxiang is first in line, the code runs in O(1).
- If Qiuxiang is last in line, the code runs in O(n).
- If Qiuxiang is somewhere in between, the cost depends on her position.
- If the Hua household lied and Qiuxiang is not in line at all, Tang Bohu still has to check everyone one by one, so the cost is again O(n) (see the sketch below).
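To make these cases concrete, here is a minimal sketch that exercises the fixed findGirl above. The code numbers are made up for illustration (9527 is just sample data):

```java
public class FindGirlDemo {
    // Linear search: returns the index of number in girlArray, or -1 if absent.
    static int findGirl(int[] girlArray, int number) {
        for (int i = 0; i < girlArray.length; ++i) {
            if (girlArray[i] == number) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] girlArray = {9527, 9528, 9529, 9530}; // hypothetical code numbers
        System.out.println(findGirl(girlArray, 9527)); // best case: first slot, 1 comparison -> O(1)
        System.out.println(findGirl(girlArray, 9530)); // worst in-array case: n comparisons -> O(n)
        System.out.println(findGirl(girlArray, 1234)); // absent: n comparisons, returns -1 -> O(n)
    }
}
```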
Best-case time complexity

The best-case time complexity is the running time under the most favorable input, i.e. Tang Bohu spots Qiuxiang right away. If the row of girls is the girlArray array and number is Qiuxiang's code, then when the first girl is Qiuxiang the cost is O(1).
Worst-case time complexity

The worst-case time complexity is the running time under the least favorable input: Qiuxiang is last in line, or not there at all, so we scan the entire array, which costs O(n).
Average-case time complexity

In practice, the best and worst cases are extremes with a low probability of occurring. To represent the typical cost more accurately, we introduce another concept: average-case time complexity.
==There are n + 1 cases==: the number sits at one of the positions 0 to n − 1 (n cases), or it is not in the array at all (1 more case). Each case traverses a different number of girls. Add up the comparisons needed in every case and divide by the number of cases, n + 1, to get the average. Knock on the board: average-case complexity = (sum of traversal counts over all cases) / (number of cases).

Derivation:

$$\frac{1 + 2 + 3 + \cdots + n + n}{n + 1} = \frac{n(n + 3)}{2(n + 1)}$$

By the big-O rules we learned last time, we drop the coefficients, low-order terms, and constants, so the average-case time complexity is O(n).
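If you want to sanity-check that figure numerically, here is a quick sketch, assuming the same n + 1 equally weighted cases (the class name and the choice n = 1000 are made up for illustration):

```java
public class AverageCaseCheck {
    public static void main(String[] args) {
        int n = 1000;
        long totalComparisons = 0;
        // n cases: the target sits at position 0 .. n-1, costing i + 1 comparisons.
        for (int i = 0; i < n; ++i) {
            totalComparisons += i + 1;
        }
        // 1 more case: the target is absent, costing n comparisons.
        totalComparisons += n;
        // Matches n(n + 3) / (2(n + 1)): roughly n / 2, i.e. linear in n.
        double average = (double) totalComparisons / (n + 1);
        System.out.println("average comparisons: " + average); // ~501.0 for n = 1000
    }
}
```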
Expected time complexity
The derivation above **does not take the probability of each case into account**. The n + 1 cases are not necessarily equally likely, so for a sharper analysis we should weight each case by its probability.
Qiuxiang's number is either somewhere in the range 0 to n − 1 or not in the array at all. Assuming the two situations are equally likely, each has probability $\frac{1}{2}$.
At the same time, the probability of number sitting at each of the positions 0 to n − 1 is 1/n. By the product rule, the probability that number sits at any particular position between 0 and n − 1 is $\frac{1}{2} \times \frac{1}{n} = \frac{1}{2n}$.
Therefore, building on the earlier derivation and taking the probability of each case into account, the average-case time complexity works out as:

$$1 \times \frac{1}{2n} + 2 \times \frac{1}{2n} + 3 \times \frac{1}{2n} + \cdots + n \times \frac{1}{2n} + n \times \frac{1}{2} = \frac{3n + 1}{4}$$
This is the weighted average in probability theory, also called the expected value, so the full name for average time complexity is the weighted average time complexity or expected time complexity.
After introducing probability, the average complexity becomes $O\left(\frac{3n + 1}{4}\right)$; ignoring the coefficient and constant, the final weighted average time complexity is still O(n). You can breathe a sigh of relief.
Note:
In most cases we do not need to distinguish best-case, worst-case, and average-case time complexity. Only when the same piece of code differs by an order of magnitude across cases do we distinguish the three, so as to describe its time complexity more precisely.
Amortized time complexity
Now for the last piece of the puzzle. It is much easier to follow once you understand the probability-weighted expected time complexity above, since amortized time complexity even sounds a bit like average time complexity.
Amortized complexity is a more advanced concept: it is a special case of average complexity that applies in more specific, limited scenarios. The corresponding analysis method is called amortized analysis.
```java
// array is an array of length n, i.e. array.length == n
int[] array = new int[n];
int count = 0;

public void insert(int val) {
    if (count == array.length) {      // the array is full
        int sum = 0;
        for (int i = 0; i < array.length; ++i) {
            sum = sum + array[i];     // O(n) pass: sum all elements
        }
        array[0] = sum;               // store the sum at position 0
        count = 1;
    }
    array[count] = val;               // O(1) insert at the next free slot
    ++count;
}
```
Code logic: insert() puts a value into the array. When the array is full (count == array.length), it first traverses the array to sum its elements, stores the sum at position 0, and then inserts the new value. If the array still has free space, the value is inserted directly. A note on "full": a slot in reusable storage counts as empty as long as the user treats it as empty; there is no need to physically overwrite everything with zeros, because the user only cares about the newly saved values.
Analyzing the time complexity of the code above:

- Best case: there is free space and the value goes straight into the slot at index count, which is O(1).
- Worst case: the array has no free space, so we traverse and sum it first and then insert, which is O(n) (demonstrated in the sketch below).
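Here is a small hypothetical harness (the class name and log format are made up for illustration) that wraps insert() and logs whenever the O(n) summation fires, so you can see how often the expensive path actually runs:

```java
public class AmortizedInsertDemo {
    private int[] array;
    private int count = 0;

    AmortizedInsertDemo(int n) {
        array = new int[n];
    }

    void insert(int val) {
        if (count == array.length) {
            int sum = 0;                  // expensive path: O(n) summation
            for (int i = 0; i < array.length; ++i) {
                sum += array[i];
            }
            array[0] = sum;
            count = 1;
            System.out.println("O(n) insert: summed " + array.length + " elements");
        }
        array[count] = val;               // cheap path: O(1)
        ++count;
    }

    public static void main(String[] args) {
        AmortizedInsertDemo demo = new AmortizedInsertDemo(4);
        for (int v = 1; v <= 12; ++v) {
            demo.insert(v);
        }
    }
}
```

With capacity 4, the expensive path fires only on the 5th, 8th, and 11th inserts; every other insert is O(1).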
Average time complexity
The array length is n, and since a direct insert can land at any of the n positions, there are n cases, each costing O(1).
Then there is the special case with no free space to insert into, which costs O(n). That makes n + 1 cases in total, each occurring with probability $\frac{1}{n + 1}$. By the weighted-average method, the average time complexity is:

$$1 \times \frac{1}{n + 1} + 1 \times \frac{1}{n + 1} + \cdots + 1 \times \frac{1}{n + 1} + n \times \frac{1}{n + 1} = \frac{2n}{n + 1}$$

Omitting the coefficients and constants, the average time complexity is O(1).
In fact, we do not need anything this elaborate. Compare findGirl with insert:

- findGirl hits O(1) only in the extreme (best) case, whereas insert is O(1) in the common case and O(n) only when the array is full.
- For insert(), the O(1) and O(n) operations occur with a very regular frequency and in a fixed sequence: in general, each O(n) insert is followed by n − 1 O(1) inserts, cycling over and over.
Amortized analysis
The average-complexity analysis of the example above does not actually need to be this complex; no probability theory is required.
As the analysis shows, the code above is O(1) in the vast majority of cases and O(n) only in the extreme case, and the costs follow a regular pattern: in general, one O(n) operation is followed by n O(1) operations. For this particular scenario there is a simpler analysis method: amortized analysis.
The time complexity obtained through amortized analysis is called amortized time complexity.
The general idea: since every O(n) operation is followed by n O(1) operations, we spread (amortize) the cost of the time-consuming operation across the cheap ones, and the amortized cost per operation comes out to O(1).
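Spelled out as a worked equation (the aggregate view), assuming the pattern of one O(n) operation per n O(1) operations:

$$\frac{\overbrace{1 + 1 + \cdots + 1}^{n\ \text{cheap ops}} + n}{n + 1} = \frac{2n}{n + 1} < 2, \quad \text{i.e. } O(1)$$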
Application scenario: amortized time complexity and amortized analysis fit a special setting: a sequence of consecutive operations on one data structure whose cost is low in most cases and high only in a few, where the operations have a coherent sequential relationship.
In that setting we analyze the whole group of operations together, amortizing the occasional high cost over the many low-cost operations, so the amortized time complexity generally equals the best-case time complexity.
Note: amortized time complexity is a special kind of average time complexity (used in special application scenarios), and we do not need to spend much effort distinguishing the two. What matters most is mastering the analysis method itself: amortized analysis. Whether the result is called average or amortized is just terminology.
Food for thought
One last exercise. Using what this article covered, analyze the best-case, worst-case, and amortized time complexity of the following code.
```java
// Global variables: an array of size 10, its length len, and the next free index i.
int array[] = new int[10];
int len = 10;
int i = 0;

// Add an element to the array
void add(int element) {
    if (i >= len) { // the array has run out of space
        // Apply for a new array with twice the capacity
        int new_array[] = new int[len * 2];
        // Copy the data from the original array into new_array
        for (int j = 0; j < len; ++j) {
            new_array[j] = array[j];
        }
        // Point array at new_array; the capacity has doubled, so update len
        array = new_array;
        len = 2 * len;
    }
    // Place element at index i, then advance i
    array[i] = element;
    ++i;
}
```
The general idea: add an element to the array; when it runs out of space, allocate a new array with twice the capacity and copy the old data over.
Related question: when a HashMap reaches its 0.75 load factor, it has to expand to twice its capacity and move its elements into the new array before insertion can proceed. What is the time complexity in that case?
Reference: The Beauty of Data Structures and Algorithms