The last article, JavaScript algorithm complexity analysis, introduced complexity analysis; by now you should be able to work out the time or space complexity of common code.

A quick test

Without further ado, here is a question to test you: analyze the time complexity of the following code (PS: you wouldn't normally write it this way).

function find(n, x, arr) {
  let ind = -1;
  for (let i = 0; i < n; i++) {
    if (arr[i] === x) ind = i;
  }
  return ind;
}

The function above checks whether a value x is in arr, returning its index if it is and -1 otherwise. From the analysis in the last article, its time complexity is easy to determine: it is O(n).
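As a quick illustration, here is a hypothetical call to this function (the sample array and values are made up for this article):

// Hypothetical usage of the first version of find.
const sample = [4, 8, 15, 16, 23, 42];
console.log(find(sample.length, 15, sample)); // 2  (15 is at index 2)
console.log(find(sample.length, 7, sample));  // -1 (7 is not in the array)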

Next, we’ll tweak the find function a little so that if we find the target, we don’t need to go any further.

Please analyze the time complexity of the optimized function:

function find(n, x, arr) {
  let ind = -1;
  for (let i = 0; i < n; i++) {
    if (arr[i] === x) {
      ind = i;
      break;
    }
  }
  return ind;
}

Is the time complexity of this code still O(n)? It depends; the analysis method from the previous article alone cannot answer that.

Different cases

The value x you're looking for can appear anywhere in the array. If x happens to be the first element, the loop breaks immediately, there is no further iteration, and the time complexity is O(1). But if it happens to be the last element, or is not in the array at all, you have to traverse the entire array, and the complexity is O(n). So the time complexity of this function differs from case to case.
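To make this concrete, here is a small sketch of my own (the steps counter and sample data are additions for illustration, not part of the original code) that counts how many loop iterations the optimized find performs for different inputs:

// Instrumented copy of the optimized find that counts loop iterations.
function findCounted(n, x, arr) {
  let ind = -1;
  let steps = 0;
  for (let i = 0; i < n; i++) {
    steps++;
    if (arr[i] === x) {
      ind = i;
      break;
    }
  }
  return { ind, steps };
}

const data = [7, 1, 9, 3, 5];
console.log(findCounted(data.length, 7, data)); // { ind: 0, steps: 1 }  -> best case
console.log(findCounted(data.length, 5, data)); // { ind: 4, steps: 5 }  -> x is the last element
console.log(findCounted(data.length, 8, data)); // { ind: -1, steps: 5 } -> x is not in the array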

So, to represent the different time complexities of code in different cases, you need to understand three concepts: best-case time complexity, worst-case time complexity, and average-case time complexity.

The best case

Best-case time complexity: the time complexity of executing the code in the most favorable case. For example, in the best case the value x is the first element of the array, and the resulting complexity, O(1), is the best-case time complexity.

The worst case

Worst-case time complexity: the time complexity of executing the code in the least favorable case. For example, if the value x is the last element of the array or is not in the array at all, the search function has to traverse the whole array, and that complexity, O(n), is the worst-case time complexity.

The average case

However, best-case and worst-case time complexity describe the code's complexity under extreme inputs, which are not very likely to occur.

To better represent average case complexity, we introduce another concept: average case time complexity, abbreviated as average time complexity.

How to analyze the average time complexity? Let’s look at the previous lookup function:

The value x can be in any of n + 1 situations: at index 0 through n - 1, or not in the array at all. Add up the number of elements traversed in each case, divide by n + 1, and you get the average number of elements traversed: (1 + 2 + 3 + … + n + n) / (n + 1).

According to the previous article, coefficients, lower-order terms, and constants can be omitted in big O notation. So, after simplifying this formula, the average time complexity is O(n).
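As a rough numeric check of that unweighted average, you can add up the cases directly; the helper below is only an illustration (its name and the sample n are my own choices):

// Average number of elements traversed over the n + 1 cases,
// each case counted once: (1 + 2 + ... + n + n) / (n + 1).
function averageSteps(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) total += i; // found at index i - 1 costs i steps
  total += n;                              // not found costs n steps
  return total / (n + 1);
}

console.log(averageSteps(1000)); // ≈ 501, roughly n / 2, so still O(n)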

However, the calculation above ignores probability: the n + 1 cases are not all equally likely, so it has to be redone, as follows:

The value x you're looking for is either in the array or not in the array at all. To keep things simple, assume each of these two cases has probability 1/2. In addition, the probability that the value appears at each of the n positions from 0 to n - 1 is the same, 1/n. So, by the product rule of probability, the probability that the value appears at any particular position from 0 to n - 1 is 1/(2n). Taking the probability of each case into account, the expression becomes: 1 × 1/(2n) + 2 × 1/(2n) + 3 × 1/(2n) + … + n × 1/(2n) + n × 1/2 = (3n + 1)/4.

This result is what probability theory calls a weighted average (or expected value), and after dropping coefficients and constants, the average time complexity of the function is still O(n).
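For completeness, the same weighted calculation can be written out in code; this is only a sketch assuming the 1/2 and 1/(2n) probabilities described above:

// Position i (0-based) costs i + 1 comparisons with probability 1 / (2n);
// "not in the array" costs n comparisons with probability 1 / 2.
function weightedAverageSteps(n) {
  let expected = 0;
  for (let i = 0; i < n; i++) expected += (i + 1) / (2 * n);
  expected += n / 2;
  return expected; // equals (3n + 1) / 4
}

console.log(weightedAverageSteps(1000)); // 750.25, i.e. (3 * 1000 + 1) / 4 -> O(n)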

Looked at this way, computing average time complexity is fairly cumbersome, because it requires working out probabilities. In fact, in most cases we do not need to distinguish between best-case, worst-case, and average-case time complexity; a single complexity is usually enough. We only fall back on these three representations when the time complexity of the same piece of code differs by an order of magnitude across cases.

The amortized case

Let's look at another concept, a special kind of average time complexity: amortized time complexity.

Let’s look at a particular function and examine its time complexity:

// n is assumed to be defined elsewhere
var arr = new Array(n);
var ind = 0;

function add(num) {
  if (ind === arr.length) {   // the array is full
    var sum = 0;
    for (var i = 0; i < arr.length; i++) {
      sum += arr[i];
    }
    arr[0] = sum;             // store the sum at the first position
    ind = 1;
  }
  arr[ind] = num;             // insert the new value
  ind++;
}

The add function adds data to an array. We first create an array of length n, then keep inserting data into it. When ind === arr.length, i.e. the array is full, we sum all of its elements with a for loop, store the sum at the first position of the array, and then insert the new value. If the array still has free space, the new value is inserted directly.
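Here is a quick usage sketch, assuming the arr, ind, and add defined above and an arbitrarily chosen n = 3 (the concrete values are mine, just for illustration):

// Assume: var n = 3; plus the arr, ind, and add definitions above.
add(1); add(2); add(3); // three O(1) inserts: arr is [1, 2, 3], ind is 3
add(4);                 // the array is full: the sum 1 + 2 + 3 = 6 is stored at arr[0],
                        // then 4 is inserted, so arr becomes [6, 4, 3] and ind is 2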

Let’s analyze the time complexity of this function:

In the best case, there is still free space in the array, and all we need to do is write the data at index ind, so the best-case complexity is O(1). In the worst case, there is no space left, so we first have to sum the whole array and only then insert the data, so the worst-case complexity is O(n).

Next, let's work out the average time complexity:

Since the length of the array is n, there are n cases depending on where the data is inserted, and each of them has time complexity O(1). In addition, there is one special case: inserting data when there is no free space left in the array, which takes O(n). All n + 1 cases are assumed to occur with the same probability, 1/(n + 1).

Taking the weighted average and applying big O notation, the average time complexity is O(1).
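That weighted average can also be written out directly; the helper name below is my own:

// n cases cost 1 operation each, one case costs n operations,
// and all n + 1 cases are assumed equally likely (probability 1 / (n + 1)).
function addAverageCost(n) {
  return (n * 1 + 1 * n) / (n + 1); // = 2n / (n + 1), which approaches the constant 2
}

console.log(addAverageCost(1000)); // ≈ 1.998 -> a constant, so O(1)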

In fact, the average complexity of the add function doesn't need such an elaborate calculation. Compare find and add:

  • The find function is O(1) only in extreme cases, whereas the add function is O(1) in most cases; only occasionally does its complexity rise to O(n).
  • For the add function, the O(1) insertions and the O(n) insertion occur with a very regular frequency and in a fixed order: in general, one O(n) insertion is followed by n - 1 O(1) insertions, and the cycle repeats.

Therefore, for this kind of special scenario we do not need to enumerate all the input cases and their probabilities and then compute a weighted average, as the average-complexity analysis above did.

For this kind of special case, we introduce a simpler analysis method: amortized analysis. The time complexity it yields is called amortized time complexity.

How do we use amortized analysis to work out an algorithm's amortized time complexity?

Let's go back to the add function. Each O(n) insertion is followed by n - 1 O(1) insertions, so if we spread the cost of the one expensive operation across the n - 1 cheap operations that follow it, the amortized time complexity of this whole group of consecutive operations is O(1). That is the general idea of amortized analysis.
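As a rough sanity check of this idea, you can count the basic operations performed over a long run of insertions and divide by the number of calls; the instrumented copy below is my own addition, not part of the original code:

// Count element reads/writes over m calls to an instrumented copy of add.
function measureAmortizedCost(n, m) {
  var arr = new Array(n);
  var ind = 0;
  var ops = 0;

  function add(num) {
    if (ind === arr.length) {
      var sum = 0;
      for (var i = 0; i < arr.length; i++) {
        sum += arr[i];
        ops++;
      }
      arr[0] = sum;
      ind = 1;
    }
    arr[ind] = num;
    ind++;
    ops++;
  }

  for (var k = 0; k < m; k++) add(k);
  return ops / m; // average operations per insertion
}

console.log(measureAmortizedCost(1000, 100000)); // ≈ 2 -> amortized O(1)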

The general situation is summarized as follows:

When a sequence of consecutive operations on a data structure has low time complexity in most cases and high time complexity in only a few cases, and the cheap and expensive operations follow a coherent, predictable order, we can analyze the group of operations as a whole and see whether the cost of the few expensive operations can be amortized over the many cheap ones. Moreover, wherever amortized analysis can be applied, the amortized time complexity is generally equal to the best-case time complexity.

An everyday example

Here is how complexity analysis can be mapped onto everyday life:

Say you are visiting Lao Wang's house today, but his wife has just sent him out to buy soy sauce and given him at most n minutes to do it. The shop is right downstairs, so the round trip takes him as little as one minute; in the "best case" you only have to wait one minute. But things can go wrong: the elevator might be out of service, he might trip on the way, who knows, and he could use up the full n minutes his wife allowed; that is the "worst case". "Average time complexity" means he could come back in minute 1, 2, 3, …, or n, so the average wait is (1 + 2 + 3 + … + n) / n: sum the times of all possible cases and divide by the number of cases. "Amortized time complexity" spreads the longer waits over the shorter ones to get a middle value: if n is 10 minutes, a 9-minute wait pairs with a 1-minute wait, an 8-minute wait with a 2-minute wait, and so on, each pair averaging out to about 5 minutes.

Conclusion

Four concepts

  • Best-case time complexity: The time complexity of code execution in the best-case scenario.
  • Worst-case time complexity: The time complexity of code execution in the worst-case scenario.
  • Average-case time complexity: the weighted average of the code's execution cost over all possible cases.
  • Amortized time complexity: when most executions of the code take best-case time, only a few take worst-case time, and the two occur in a regular sequence, the occasional worst-case cost can be amortized over the best-case operations. In general, the amortized result equals the best-case time complexity.

Why they are introduced

  • These four concepts are introduced to describe the time complexity of code more comprehensively and accurately.
  • The distinction between these four types of complexity is made only when code complexity varies by an order of magnitude in different cases. In most cases, there is no need to distinguish between them.

Final words

If you spot any mistakes or typos, please leave me a message pointing them out. Thank you.

We’ll see you next time.