This series of articles is written in the form of notes. The main content is derived from The Beauty of Data Structures and Algorithms, written by Professor Wang Zheng.

Best and worst time complexity

What is the best-case time complexity

The best-case time complexity is the time complexity of executing this code in the best case.

What is the worst-case time complexity

The worst-case time complexity is the time complexity of executing this code in the worst case.

See the following example:

    /**
     * Finds the position of the specified number in the array
     * @param array the array to search
     * @param x the number to look for in the array
     * @return the position of x, or -1 if it is not found
     */
    int find(int[] array, int x) {
        int i = 0;
        int pos = -1;
        int n = array.length;
        for (; i < n; ++i) {
            if (array[i] == x) {
                pos = i;
                break;
            }
        }
        return pos;
    }

If the first element in the array happens to be the variable x we’re looking for, then we don’t need to traverse the remaining n-1 elements, and the time complexity is O(1). But if the array does not contain x at all, then we need to traverse the entire array, and the time complexity is O(n).

So the best time complexity in this example is O(1), and the worst time complexity is O(n).
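
As a minimal illustration (the array contents here are made up), two calls to find that hit the two extremes:

    int[] array = {3, 7, 1, 9, 5};
    find(array, 3); // best case: x is the first element, found after one comparison -- O(1)
    find(array, 4); // worst case: x is not in the array, all n elements are checked -- O(n)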

Average time complexity

What is average time complexity

The average-case time complexity is the time it takes to execute this code, averaged over all possible cases.

In the example above, there are n+1 possible cases for the position of the variable x: it is at one of the positions 0 through n-1, or it is not in the array at all. We add up the number of elements that must be traversed in each case and divide by n+1 to get the average number of elements traversed, that is:
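
Spelled out: if x is at position i we traverse i+1 elements, and if x is not in the array we traverse all n elements, so the average is:

    \frac{1 + 2 + 3 + \cdots + n + n}{n+1} = \frac{n(n+3)}{2(n+1)}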

Omitting the coefficients, constants, and lower-order terms in big O notation, the average time complexity is O(n).

This calculation is actually somewhat problematic, because the n+1 cases do not all occur with the same probability. A weighted average (expected value) is involved here, so I won’t expand on it; if you’re interested, you can look into it yourself.
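
As a sketch of that weighted calculation, under one common simplifying assumption, that x is in the array with probability 1/2 and, when it is, is equally likely to be at each of the n positions (so each position has probability 1/(2n)), the expected number of elements traversed is:

    1 \cdot \frac{1}{2n} + 2 \cdot \frac{1}{2n} + \cdots + n \cdot \frac{1}{2n} + n \cdot \frac{1}{2} = \frac{3n+1}{4}

which is still O(n).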

Amortized time complexity

What is amortized time complexity

Consider a sequence of consecutive operations on a data structure where the time complexity is very low in most cases, high in only a few cases, and the high-cost operations have a coherent temporal relationship with the operations around them. In this situation, we can analyze the group of operations together and see whether the time consumed by the high-complexity operations can be amortized over the other, lower-complexity operations. Moreover, in cases where amortized analysis can be applied, the amortized time complexity generally equals the best-case time complexity.

 // array is an array of length n
 // array.length is equal to n
 int[] array = new int[n];
 int count = 0;

 void insert(int val) {
    // When the array is full, sum up all its elements,
    // store the sum at position 0, and reset count: O(n)
    if (count == array.length) {
       int sum = 0;
       for (int i = 0; i < array.length; ++i) {
          sum = sum + array[i];
       }
       array[0] = sum;
       count = 1;
    }

    // Otherwise just place the value in the next free slot: O(1)
    array[count] = val;
    ++count;
 }

You can see from the code that in most cases the time complexity of insert is O(1); only in rare cases, when the array is full, is it O(n). The O(1) and O(n) inserts occur in a very regular pattern, with a clear sequential relationship between them: generally, one O(n) insert is followed by n-1 O(1) insert operations, and the cycle repeats.

So if you amortize the one operation that takes more time over the next n-1 operations that take less time, the amortized time complexity of this group of consecutive operations is O(1). That’s the general idea of amortized analysis.
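
In arithmetic terms: one cycle consists of a single insert that costs n (the summing loop) and n-1 inserts that cost 1 each, so the average cost per operation over the cycle is:

    \frac{n + (n-1) \cdot 1}{n} = \frac{2n-1}{n} < 2

which is O(1).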

Amortized time complexity is a special kind of average time complexity, and we don’t need to spend too much effort distinguishing them.

Other articles in this series: Table of contents for Data Structures and Algorithms