This is the first day of my participation in the August Challenge. For details, see: the August Challenge.

1. Introduction

Anyone who practices algorithm problems knows that algorithms vary in quality, and the ultimate goal when solving a problem is to find an optimal solution. So how do we judge the quality of an algorithm? By how long it takes to run and how much memory it occupies, which correspond to time complexity and space complexity respectively.

2. Measurement

Since there is good and bad, there must be a way to measure it. We commonly use big O notation to describe an algorithm's time complexity (also called asymptotic time complexity) and space complexity (also called asymptotic space complexity). So what is big O notation? From the perspective of time complexity, it is defined as follows:

If there is a function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be a function of the same order of magnitude as T(n). This is written as

T(n) = O(f(n)), where O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.
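
For instance (an illustrative calculation of my own): if T(n) = 3n² + 2n + 1 and we pick f(n) = n², then T(n)/f(n) = 3 + 2/n + 1/n², which tends to the nonzero constant 3 as n approaches infinity, so T(n) = O(n²).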

Common time/space complexity levels are:

  • Constant order O(1)
  • Logarithmic order O(logN)
  • Linear order O(n)
  • Linear logarithmic order O(nlogN)
  • Square order O(n²)
  • Cubic order O(n³)
  • K-th power order O(n^k)
  • Exponential order O(2^n)

Here are some examples of common time/space complexity.

3. Time complexity

In big O notation, the formula for time complexity is T(n) = O(f(n)), where f(n) is the total number of times the lines of code are executed, and O indicates that the running time T(n) is directly proportional to f(n).
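
As a quick illustration of how f(n) is counted (a sketch of my own, not from the original text), take a simple loop:

sum = 0;            // runs 1 time
for(i=1; i<=n; ++i) // the loop condition is checked n+1 times
{
    sum = sum + i;  // runs n times
}
// f(n) ≈ 1 + (n+1) + n = 2n + 2; keeping only the dominant term gives T(n) = O(n)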

3.1. Constant order O(1)

No matter how many lines of code are executed, the time complexity of the code is O(1) as long as there are no complex structures such as loops. For example:

int i = 1;
int j = 2;
++i;
j++;
int m = i + j;

3.2. Linear order O(n)

Linear order generally means a loop (or recursion) that runs n times, for example:

for(i=1; i<=n; ++i)
{
   j = i;
   j++;
}

3.3. Logarithmic order O(logN)

Let’s look at the code first:


int i = 1;
while(i<n)
{
    i = i * 2;
}

As you can see from the code above, each pass through the while loop multiplies i by 2, so i gets closer and closer to n. Suppose the loop ends after x iterations, when i first reaches or exceeds n; that means 2^x = n, so x = log₂n. In other words, the loop runs about log₂n times before the code finishes. For example, with n = 16, i takes the values 1, 2, 4, 8, 16, so the loop body runs 4 = log₂16 times. The time complexity of this code is therefore O(logN).

3.4. Linear logarithmic order O(nlogN)

Linear logarithmic order O(nlogN) is fairly easy to understand: if you run a piece of O(logN) code n times, the total cost is n * O(logN), which is O(nlogN).

Take the above code with a few modifications:

for(m=1; m<n; m++)
{
    i = 1;
    while(i<n)
    {
        i = i * 2;
    }
}

3.5. Order O(n²)

Square order O(n²) is also easy to understand: if you nest the O(n) code above inside another O(n) loop, the time complexity becomes O(n²). For example:

for(x=1; x<=n; x++)
{
    for(i=1; i<=n; i++)
    {
        j = i;
        j++;
    }
}

4. Space complexity

Space complexity measures the amount of storage space an algorithm temporarily occupies while running; like time complexity, it describes a growth trend and is written as S(n) = O(f(n)).

The most common space complexities are O(1), O(n), and O(n²). Let's look at some examples.

4.1. Constant space

When the amount of storage space an algorithm needs is fixed and has no direct relationship to the input size, the space complexity is denoted O(1). For example:


int i = 1;
int j = 2;
++i;
j++;
int m = i + j;

4.2. Linear space

When the space allocated by the algorithm is a linear collection (such as an array) and the size of that collection is proportional to the input size n, the space complexity is denoted O(n).

Here are some examples:

    void fun(int n){
        int[] array = new int[n];
        //....
    }

Note: recursion also uses linear space, because each level of the recursion keeps a frame on the call stack (see the sketch below).
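
As a minimal sketch of this (my own example, not from the original), a recursion that goes n levels deep keeps n stack frames alive at the same time, so it uses O(n) space even though it allocates no arrays:

    void fun(int n){
        if (n <= 0) return;  // base case: stop after n levels
        fun(n - 1);          // each call adds one stack frame, so n frames in total
    }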

4.3. Two-dimensional space

When the space allocated by the algorithm is a two-dimensional array whose length and width are both proportional to the input size n, the space complexity is denoted O(n²). For example:


    void fun(int n){
        int[][] array = new int[n][n];
        //....
    }

5. The choice between time complexity and space complexity

The fundamental reason we put so much effort into evaluating an algorithm's time and space complexity is that a computer's speed and memory are both limited. Time complexity and space complexity are often in conflict: lowering an algorithm's time complexity usually requires using more space, and reducing the extra space used usually increases the running time. So we have to make a trade-off between the two, and most of the time we choose to sacrifice space to gain time, because time cannot be reclaimed, while memory can be reused while the program runs.
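
As an illustrative sketch of that trade-off (my own example, not from the original text), consider checking whether an array contains a duplicate value. A nested loop uses O(1) extra space but O(n²) time; remembering the values already seen in a HashSet raises the space to O(n) but cuts the time to O(n):

import java.util.HashSet;

class DuplicateCheck {
    // O(n^2) time, O(1) extra space: compare every pair of elements
    static boolean hasDuplicateSlow(int[] a) {
        for (int x = 0; x < a.length; x++) {
            for (int y = x + 1; y < a.length; y++) {
                if (a[x] == a[y]) return true;
            }
        }
        return false;
    }

    // O(n) time, O(n) extra space: remember values already seen
    static boolean hasDuplicateFast(int[] a) {
        HashSet<Integer> seen = new HashSet<>();
        for (int v : a) {
            if (!seen.add(v)) return true;  // add() returns false if v was already present
        }
        return false;
    }
}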

Reference: Comic Algorithm