Last year, I set myself a small goal of saving 30 thousand yuan. When I tallied up at year's end, I was still 50 thousand yuan short.

This first part shares knowledge about algorithms and data structures with you. Enough talk, let's start with Day 1.

Concept

The time complexity of an algorithm is a measure of the time the algorithm consumes. It is usually modeled with big O notation, O(f(n)): as n grows, the more slowly O(f(n)) grows, the lower the algorithm's time complexity.

Deriving big O

The key to big O notation is the order of growth. The steps for deriving big O are:

1. Replace all additive constants in the running time with the constant 1.
2. In the modified running-time function, keep only the highest-order term.
3. If the highest-order term exists and its coefficient is not 1, remove the coefficient.

The result is big O.
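A quick worked illustration, using a made-up running-time function f(n) = 3n^2 + 2n + 5:

Step 1 replaces the additive constant 5 with 1, giving 3n^2 + 2n + 1.
Step 2 keeps only the highest-order term, giving 3n^2.
Step 3 removes the coefficient 3, giving n^2. So f(n) is O(n^2).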

Examples

Constant order

var a = 1, b = 1; // Executes once
var sum = a + b; // Executes once

The statements execute twice in total. By step 1 of the derivation, the constant 2 is replaced with 1; there are no higher-order terms, so steps 2 and 3 do not apply. The time complexity here is O(1).

Linear order

for(var i = 0; i < 2*n + 3; i++){ // Executes 2*n+3 times
  sum += n;
}

The loop executes 2n+3 times. Step 1 turns 2n+3 into 2n+1; step 2 keeps only 2n (why? since n = n^1 and 1 = n^0, n is the highest-order term); step 3 removes the coefficient 2, leaving n. So the time complexity here is O(n).

Logarithmic order

var count = 1;
while(count < n){
  count = count * 2;
}

Suppose the loop runs x times. When it ends, 2^x = n, so x = log₂n, and the time complexity is O(logn). For example, with n = 8 the loop runs 3 times (count goes 1 → 2 → 4 → 8), and log₂8 = 3.

Quadratic order

for(var i = 0; i < n; i++){
  for(var j = 0; j < n; j++){
    // Statements with time complexity O(1)
  }
}

The inner statements execute n * n = n^2 times (the outer loop runs n times, and for each outer iteration the inner loop runs n times), so the time complexity is O(n^2).

Time complexity comparison

Common time complexities, ordered from least to most time consumed: O(1) < O(logn) < O(n) < O(n*logn) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n)
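To make the ordering concrete, here is a small sketch of my own (assuming base-2 logarithms and picking n = 10) that prints roughly how many basic operations each class implies:

var n = 10;
console.log(1);                // O(1)      -> 1
console.log(Math.log2(n));     // O(logn)   -> ~3.32
console.log(n);                // O(n)      -> 10
console.log(n * Math.log2(n)); // O(n*logn) -> ~33.22
console.log(n * n);            // O(n^2)    -> 100
console.log(n * n * n);        // O(n^3)    -> 1000
console.log(Math.pow(2, n));   // O(2^n)    -> 1024
// n! and n^n grow even faster: 10! = 3628800 and 10^10 = 10000000000

Even at n = 10 the gap is already huge, which is why reducing the order of an algorithm matters far more than shaving constants.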

Thanks for reading! Feel free to follow this series; it will be continuously updated…