Introduction

Do front-end developers really have to learn algorithms? Yes, and seriously. Data structures and algorithms are now standard fare in interviews at major tech companies; if you can't handle them, you are basically ruled out of those companies.

Although front-end developers rarely write algorithms in day-to-day work, understanding them greatly improves our ability to read source code when we dig into front-end frameworks, languages, and open source libraries. For example, the diff algorithm in React, tree-shaking in Webpack, and the call stack and message queue in V8 all rely heavily on algorithms. Understanding them helps you reason about performance, solve problems more efficiently, advance to a higher level, and earn more.

There are plenty of algorithm courses on the market, but very few aimed at the front end. So I have put together a series on data structures and algorithms tailored for front-end developers, hoping to help you build a complete data structure and algorithm system from 0 to 1.

This is the first article in a planned series of 40-plus. If you want to follow the series, you can subscribe to the public account "Front-end Bottle" and my "Github".

1. Why introduce complexity analysis

We know that good data structures and algorithms can greatly reduce execution time and storage space, but how do we measure that? This section introduces the measurement tool for algorithm performance: complexity analysis.

The simplest, most intuitive way to judge how efficiently a piece of code runs is to execute it on a machine: the execution time and memory footprint fall out naturally. So why introduce complexity analysis at all?

Mainly because measuring an algorithm's performance by actually running the code has serious limitations: the results are easily affected by the test environment and by the data:

  • Results depend on the test environment: the same code can produce very different timings on different systems and processors
  • Results depend on the data itself and its size: different inputs, and inputs of different lengths, can dramatically change the results
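A minimal sketch of this run-and-measure approach, using Node's `Date.now()` (the timer choice and the `sumUpTo` function are illustrative assumptions, not part of the original article):

```javascript
// Naive measurement: run the code and time it.
// The measured value depends on the machine, the runtime, and the input,
// so it cannot serve as a stable measure of the algorithm itself.
function sumUpTo(n) {
    let sum = 0;
    for (let i = 0; i <= n; i++) {
        sum += i;
    }
    return sum;
}

const start = Date.now();
const result = sumUpTo(1000000);
const elapsed = Date.now() - start;
console.log(`result = ${result}, took ${elapsed} ms`); // elapsed varies per machine
```

Run this on two different machines (or even twice on the same machine) and the timings will differ, which is exactly the limitation described above.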

That is not what we want. What we need is a rough way to estimate an algorithm's efficiency that does not depend on external factors. This is what complexity analysis is for.

2. How to express complexity

How do we express an algorithm's complexity, specifically the time the code takes to run and the storage space it consumes? Consider:

function cal(n) {
    let sum = 0; // 1 unit_time
    let i = 0; // 1 unit_time
    for(; i <= n; i++) { // n unit_time
        sum += i; // n unit_time
    }
    return sum
}

From the CPU's point of view, every line of code just reads, writes, or manipulates data. Although the count and cost of each operation differ, we approximate the cost of every operation as one unit_time.

The total running time here is T(n) = (2n + 2) * unit_time: 2 unit_time for the two initializations, plus 2n unit_time for the loop. More generally, we can write:

T(n) = O(f(n))

Where:

  • n: the size of the data
  • f(n): the total number of times each line of code is executed
  • O: indicates that the execution time T(n) is proportional to f(n)

When n is large, say 10,000 or more, the constants and coefficients barely matter, so for the example above T(n) = O(2n + 2) can simply be written as T(n) = O(n).

This is big O time complexity notation. Big O time complexity does not represent the real execution time of the code; it describes the trend of execution time as the data size grows. For this reason it is also called asymptotic time complexity, or simply time complexity.

3. Time complexity

As n approaches infinity, T(n) is dominated by the highest-order term of n; the constants, low-order terms, and coefficients in f(n) hardly matter. So when we analyze the time complexity of a piece of code, we only need to focus on the part that executes the most times.

Look at an example:

function fun1(n) {
    let sum = 0, i = 0;
    for (; i <= n; i++) {
        sum += i;
    }
    return sum;
}
function fun2(n) {
    let sum = 0, i = 0, j = 0;
    for (; i <= n; i++) { // loop 1
        sum += i;
    }
    for (i = 0; i <= n; i++) { // loop 2
        for (j = 0; j <= n; j++) {
            sum += i * j;
        }
    }
    return sum;
}
function fun3(n) {
    let sum = 0, i = 0;
    for (; i <= n; i++) {
        sum += fun1(i); // each call to fun1 is itself O(n)
    }
    return sum;
}
Copy the code

In fun1, the first line is a constant-time initialization with little influence as n grows, so the total time complexity is determined by the loop that follows; that is, the time complexity is O(n).

In fun2, the time complexity of loop 1 is O(n), and that of loop 2 is O(n²). As n approaches infinity, the O(n²) term dominates, so the overall time complexity of the code above is O(n²).

The time complexity of fun3 is O(n * T(fun1)) = O(n * n), that is, O(n²).

In general: T(c + n) = O(n); T(m) + T(n) = O(max(m, n)); T1(m) * T2(n) = O(m * n), where c is a constant.
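These addition and multiplication rules can be sketched by counting basic operations instead of timing them (a minimal illustration; `sequential` and `nested` are made-up names, not from the original article):

```javascript
// Addition rule: sequential pieces add up, and the dominant term wins.
function sequential(n, m) {
    let steps = 0;
    for (let i = 0; i < n; i++) steps++; // O(n)
    for (let j = 0; j < m; j++) steps++; // O(m)
    return steps; // n + m operations => O(max(n, m)) overall
}

// Multiplication rule: a loop body that is itself O(m) gives O(n * m).
function nested(n, m) {
    let steps = 0;
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < m; j++) steps++; // inner loop is O(m), run n times
    }
    return steps; // n * m operations => O(n * m)
}
```

For example, `sequential(3, 5)` performs 8 basic operations while `nested(3, 5)` performs 15: the counts grow exactly as the rules predict.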

Common complexities (in increasing order of magnitude)

Polynomial orders:

  • Constant order O(1): if the code contains no loops or recursion, its time complexity is O(1), even if it runs to thousands of lines.
  • Logarithmic order O(logn): a brief illustration:
let i=1;
while (i <= n)  {
  i = i * 2;
}
  • On each iteration, i is multiplied by 2 until i > n; that is, i takes the values 2⁰, 2¹, 2², …, 2ᵏ, …, 2ˣ. The total number of iterations x therefore satisfies 2ˣ = n, i.e. x = log₂n, and the time complexity is O(log₂n). The base 2 here could be any other constant k and the complexity would be the same, since O(log₃n) = O(log₃2 × log₂n) = O(log₂n).
  • Linear order: O(n)
  • Linearithmic order: O(nlogn)
  • Square, cubic, …, k-th power order: O(n²), O(n³), …, O(nᵏ)

Non-polynomial orders:

  • Exponential order: O(2ⁿ)
  • Factorial order: O(n!)
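As a concrete O(logn) example to contrast with the doubling loop above, binary search halves its search range on every step (a minimal sketch; it assumes the input array is sorted in ascending order):

```javascript
// Each iteration halves the range [low, high], so at most
// about log2(n) + 1 iterations run: time complexity O(logn).
function binarySearch(sortedArr, target) {
    let low = 0;
    let high = sortedArr.length - 1;
    while (low <= high) {
        const mid = low + Math.floor((high - low) / 2);
        if (sortedArr[mid] === target) return mid;
        if (sortedArr[mid] < target) low = mid + 1;
        else high = mid - 1;
    }
    return -1; // not found
}
```

For an array of 1,000,000 sorted elements, this needs at most about 20 comparisons instead of up to 1,000,000 for a linear scan.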

4. Space complexity

Time complexity describes how an algorithm's execution time grows with data size. By analogy, space complexity describes how an algorithm's storage space grows with data size. For example:

function fun(n) {
    let a = [];
    for (let i = 0; i < n; i++) {
        a.push(i);
    }
    return a;
}

From the code above we can clearly see that the space used is O(1 + n) = O(n): the storage occupied by the variable i and the array a.

In general, space complexity analysis is much simpler than time complexity analysis.
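For contrast with the O(n)-space example above, a version that accumulates into a single variable needs only O(1) extra space (a minimal sketch; the function names `collect` and `total` are illustrative):

```javascript
// O(n) extra space: the result array grows with n.
function collect(n) {
    const a = [];
    for (let i = 0; i < n; i++) a.push(i);
    return a;
}

// O(1) extra space: a fixed number of variables, regardless of n.
function total(n) {
    let sum = 0;
    for (let i = 0; i < n; i++) sum += i;
    return sum;
}
```

Both functions loop n times, so their time complexity is the same O(n); only their storage growth differs.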

5. Best, worst, and average time complexity

Time complexity is also affected by the data itself, and can be divided into:

  • Best-case time complexity: the time complexity of running the code on the most favorable input
  • Worst-case time complexity: the time complexity of running the code on the least favorable input
  • Average-case time complexity: the average over all possible inputs, with coefficients, low-order terms, and constants omitted
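A linear search makes the three cases concrete (a minimal sketch; `indexOfTarget` is an illustrative name): finding the target at index 0 is the best case, O(1); not finding it at all is the worst case, O(n); averaged over all positions, the expected cost is about n/2 comparisons, which is still O(n).

```javascript
// Best case: target at index 0 -> 1 comparison, O(1).
// Worst case: target absent -> n comparisons, O(n).
// Average case: target equally likely at any index -> ~n/2 comparisons, O(n).
function indexOfTarget(arr, target) {
    for (let i = 0; i < arr.length; i++) {
        if (arr[i] === target) return i;
    }
    return -1;
}
```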

6. References

Geek Time: "The Beauty of Data Structures and Algorithms"

"Learning JavaScript Data Structures and Algorithms"

7. Meet more front-end friends and advance together

The first session of the front-end algorithm training camp is free 🎉🎉🎉. Yes, free!

Here you can work through front-end algorithms with like-minded friends and build a complete data structure and algorithm system from 0 to 1.

Scan the QR code to join the front-end algorithm exchange group. If the QR code has expired, reply "algorithm" in the public account "Front-end Bottle".

Finally, a recommended front-end practice tool: the question bank that interviewers are using; scan the code to start practicing.