1 – Algorithm
The definition of an algorithm is: an accurate and complete description of how to solve a problem, i.e. a clear, finite set of instructions for solving it. The same algorithm, however, can be described in different ways or “languages.”
2 – Efficiency of the algorithm
Since an algorithm is a description of how to solve a problem, and just as there are a thousand Hamlets in a thousand readers' eyes, there are many different ways to solve the same problem, each consuming time and other resources (for computers, mainly memory) along the way. Faster, better, stronger… in other words, we want the algorithm to be more efficient. So in many cases a good algorithm is one that uses significantly less time, or space (memory), or both, than other algorithms that solve the same problem.
Therefore, the efficiency of an algorithm is mainly evaluated by the following two kinds of complexity:
Time complexity: estimates how long it takes to execute a program, i.e. roughly how much processor time it uses.
Space complexity: estimates the storage required to execute a program, i.e. roughly how much memory it uses.
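As a rough sketch (not from the original text; the function names are illustrative): the first function below only needs a single accumulator no matter how large n is, so its extra space is O(1), while the second stores every intermediate sum in an array of length n, so its extra space grows as O(n).

```js
// O(1) extra space: a single accumulator, regardless of n
function sumUpTo(n) {
  let sum = 0;
  for (let i = 1; i <= n; i++) sum += i;
  return sum;
}

// O(n) extra space: an array whose length grows with n
function prefixSums(n) {
  const sums = [];
  let sum = 0;
  for (let i = 1; i <= n; i++) {
    sum += i;
    sums.push(sum);
  }
  return sums;
}
```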
When designing algorithms, time complexity tends to be a bigger problem than space complexity, so in general we focus on time complexity. In interviews and at work, “complexity” means time complexity unless otherwise stated.
2.0 – Time complexity
Next we need to know another concept: time frequency. At this point you might say, “I thought we were going to learn algorithms together. What is all this extra stuff?” No, no, no, it is not filler; we really do need it.
The time an algorithm takes to execute cannot be calculated exactly in theory; you only really know it by running the program and measuring it. But it is neither possible nor necessary to test every algorithm; it is enough to know roughly which algorithm takes more time to execute and which takes less. The time an algorithm takes is roughly proportional to the number of statements it executes: the more statements an algorithm executes, the more time it takes. We call the number of statements executed in an algorithm its time frequency, usually denoted T(n).
In the time frequency T(n), n represents the size of the problem. As n changes, T(n) changes with it, and to describe the law of this change we introduce the concept of time complexity. In general, the number of times the basic operation is repeated is a function of the problem size n, which is the time frequency T(n). If there is an auxiliary function f(n) such that the limit of T(n)/f(n), as n tends to infinity, is a nonzero constant, then f(n) is a function of the same order of magnitude as T(n). We write T(n) = O(f(n)) and call this the asymptotic time complexity of the algorithm, or simply the time complexity.
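As a small worked illustration (not from the original text): suppose counting statements gives T(n) = 3n + 1. Choosing f(n) = n, the ratio T(n)/f(n) = (3n + 1)/n tends to 3 as n tends to infinity, a nonzero constant, so T(n) = O(f(n)) = O(n).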
2.1 – Big O notation
The notation that expresses the time complexity of an algorithm in the form O(f(n)) is called big O notation.
Usually we evaluate an algorithm directly by its worst-case complexity.
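To make “worst case” concrete, here is a minimal sketch (the function and the sample array are illustrative, not from the original): a linear search finds the target after one comparison in the best case, but needs n comparisons in the worst case, so we describe it as O(n).

```js
// Linear search: best case 1 comparison (target is first),
// worst case n comparisons (target is last or absent) -> O(n)
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i;
  }
  return -1;
}

console.log(linearSearch([3, 1, 4, 1, 5], 5)); // 4 (last element, i.e. the worst case)
```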
The f(n) in the big O notation O(f(n)) can be 1, n, log n, n^2, and so on, so we call O(1), O(n), O(log n), and O(n^2) constant, linear, logarithmic, and quadratic order respectively. Let's look at how to derive big O:
Deriving big O
There are three rules for deriving big O (a worked application follows the list):
- Replace all additive constants in the running time with the constant 1
- Keep only the highest-order term
- Remove the coefficient of the highest-order term
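A quick worked application of the three rules (an illustration, not from the original text): suppose counting statements gives T(n) = 2n^2 + 3n + 5. Rule 1 replaces the additive constant 5 with 1, rule 2 keeps only the highest-order term 2n^2, and rule 3 drops its coefficient, leaving O(n^2).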
Let's look at a few examples:
- Constant order
```js
let sum = 0, n = 10;                // the statement executes once
sum = ((1 + n) * n) / 2;            // the statement executes once
console.log(`The sum is: ${sum}`);  // the statement executes once
```
This code executes 3 statements in total; applying rule 1, the time complexity of this algorithm is O(1), constant order.
- Linear order
```js
let i = 0;                           // the statement executes once
while (i < n) {                      // the condition is evaluated about n times
  console.log(`Current i is ${i}`);  // the statement executes n times
  i++;                               // the statement executes n times
}
```
The code executes roughly 3n + 1 statements in total; applying rules 2 and 3, the time complexity of the algorithm is O(n).
- Logarithmic order
```js
let number = 1;       // the statement executes once
while (number < n) {  // the condition is evaluated about log n times
  number *= 2;        // the statement executes log n times
}
```
In the algorithm above, number is doubled on each pass. Suppose the loop body executes m times; then 2^m = n, i.e. m = log n, so the whole code executes about 1 + 2*log n statements. Thus f(n) = log n, and the time complexity is O(log n).
- Quadratic order
```js
for (let i = 0; i < n; i++) {    // the outer loop executes n times
  for (let j = 0; j < n; j++) {  // the inner loop executes n^2 times in total
    console.log('I am here!');   // the statement executes n^2 times
  }
}
```
In the nested loop above, the code executes roughly 2*n^2 + n statements, so f(n) = n^2 and the time complexity of this algorithm is O(n^2).
Comparison of common time complexities
You have probably already seen the common time complexity functions at university, so no further explanation is given here:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
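To get a feel for how quickly these functions grow, here is a minimal sketch (the factorial helper and the chosen values of n are illustrative, not from the original text):

```js
// Print each common complexity function for a few small values of n
const factorial = (n) => (n <= 1 ? 1 : n * factorial(n - 1));

for (const n of [1, 2, 4, 8, 16]) {
  console.log(
    `n=${n}`,
    `log n=${Math.log2(n)}`,
    `n log n=${n * Math.log2(n)}`,
    `n^2=${n ** 2}`,
    `n^3=${n ** 3}`,
    `2^n=${2 ** n}`,
    `n!=${factorial(n)}`
  );
}
```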