1. Time complexity of the algorithm

  • In general, the number of times the basic operation statement in an algorithm is executed is a function of the problem size n, written T(n). If there is some auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a constant not equal to zero, then f(n) is said to be a function of the same order of magnitude as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short
  • Different T(n) can have the same time complexity. For example, T(n) = n² + 7n + 6 and T(n) = 2n² + 2n + 3 are different functions, but both have the same time complexity, O(n²)
  • A method for calculating time complexity (restated in LaTeX after this list)
    • Replace all additive constants in the run-time function with the constant 1: T(n) = 3n²+7n+6 => T(n) = 3n²+7n+1
    • In the modified run-time function, keep only the highest-order term: T(n) = 3n²+7n+1 => T(n) = 3n²
    • Drop the coefficient of the highest-order term: T(n) = 3n² => n² => O(n²)
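
A minimal LaTeX restatement of the definition and the three simplification steps above (no content beyond the source's own example):

```latex
% T(n) = O(f(n)) iff the ratio tends to a nonzero constant:
\[ \lim_{n \to \infty} \frac{T(n)}{f(n)} = c \neq 0 \]
% Worked example from the three steps above, T(n) = 3n^2 + 7n + 6:
%   step 1 (additive constants -> 1):  3n^2 + 7n + 1
%   step 2 (keep highest-order term):  3n^2
%   step 3 (drop the coefficient):     n^2, so T(n) = O(n^2)
```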

2.1 Two ways to measure the execution time of a program (algorithm)

  • 1) Post-statistical method (run first and then see the time)

    This method is feasible, but it has two problems: first, to evaluate the performance of the designed algorithm, the program must actually be run; second, the timing statistics obtained depend on the computer's hardware, software, and other environmental factors, so comparing which algorithm is faster requires running both on the same computer in the same state

  • 2) A method of prior estimation (common)

    Determine which algorithm is better by analyzing each algorithm's time complexity

2.2 Time frequency

The time an algorithm takes is directly proportional to the number of statements executed in it; whichever algorithm executes more statements takes more time. ==The number of statements executed in an algorithm is called the statement frequency or time frequency==, denoted T(n). For example, to calculate the sum of all numbers from 1 to 100, two algorithms can be designed:

```java
// time frequency T(n) = n+1 (here n = 100)
int total = 0;
for (int i = 1; i <= 100; i++) {
    total += i;
}

// time frequency T(n) = 1
total = (1 + 100) * 100 / 2; // direct calculation with the summation formula
```

2.2.1 Ignore the constant term

In time complexity analysis, the constant term becomes negligible as n grows large

  • 2n+20 and 2n: as n grows toward infinity, the two execution curves converge, so the constant 20 can be ignored
  • 3n+10 and 3n: as n grows toward infinity, the two execution curves converge, so the constant 10 can be ignored

2.2.2 Ignore lower-order terms

For example, as n grows large, 2n²+3n+10 and 2n² grow at essentially the same rate, so the lower-order terms 3n+10 can be ignored

2.2.3 Ignore coefficients (the original author found this step unintuitive, calling it "metaphysics")

For example, as n grows large, 5n² and 3n² are both of order n², so the coefficients can be ignored

2.3 Common time complexity

Time complexity from small to large: ==O(1) < O(logn) < O(n) < O(nlogn) < O(n²) < O(n³) < O(n^k) < O(2^n)==. As the problem size n increases, the time complexity grows and the algorithm's execution efficiency decreases. Exponential-time algorithms should be avoided whenever possible. (A numerical sketch of this ordering follows the table below.)

| Term | Order | Example count function |
| --- | --- | --- |
| Constant order | O(1) | 12 |
| Logarithmic order | O(logn) | 5log₂n + 20 |
| Linear order | O(n) | 2n + 3 |
| Linear logarithmic order | O(nlogn) | 2n + 3nlog₂n + 19 |
| Square order | O(n²) | 3n² + 5 |
| Cubic order | O(n³) | 6n³ + 2n² + 3n + 4 |
| k-th power order (k > 3 and k ∈ Z) | O(n^k) | |
| Exponential order | O(2^n) | kⁿ, k a constant |
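
A quick numerical illustration of the ordering above (not from the source; a minimal sketch assuming base-2 logarithms):

```java
// Prints log2(n), n*log2(n) and n^2 for a few sizes to show the growth ordering
public class GrowthDemo {
    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000, 10000}) {
            double log2 = Math.log(n) / Math.log(2);
            System.out.printf("n=%6d  log2(n)=%7.1f  n*log2(n)=%10.0f  n^2=%12d%n",
                    n, log2, n * log2, (long) n * n);
        }
    }
}
```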

Common time complexity examples

1) Constant order O(1)

As long as the code contains no loops or other structures whose execution count grows with the problem size, its time complexity is O(1), no matter how many statements it executes.

```java
int i = 1;
int j = 2;
++i;
j++;
int m = i + j;
```

When the above code executes, its running time does not grow with any variable, so no matter how long such code is, even tens of thousands of lines, its time complexity can be expressed as O(1).

2) Logarithmic order O(log₂n)

```java
int i = 1;
while (i < n) {
    i = i * 2;
}
```

A quick review of logarithms: if aˣ = N (a > 0, a ≠ 1), then x is called the logarithm of N to base a, written x = log_a N, where a is the base and N is the argument. In the while loop above, i is multiplied by 2 on every iteration, moving closer and closer to n. Suppose the loop ends after x iterations, when i first reaches n; then 2ˣ = n, so x = log₂n, meaning the code finishes after log₂n iterations. The time complexity of this code is therefore O(log₂n). The base of the logarithm depends on the code: if the statement were i = i*3, the complexity would be O(log₃n).
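
For the base-3 variant just mentioned, a minimal sketch:

```java
// i triples each iteration, so the loop runs about log3(n) times => O(log₃n)
int i = 1;
while (i < n) {
    i = i * 3;
}
```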

3) Linear order O(n)

In the code below, the body of the for loop executes n times, so its running time varies with n. Code like this can be expressed as O(n).

```java
for (int i = 1; i <= n; ++i) {
    j = i;
    j++;
}
```

4) Linear logarithmic order O(nlogn)

If code with time complexity O(logn) is executed n times in a loop, the total time complexity is O(nlogn).

```java
for (int m = 1; m < n; m++) {
    i = 1;
    while (i < n) {
        i = i * 2;
    }
}
```
5) Square order O(n²)

If O(n) code is nested inside another O(n) loop, the time complexity is O(n·n) = O(n²). If the bound of one loop layer is changed from n to m, the time complexity becomes O(m·n); a sketch of that case follows the code below.

```java
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= n; j++) { // the inner loop needs its own variable
        k = j;
        k++;
    }
}
```
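
A minimal sketch of the O(m·n) case (m and n are assumed to be independent bounds; the helper name is hypothetical):

```java
// The inner body runs exactly m * n times => O(m * n)
static long countPairs(int m, int n) {
    long count = 0;
    for (int i = 1; i <= m; i++) {
        for (int j = 1; j <= n; j++) {
            count++; // stands in for constant-time work
        }
    }
    return count; // equals (long) m * n
}
```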
6) Exponential order O(k^n), k a constant, e.g. O(2^n)

The naive recursive Fibonacci below makes two recursive calls per invocation, so the number of calls roughly doubles with each level of recursion, giving exponential time complexity:

```java
int fib(int n) {
    if (n < 2) {
        return n;
    }
    return fib(n - 1) + fib(n - 2);
}
```

Average time complexity and worst time complexity

  • 1) The average time complexity is the running time of the algorithm when all possible input instances occur with equal probability
  • 2) The time complexity in the worst case is called the worst-case time complexity. The time complexity discussed is generally the worst-case time complexity. The reason is that the worst-case time complexity is an upper bound on the algorithm's running time over any input, which guarantees that the algorithm will never run longer than the worst case
  • 3) Whether the average time complexity agrees with the worst-case time complexity depends on the algorithm (see figure)

2. Space complexity of the algorithm

  • Similar to the discussion of time complexity, the space complexity of an algorithm is defined as the amount of storage the algorithm consumes as a function of the problem size n
  • Space complexity measures the storage space an algorithm temporarily occupies while running. For some algorithms, the number of temporary work units needed depends on the problem size n and grows as n grows; when n is large, many storage units are occupied, as in quicksort and merge sort (see the merge sketch below)
  • Algorithm analysis ==mainly discusses time complexity==. From the perspective of user experience, the speed of program execution is more important. Some caching products (Redis, memcache) and algorithms (radix sort) essentially trade space for time.
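
A minimal sketch (assumed Java, not the source's code) of why merge sort needs O(n) extra space: its merge step allocates a temporary array proportional to the range being merged.

```java
// Merges two sorted halves a[lo..mid] and a[mid+1..hi].
// The temporary array is what gives merge sort its O(n) space complexity.
static void merge(int[] a, int lo, int mid, int hi) {
    int[] tmp = new int[hi - lo + 1]; // auxiliary storage proportional to n
    int i = lo, j = mid + 1, k = 0;
    while (i <= mid && j <= hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi) tmp[k++] = a[j++];
    System.arraycopy(tmp, 0, a, lo, tmp.length);
}
```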

3. Other terms

  1. Stable: if a is in front of b and a == b, a is still in front of b after sorting (see the sketch after this list)
  2. Unstable: if a is in front of b and a == b, a may come after b after sorting
  3. Internal sort: all sorting operations are done in memory
  4. External sort: the data is too large to fit in memory, so it is placed on disk and sorted via data transfer between disk and memory
  5. Time complexity: the amount of time an algorithm takes to execute
  6. Space complexity: the amount of memory required to run the program
  7. n: the data size
  8. k: the number of buckets
  9. In-place: uses no extra memory
  10. Out-place: uses extra memory
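
A small illustration of stability (assuming Java, whose Arrays.sort for objects is stable):

```java
import java.util.Arrays;
import java.util.Comparator;

public class StableDemo {
    public static void main(String[] args) {
        String[] words = {"bb", "aa", "ab", "ba"};
        // Sort by first letter only: elements with equal keys ("bb"/"ba" and
        // "aa"/"ab") keep their original relative order because the sort is stable.
        Arrays.sort(words, Comparator.comparing((String w) -> w.charAt(0)));
        System.out.println(Arrays.toString(words)); // [aa, ab, bb, ba]
    }
}
```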

4. Addendum (December 1)

For time complexity O(logn): if n is 4, the loop body executes twice (log₂4 = 2); if n is 100, the same analysis gives log₂100 ≈ 6.6, so the body executes about 7 times

First, when writing a program, you must understand its time and space complexity, and develop the habit of subconsciously analyzing them after you finish writing. Second, being able to solve the problem with the simplest possible time and space complexity is basically a prerequisite for a top professional. If a program is written badly, the cost in company machines and resources can increase by orders of magnitude; conversely, well-written code has an equally large positive impact.

www.zhihu.com/question/21…

Beyond that, the remaining work is O(1), constant time

The template for answering algorithm questions has four steps:

  1. First, confirm the meaning of the question with the interviewer
  2. Think of every possible solution and compare their time and space complexities
  3. Find the optimal solution, the one that takes the least time and uses the least memory, and start writing it
  4. Test the result

A more complex case: recursion

For recursion, the key is to work out what the recursive statements are and how many times they execute. Recursive calls nest, so how do we count them? By expanding the recursion in the order the calls occur, we draw a tree structure, called the recursion tree (or state tree) of the recursion's states.

In the recursion tree of Fibonacci you will find a large number of redundant subproblems, which is what causes the exponential growth. So don't write the naive recursion in an interview: either cache the intermediate results (memoization), or simply compute the Fibonacci sequence iteratively by summing up to the nth term, as in the sketch below.
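
A minimal sketch of the iterative alternative (the same fib signature as the recursive version above is assumed):

```java
// Iterative Fibonacci: O(n) time, O(1) space, no redundant computation
int fib(int n) {
    if (n < 2) {
        return n;
    }
    int prev = 0, curr = 1;
    for (int i = 2; i <= n; i++) {
        int next = prev + curr; // each term is the sum of the previous two
        prev = curr;
        curr = next;
    }
    return curr;
}
```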

There are four general recursion scenarios that are used in interviews and engineering:

  • 1. Binary search: it usually arises when a sequence is already sorted and you want to find a target value in it. Because each step continues on only one side, the time complexity ends up O(logn) (see the sketch after this list)
  • 2. Binary tree traversal: the time complexity is O(n). Instead of using the master theorem, the simplified way to think about it is that the traversal visits every node once and only once
  • 3. Binary search in a sorted two-dimensional matrix: the time complexity is O(n) (for a one-dimensional sorted array, binary search takes O(logn))
  • 4. Merge sort: the time complexity is O(nlogn)
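
A minimal iterative binary search sketch (assumed Java; not tied to any code in the source):

```java
// Binary search in a sorted array: each step halves the search range => O(log n)
static int binarySearch(int[] a, int target) {
    int lo = 0, hi = a.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2; // avoids the int overflow of (lo + hi) / 2
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1; // target can only be in the right half
        else hi = mid - 1;                 // target can only be in the left half
    }
    return -1; // not found
}
```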

Consider:

1. Hand-write binary tree traversal (pre-order, in-order, post-order) and analyze the time complexity. A: O(n), where n is the total number of nodes in the binary tree. This can be obtained from the master theorem; or, put more simply, each node in the binary tree is visited once and only once, so the time is linear in the total number of nodes, i.e. O(n). A pre-order sketch follows.
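
A minimal pre-order traversal sketch (the TreeNode type here is hypothetical, for illustration only):

```java
class TreeNode {
    int val;
    TreeNode left, right;

    // Pre-order traversal: visit the root, then the left subtree, then the right.
    // Each node is visited exactly once => O(n).
    static void preOrder(TreeNode node) {
        if (node == null) return;
        System.out.println(node.val); // visit
        preOrder(node.left);
        preOrder(node.right);
    }
}
```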

2. Graph traversal: what is the time complexity? A: O(n), where n is the total number of nodes in the graph.

A: Whether it is DFS (depth-first) or BFS (breadth-first), the time complexity is O(n), where n is the total number of nodes in the search space. A BFS sketch follows.
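
A minimal BFS sketch over an adjacency list (the graph representation is assumed; counting edges as well, the usual bound is O(n + e)):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

class GraphDemo {
    // Breadth-first search: every node is enqueued and dequeued at most once
    static void bfs(List<List<Integer>> adj, int start) {
        boolean[] visited = new boolean[adj.size()];
        Deque<Integer> queue = new ArrayDeque<>();
        visited[start] = true;
        queue.add(start);
        while (!queue.isEmpty()) {
            int node = queue.poll();
            System.out.println(node); // visit
            for (int next : adj.get(node)) {
                if (!visited[next]) {
                    visited[next] = true;
                    queue.add(next);
                }
            }
        }
    }
}
```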

3. Binary search: what is the time complexity? A: O(logn). See the master theorem: en.wikipedia.org/wiki/Master… zh.wikipedia.org/wiki/%E4%B8…

Summary: Deliberately practice configuring your common tools and programming environment, and use your tools to the fullest; "to do a good job, one must first sharpen one's tools." For your programming environment and everyday IDE, build up the basic skills and "programming fingering": search the internet for best practices and top tips, take in one or two good recommendations and strategies, develop this awareness, and at the same time force yourself to deliberately practice with these tools until you are highly proficient with them.