Contents

  • Basic concepts of algorithms

  • Measures of algorithm efficiency

  • Examples of time complexity analysis

First, basic concepts of algorithms

1. Definition

Algorithm: A description of the steps to solve a particular problem. It is a finite sequence of instructions, each of which represents one or more operations.

2. Five characteristics of an algorithm

  • Finiteness. An algorithm must terminate after a finite number of steps, and each step must be completable in finite time.

  • Definiteness. Each instruction in the algorithm must have a definite meaning, so that the same input always produces the same output.

  • Feasibility. Every operation described in the algorithm can be realized by performing already-implemented basic operations a finite number of times.

  • Input. An algorithm has zero or more inputs, drawn from a particular set of objects.

  • Output. An algorithm has one or more outputs, which are quantities that bear a specific relationship to the inputs.

3. The four goals of a “good algorithm”

  • Correctness. The algorithm should solve the given problem correctly.

  • Readability. The algorithm should be easy to read, so that people can understand it.

  • Robustness. When invalid data is input, the algorithm should react or handle it appropriately instead of producing inexplicable output.

  • Efficiency and low storage requirements. Efficiency refers to the execution time of the algorithm; the storage requirement refers to the maximum storage space needed while the algorithm runs. Both depend on the size of the problem.


Second, the measurement of algorithm efficiency

Algorithm efficiency is measured in terms of time complexity and space complexity.

1. Time complexity

Note: In practice, it is generally sufficient to analyze the time complexity of the statements in the deepest loop.

Problem size n: the parameter n that has the main influence on the algorithm's time (or space) requirements.

Frequency: The number of times a statement is executed repeatedly in an algorithm.

T(n): the sum of the frequencies of all statements in the algorithm, written T(n); it is a function of the problem size n. Time complexity is analyzed mainly through the order of magnitude of T(n).

Asymptotic time complexity O(f(n)): the order of magnitude of T(n), written T(n) = O(f(n)); mathematically, T(n) and f(n) are infinities of the same order as n grows.
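
As a small worked illustration of these definitions (my own example, not from the original; the function name sum_first_n is made up), count the frequency of each statement in a simple summation loop:

// Hypothetical illustration: statement frequencies for a simple summation loop.
int sum_first_n(int n) {
	int sum = 0;                   // executed 1 time
	for (int i = 0; i < n; i++) {  // "i = 0" runs once, the test "i < n" runs n + 1 times, "i++" runs n times
		sum += i;                  // executed n times
	}
	return sum;                    // executed 1 time
}
// T(n) = 1 + 1 + (n + 1) + n + n + 1 = 3n + 4, so the order of magnitude is T(n) = O(n).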

Worst-case time complexity: the time complexity in the worst case.

Average time complexity: the expected running time of the algorithm over all possible input instances, assuming each occurs with equal probability.

Best-case time complexity: the time complexity in the best case.

In general, the worst-case time complexity is considered first, to guarantee that the algorithm's running time never exceeds this bound.
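
To make worst-case, average, and best-case complexity concrete, here is a minimal sketch (my own; the function name linear_search is made up) of linear search over an array:

// Hypothetical sketch: linear search, whose cost depends on where (or whether) the key occurs.
int linear_search(const int a[], int n, int key) {
	for (int i = 0; i < n; i++) {
		if (a[i] == key) {
			return i;   // best case: key == a[0], one comparison, O(1)
		}
	}
	return -1;          // worst case: key not present, n comparisons, O(n)
}
// If every position is equally likely, the average is about n / 2 comparisons, which is still O(n).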

2. Two rules for analyzing the time complexity of a program (a short sketch follows the list)

  • Addition rule.

    T(n) = T1(n) + T2(n) = O(f(n)) + O(g(n)) = O(max(f(n),g(n)));

  • Multiplication rule.

    T(n) = T1(n) * T2(n) = O(f(n)) * O(g(n)) = O(f(n) * g(n));
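
A minimal sketch of both rules (my own illustration; the function and array names are made up). Two parts executed in sequence add, and nested loops multiply:

// Hypothetical sketch of the addition and multiplication rules (C99, using variable-length array parameters).
void fill(int n, int a[n], int b[n][n]) {
	// Part 1: a single loop, O(n).
	for (int i = 0; i < n; i++) {
		a[i] = 0;
	}
	// Part 2: nested loops; the multiplication rule gives O(n) * O(n) = O(n²).
	for (int i = 0; i < n; i++) {
		for (int j = 0; j < n; j++) {
			b[i][j] = a[i] + a[j];
		}
	}
	// Addition rule for the two parts in sequence: O(n) + O(n²) = O(max(n, n²)) = O(n²).
}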

3. Common asymptotic time complexities

Mnemonic: constant, logarithmic, power, exponential, in increasing order of growth.

O(1) < O(㏒₂n) < O(n) < O(n㏒₂n) < O(n²) < O(n³) < O(2ⁿ) < O(n!) < O(nⁿ)
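
As a quick numeric check (my own illustration, not from the original), the sketch below prints several of these growth functions at n = 10; the gap between neighbouring classes widens rapidly as n grows.

// Hypothetical sketch: compare growth functions at n = 10 (compile with -lm).
#include <math.h>
#include <stdio.h>

int main(void) {
	double n = 10.0;
	printf("log2 n   = %.1f\n", log2(n));     // about 3.3
	printf("n        = %.0f\n", n);           // 10
	printf("n log2 n = %.1f\n", n * log2(n)); // about 33.2
	printf("n^2      = %.0f\n", pow(n, 2));   // 100
	printf("n^3      = %.0f\n", pow(n, 3));   // 1000
	printf("2^n      = %.0f\n", pow(2, n));   // 1024
	return 0;
}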

4. Space complexity (analyzed in a similar way to time complexity)

The space complexity S(n) of an algorithm is defined as the storage space consumed by the algorithm; it is a function of the problem size n.

S(n)=O(g(n))

In addition to storage space for instructions, constants, variables, and input data, a program also needs some working storage for manipulating data and some auxiliary space for holding the information required by the computation. If the space occupied by the input data depends only on the problem itself and not on the algorithm, then only the auxiliary space beyond the input and the program needs to be analyzed.

An algorithm is said to work in place if the auxiliary space it requires is constant, i.e. O(1).
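
A minimal sketch of an in-place algorithm (my own; the function name reverse_in_place is made up): reversing an array using only a constant number of extra variables, so the auxiliary space is O(1).

// Hypothetical sketch: reverse an array in place with O(1) auxiliary space.
void reverse_in_place(int a[], int n) {
	int left = 0, right = n - 1;
	while (left < right) {
		int tmp = a[left];   // a single temporary variable is the only extra storage
		a[left] = a[right];
		a[right] = tmp;
		left++;
		right--;
	}
}
// Copying the elements into a second array of length n instead would need O(n) auxiliary space.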


Third, examples of time complexity analysis

1. Constant order O(1)

// The number of loop iterations is a fixed constant, independent of any problem size n, so T(n) = O(1).
int x = 90, y = 100;
while (y > 0) {
	if (x > 100) {
		x = x - 10;
		y--;
	} else {
		x++;
	}
}

2. Multiplication rule O(mn)

// The inner statement executes n × m times, so by the multiplication rule T(n) = O(mn).
for (int i = 0; i < n; i++) {
	for (int j = 0; j < m; j++) {
		a[i][j] = 0;
	}
}

3. Square order O(n²)

// The statement s++ executes n × n times, so T(n) = O(n²).
s = 0;
for (int i = 0; i < n; i++) {
	for (int j = 0; j < n; j++) {
		s++;
	}
}

4. Logarithmic order O(㏒₃n)

  • Derivation

    For the loop shown below, suppose the while statement executes t times. Then:

    i = 1, 3, 9, 27, 81, …

    t = 0, 1, 2, 3, 4, …

    so after t iterations i = 3^t. The loop stops once i > n, i.e. 3^t > n, which gives t > ㏒₃n.

    Therefore T(n) = O(t) = O(㏒₃n). (Logarithms of different bases differ only by a constant factor, so this is the same order as O(㏒₂n).)

i = 1;
while (i <= n) {
	i = i * 3;	// i takes the values 1, 3, 9, 27, ..., i.e. 3^t after t iterations
}