This article is based on https://zhuanlan.zhihu.com/p/50479555, rewritten in my own words as a set of notes. Please click the link to support the original author.

An algorithm is a set of methods used to manipulate data and solve specific problems. In other words, it produces the required output for a given input within a finite amount of time. Different algorithms may accomplish the same task with different amounts of time and space, and therefore with different efficiency. The quality of an algorithm can thus be measured along two dimensions: space and time.

  • Time dimension: the time it takes to execute the algorithm, usually described by its “time complexity”.
  • Space dimension: the amount of memory the algorithm requires, usually described by its “space complexity”.

1. Time complexity

Obviously, running an algorithm once tells you how long it takes, but this approach is heavily influenced by the runtime environment, the performance of the machine, and the size of the test data. Moreover, while we are still writing the algorithm, we cannot run it at all.

Thus, another, more general method was developed: “Big O notation”, that is, T(n) = O(f(n)).

To calculate the time complexity, take a look at this example:

for i in range(n):   # executed once: 1 unit of time
	j = i            # runs n times: n units of time
	j += 1           # runs n times: n units of time

Assuming every line of code takes the same amount of time to execute, call it one “unit of time”, the total time is: 1 unit + n units + n units = (1 + 2n) units, i.e. T(n) = (1 + 2n) * unit of time. We can see that the running time grows with n, and when n approaches infinity the constant 1 and the coefficient 2 become insignificant. Therefore, the time complexity of this algorithm can be simplified to T(n) = O(n).
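As a quick sanity check, here is a hypothetical helper (not part of the original article) that counts the “units of time” by hand and confirms the total matches 1 + 2n:

def count_units(n):
	units = 1            # the loop setup, counted once
	for i in range(n):
		units += 1       # j = i
		units += 1       # j += 1
	return units

print(count_units(10))   # 21, which equals 1 + 2*10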

Common time complexity levels are:

  • Constant order O(1)
  • Logarithmic order O(logN)
  • Linear order O(n)
  • Linear logarithmic order O(nlogN)
  • Square order O(n²)
  • Cubic order O(n³)
  • K-th power order O(n^k)
  • Exponential order O(2^n)

From top to bottom, the time complexity increases and the algorithm becomes less and less efficient.
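As a rough illustration, here are some hypothetical snippets (not from the original article) showing what code at a few of these levels typically looks like:

def constant(n):             # O(1): the work does not depend on n
	return n + 1

def linear(n):               # O(n): a single pass over n items
	total = 0
	for i in range(n):
		total += i
	return total

def quadratic(n):            # O(n²): one loop nested inside another
	count = 0
	for i in range(n):
		for j in range(n):
			count += 1
	return count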

Let’s explain a few of the more commonly used ones:

1. Logarithmic order O(logN)
i = 1
while i < n:
	i = i * 2       # i doubles on every iteration

Suppose the loop ends after x iterations, when i has grown to at least n. Since i doubles each time, 2^x = n, i.e. x = log₂n. In other words, the loop body runs log₂n times, so the time complexity of the above code is O(logN).
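To make the x = log₂n reasoning concrete, here is a small hypothetical check (an assumed helper, not from the original article) that compares the actual number of doublings with log₂n:

import math

def count_doublings(n):
	i, steps = 1, 0
	while i < n:
		i = i * 2            # the same doubling loop as above
		steps += 1
	return steps

print(count_doublings(16), math.log2(16))   # 4 4.0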

2. Linear logarithmic order O(nlogN)

Linear logarithmic order O(nlogN) is actually very easy to understand: if we repeat a piece of O(logN) code n times, the total cost is n * O(logN), which is O(nlogN).

for m in range(n):          # the outer loop runs n times
	i = 1
	while i < n:            # the inner loop is the O(logN) code from above
		i = i * 2
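As a rough check of the counting argument (a hypothetical helper, not from the original article), the inner loop performs about log₂n doublings for each of the n outer iterations, so the total is roughly n * log₂n:

import math

def count_total_steps(n):
	steps = 0
	for m in range(n):       # outer loop: n iterations
		i = 1
		while i < n:         # inner loop: about log2(n) doublings
			i = i * 2
			steps += 1
	return steps

n = 16
print(count_total_steps(n), n * math.log2(n))   # 64 64.0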

2. Space complexity

Like time complexity, space complexity is a measure of the amount of storage space an algorithm temporarily occupies while it runs. It also describes a trend and is denoted S(n).

The most commonly used space complexities are O(1), O(n), and O(n²).

1. Space complexity O(1)

If the temporary space required by an algorithm does not change with the size of the variable n, i.e. the space it needs is a constant, its space complexity can be expressed as O(1).

i = 1          # a fixed number of variables are created
j = 2
i += 1
j += 1
m = i + j      # no additional space is needed regardless of the amount of data processed

The space allocated for i, j, and m in this code does not change with the amount of data being processed, so its space complexity is S(n) = O(1).
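For comparison, here is a hypothetical sketch (not from the original article) of O(n) space complexity, one of the other common cases listed above: the memory used grows in proportion to n.

def build_list(n):
	result = []              # this list grows with n
	for i in range(n):
		result.append(i)     # n elements end up stored, so S(n) = O(n)
	return result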

Next, we will go through data structures by category and work through Python algorithm implementations, noting the time and space complexity after each algorithm. You are welcome to follow along and point out any mistakes.