Sorting algorithm
Introduction
Sorting (Sort Algorithm) is the process of arranging a group of data into a specified order.
Classification of sorting
1) Internal sorting:
All the data to be processed is loaded into internal storage (memory) and sorted there.
2) External sorting:
The amount of data is too large to load into memory all at once, so external storage (files, etc.) must be used during sorting.
3) Common sorting algorithm classification (see figure below):
Time complexity of the algorithm
1. Two ways to measure the execution time of a program (algorithm)
1) Post-hoc measurement
This method works, but it has two problems: first, to evaluate an algorithm's performance you must actually implement and run the program; second, the measured time depends on the computer's hardware, software, and other environmental factors, so a fair comparison requires running both algorithms on the same machine under the same conditions.
2) Ex-ante estimation
By analyzing an algorithm's time complexity, we can judge which algorithm is better without running it.
2. Time frequency
Basic introduction
Time frequency: the time an algorithm takes is directly proportional to the number of statements executed in it; whichever algorithm executes more statements takes more time. The number of statements executed in an algorithm is called its statement frequency or time frequency, denoted T(n).
- Example – basic case
For example, to calculate the sum of all numbers from 1 to 100, we designed two algorithms:
```java
// Algorithm 1: sum with a for loop, T(n) = n + 1
int total = 0;
int end = 100;
for (int i = 1; i <= end; i++) {
    total += i;
}

// Algorithm 2: use the formula (1+end)*end/2, T(n) = 1
int total2 = (1 + end) * end / 2;
```
- For example – ignore the constant term
Conclusion:
- 2n+20 and 2n: as n grows, the two execution curves become infinitely close, so the constant 20 can be ignored.
- 3n+10 and 3n: as n grows, the two execution curves become infinitely close, so the constant 10 can be ignored.
- For example – Ignore lower order terms
Conclusion:
- 2n²+3n+10 and 2n²: as n grows, the two execution curves become infinitely close, so 3n+10 can be ignored.
- n²+5n+20 and n²: as n grows, the two execution curves become infinitely close, so 5n+20 can be ignored.
- For example – ignore coefficients
Conclusion:
- As n grows, the execution curves of 5n²+7n and 3n²+2n coincide in shape, indicating that the coefficients 5 and 3 can be ignored.
- With n³+5n and 6n³+4n, however, the execution curves separate, showing that the power of n is the critical factor (see the demo after this list).
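To make these conclusions concrete, here is a small demo program I added (it is not part of the original text) that prints the ratio of each pair of functions as n grows:

```java
public class GrowthDemo {
    public static void main(String[] args) {
        for (long n = 10; n <= 1_000_000; n *= 10) {
            // 2n+20 vs 2n: the constant 20 vanishes
            double constant = (2.0 * n + 20) / (2.0 * n);
            // 2n²+3n+10 vs 2n²: the lower-order terms vanish
            double lowOrder = (2.0 * n * n + 3 * n + 10) / (2.0 * n * n);
            // 5n²+7n vs 3n²+2n: the ratio tends to the constant 5/3
            double coeff = (5.0 * n * n + 7 * n) / (3.0 * n * n + 2 * n);
            System.out.printf("n=%-9d %.4f  %.4f  %.4f%n", n, constant, lowOrder, coeff);
        }
    }
}
```

The first two ratios approach 1, so the constant and lower-order terms can be dropped; the third approaches the constant 5/3, confirming that 5n²+7n and 3n²+2n share the same order of magnitude, O(n²).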
Time complexity
1) In general, the number of times the basic operation statements in an algorithm are executed is a function of the problem size n, denoted T(n). If there is some auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be a function of the same order of magnitude as T(n), written T(n) = O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.
2) Different T(n) can have the same time complexity. For example, T(n) = n²+7n+6 and T(n) = 3n²+2n+2 are different functions, but both have time complexity O(n²).
3) Method for calculating time complexity:
- Replace all additive constants in the running time with the constant 1: T(n) = n²+7n+6 → T(n) = n²+7n+1
- In the modified running-time function, keep only the highest-order term: T(n) = n²+7n+1 → T(n) = n²
- Remove the coefficient of the highest-order term: T(n) = n² → O(n²)
Common time complexity
1) Constant order O(1)
2) Logarithmic order O(log2n)
3) Linear order O(n)
4) Linear logarithmic order O(nlog2n)
5) Square order O(n^2)
6) Cubic order O(n^3)
7) k-th power order O(n^k)
8) Exponential order O(2^n)
Graph of the common time complexities
Description:
- The time complexity of common algorithms, from smallest to largest: O(1) < O(log2n) < O(n) < O(nlog2n) < O(n^2) < O(n^3) < O(n^k) < O(2^n)
- As can be seen from the figure, we should avoid exponential-time algorithms whenever possible (see the sketch after this list).
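As a sketch of how fast exponential cost grows (the example is mine, not from the original), the naive recursive Fibonacci function below makes on the order of 2^n calls, so it quickly becomes impractical as n grows:

```java
// Naive recursive Fibonacci: roughly O(2^n), since each call spawns two more.
static long fib(int n) {
    if (n <= 1) return n;           // base cases: fib(0) = 0, fib(1) = 1
    return fib(n - 1) + fib(n - 2); // two recursive calls per level
}
```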
1) Constant order O(1)
No matter how many lines of code are executed, as long as there are no loops or other structures whose cost grows with the input, the time complexity of the code is O(1).
```java
int i = 1;
int j = 2;
++i;
j++;
int m = i + j;
```
Note: when the above code executes, its running time does not grow with any variable, so no matter how long the code is, even tens of thousands of lines, its time complexity can still be expressed as O(1).
2) Logarithmic order O(log2n)
```java
int i = 1;
while (i < n) {
    i = i * 2;
}
```
Explanation: in the while loop, i is multiplied by 2 on each pass, so i gets closer and closer to n. Suppose the loop ends after x iterations, when i first becomes greater than or equal to n; then 2^x = n, that is, x = log2n. In other words, the code finishes after about log2n iterations, so its time complexity is O(log2n). The base 2 comes from the code itself: if the statement were i = i * 3, the complexity would be O(log3n), as sketched below.
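For instance, a variant I added with the multiplier changed to 3:

```java
int i = 1;
while (i < n) {
    i = i * 3; // i triples each pass, so the loop runs about log3(n) times: O(log3n)
}
```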
3) Linear order O(n)
```java
for (int i = 1; i <= n; ++i) {
    int j = i;
    j++;
}
```
Note: the code inside the for loop executes n times, so the time it consumes varies with n; this kind of code can use O(n) to express its time complexity.
4) Linear logarithmic order O(nlogn)
```java
for (int m = 1; m < n; m++) {
    int i = 1;
    while (i < n) {
        i = i * 2;
    }
}
```
Linear logarithmic order O(nlogn) is easy to understand: take code whose time complexity is O(logn) and loop it n times; the overall time complexity is n * O(logn), that is, O(nlogn).
5) Square order O(n^2)
```java
for (int x = 1; x <= n; x++) {
    for (int i = 1; i <= n; i++) {
        int j = i;
        j++;
    }
}
```
Explanation: if O(n) code is nested inside another loop of n, the time complexity is O(n^2). This code nests two loops that each run n times, so its time complexity is O(n * n), i.e. O(n^2). If the bound of one of the loops is changed to m, the time complexity becomes O(m * n), as sketched below.
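Here is a minimal sketch of that O(m*n) variant; m is a second problem size I introduced for illustration:

```java
for (int x = 1; x <= m; x++) {      // outer loop runs m times
    for (int i = 1; i <= n; i++) {  // inner loop runs n times
        int j = i;
        j++;
    }
}
// the inner body executes m * n times in total: O(m*n)
```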
6) Cubic order O(n^3), k-th power order O(n^k)
Explanation: by analogy with O(n^2) above, O(n^3) corresponds to three nested loops of n (see the sketch below), and O(n^k) to k levels of nesting.
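A minimal sketch of the three-level nesting behind O(n^3), added for illustration:

```java
for (int x = 1; x <= n; x++) {
    for (int y = 1; y <= n; y++) {
        for (int i = 1; i <= n; i++) {
            int j = i; // innermost body runs n * n * n times: O(n^3)
            j++;
        }
    }
}
```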
Average time complexity and worst time complexity
1) The average time complexity is the running time of the algorithm when all possible input instances appear with equal probability.
2) The time complexity in the worst case is called the worst-case time complexity. The time complexity discussed above is generally the worst-case time complexity, because the worst case bounds the running time on any input instance: it guarantees the algorithm never runs longer than that.
3) Whether the average time complexity agrees with the worst-case time complexity depends on the algorithm (as shown below); linear search, sketched next, is a simple case.
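A simple illustration, my example rather than the original's: in linear search, if the target is equally likely to be at any of the n positions, the average cost is about n/2 comparisons, while the worst case (target at the end, or absent) is n comparisons; both are O(n). For quicksort, by contrast, the average time complexity O(nlog2n) genuinely differs from the worst case O(n^2).

```java
// Returns the index of target in arr, or -1 if it is absent.
// Best case: 1 comparison; average: about n/2; worst: n. All are O(n).
static int linearSearch(int[] arr, int target) {
    for (int i = 0; i < arr.length; i++) {
        if (arr[i] == target) return i;
    }
    return -1;
}
```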
Space complexity of the algorithm
Basic introduction
- Similar to the discussion of time complexity, the space complexity of an algorithm is defined as the amount of storage space it consumes as a function of the problem size n.
- Space complexity measures the storage space an algorithm temporarily occupies while running. For some algorithms, the number of temporary working units needed depends on the problem size n: it grows as n grows, and when n is large these algorithms occupy many storage units; quicksort, merge sort, and radix sort are examples.
- When we do algorithm analysis, we mainly discuss time complexity; from the perspective of user experience, the speed of program execution matters more. Some caching products (Redis, memcache) and algorithms (radix sort) essentially trade space for time (see the sketch below).
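A minimal sketch contrasting the two cases (my illustration, not from the original): summing 1..n iteratively needs a fixed number of variables, so its space complexity is O(1), while copying an n-element array needs storage proportional to n, so its space complexity is O(n).

```java
// O(1) space: the number of variables does not depend on n.
static long sum(int n) {
    long total = 0;
    for (int i = 1; i <= n; i++) total += i;
    return total;
}

// O(n) space: the temporary array grows with the input size.
static int[] copyOf(int[] arr) {
    int[] result = new int[arr.length];
    for (int i = 0; i < arr.length; i++) result[i] = arr[i];
    return result;
}
```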