Preface

This opening piece of the series gives an overview of data structures and algorithms. These overview chapters are fairly loose, but they are very important.

Data structures and algorithms are one of the things that distinguish programmers from mere coders. (Of course, I'd argue software engineers sit a level above programmers, too.)

Each chapter in the series is written up only after I've digested the material, as a marker that I've actually understood that part. Ahem — we learn these things purely to apply them. Formal proofs of why some data structure works, divorced from any real scenario, are pointless for me: I'm not going into algorithm research, so why spend that much energy studying proofs? Later articles will therefore lean toward application.

Recommendation list

Not going to dump a whole pile of recommendations — these are all widely acclaimed. I've only read Algorithms myself.

They're probably best read side by side.

  • Getting started: Big Talk Data Structure, Algorithm Diagram
  • Systematic learning: Description of Data Structures and Algorithms
  • The big book: Algorithms (4th Edition)

Basics

If data structures and algorithms are one of the foundations of programming, then the following concepts are the general foundations of data structures and algorithms:

  • Time complexity
  • Space complexity

And their theoretical basis is:

  • Big O notation
  • A couple of little math things

OK, one by one.

Foundation of foundation

Big O notation

Let's look at two functions plotted on the same coordinate system:

[Figure: coordinate plot of the two functions]

As n grows, the difference between them becomes very, very large. In the algorithmic world, this growth trend is represented by big O notation, which describes how y increases as a function of n:

[Figure: graph of this growth trend]

All right, we'll talk about what the y-axis means in a second.

A couple of math things

So big O expressions can be written in different ways depending on the situation. Take a polynomial such as

    y = a·n² + b·n + c

Because the big O representation expresses the growth trend of the y value, if n equals 110 million, it is clear that the determining factor is the a·n² term; the b·n and c terms barely matter by comparison. So the big O form is simply

    O(n²)

What this means is that even though the expression is a polynomial, in big O analysis we only keep the term with the greatest influence on the growth trend. The proof is pretty simple, too: the growth trend is, as the name suggests, the rate of growth, which is the same thing as y over n. Dividing the polynomial by n gives

    y/n = a·n + b + c/n

which, for large n, can be further simplified to a·n. So the biggest influence on the growth trend comes from the first term, a·n².

Common function graph

In the field of algorithms, there are a handful of function curves to master; the usual ones are constant, logarithmic, linear, n log n, quadratic, exponential, and factorial:

[Figure: graphs of the common complexity functions]
Time and space complexity

Algorithm performance is evaluated on two axes: running time and storage footprint. Why these two? My take: under the von Neumann architecture, a computer consists of memory, an arithmetic unit, a control unit, and I/O devices. The I/O devices and the control unit aren't what an algorithm optimizes, which leaves memory and the arithmetic unit. And isn't the point of an algorithm precisely to reduce computation and storage on a machine? Just my humble opinion.

Ok, so that brings up the concept of time complexity and space complexity.

Time complexity

In the big O discussion above, the value on the y-axis was never defined. Assume the machine takes the same amount of time to execute every instruction, say 1 unit of time, so executing c instructions takes c units of time. When analyzing the time complexity of an algorithm, the y-axis of the big O graph represents the time the program takes to execute correctly (equivalently, the number of instructions executed). So: a time complexity of O(n) expresses the growth trend of the program's correct execution time, and the specific function inside the parentheses is one of the curves shown above.

Space complexity

As with time complexity, in space complexity O(n) represents the growth trend of the memory needed to execute the program correctly. In practice it is simpler to analyze than time complexity.

Simple exercise

Let's do an exercise to wrap this up; the rest of the material will be split across the next two installments — any more in one sitting and I'd make myself sick. For example, analyze the time and space complexity of the following program:

void atom(int n) {
	int arr[n];
	int i;

	for (i = 0; i < n; i++) {
		arr[i] = i;
	}
}

The time complexity: no matter how big n is, lines 2 and 3 each execute once; the loop check on line 5 executes n times, and line 6 executes n times, so the total is n + n + 1 + 1 = 2n + 2. As we learned above, we keep only the term with the most impact, 2n, then drop the constant factor to reduce it to one of the basic functions, n. So:

The time complexity is O(n).

The space complexity: 1 unit for i, and n units for the arr array, so finally:

Space complexity: O(n)