- Why learn data structures?
In the computer world, data is everywhere. Data structures are the study of how data is organized and stored in the computer so that we can efficiently access and modify it.
Some of you may ask: what is the point of learning this stuff? Job interviews? Exams? In today's development world, with frameworks everywhere, big blog sites, and open source projects on GitHub, development has become easier and easier; with copy and paste in hand, it seems you can go anywhere without fear.
In my opinion, that view does make some sense. If you just want to build a product with existing tools, then data structures really are of little use to you; what matters is the product logic, the design patterns, the interaction design, and so on. But if you want to become an expert, you need data structures.
Why? Take a look at job boards such as BOSS Zhipin or Lagou: algorithms and data structures appear more and more often in job descriptions for positions paying over 20K a month, and in postings at the big tech companies. After all, the bigger the company, the more it needs its developers to have a solid foundation, because they are building the programs and tools of the next generation.
So learning data structures well will raise your technical ceiling and get you far in computer science!
- A preliminary introduction
I. In terms of categories, there are three:
1. Linear structures: arrays, stacks, queues, linked lists, hash tables…
2. Tree structures: binary tree, binary search tree, AVL tree, red-black tree, Treap, Splay tree, heap, Trie, segment tree, K-D tree, union-find (disjoint set), Huffman tree…
3. Graph structure: adjacency matrix, adjacency list
II. In terms of difficulty:
1. The basics, frequently asked in interviews: array, stack, queue, linked list, binary search tree, heap. Personally, I think you should be able to hand-write their basic operations (a minimal stack sketch follows at the end of this section).
2. Competition-related: segment tree, Trie, union-find.
3. Harder ones that are rarely examined in interviews: balanced trees (AVL, red-black tree), hash tables, etc.
In our daily work, we need the flexibility to choose the most appropriate data structure for different applications.
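To make "hand-write the basic operations" concrete, here is a minimal sketch of an array-backed stack in Java. The class and method names (ArrayStack, push, pop, peek) are my own choices for illustration, not from any particular library, and the growth strategy is just one simple option.

// A minimal array-backed stack, written from scratch for illustration.
public class ArrayStack {
    private int[] data = new int[10]; // underlying storage
    private int size = 0;             // number of elements currently stored

    public void push(int value) {
        if (size == data.length) { // grow the array when it is full
            data = java.util.Arrays.copyOf(data, data.length * 2);
        }
        data[size++] = value;
    }

    public int pop() {
        if (size == 0) throw new RuntimeException("Stack is empty");
        return data[--size];
    }

    public int peek() {
        if (size == 0) throw new RuntimeException("Stack is empty");
        return data[size - 1];
    }

    public boolean isEmpty() {
        return size == 0;
    }
}

Being able to write something like this from memory, and explain the cost of each operation, is roughly the level of familiarity meant above.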
- Choice of language
In this series of notes on data structures, Java 8 is the main language. Some may ask, can I use Python or JavaScript instead? Absolutely, no problem. However, while scripting languages are fine for learning how data structures work, they are less suitable for examining their performance: the running time depends not only on the logic, but also heavily on how the interpreter handles different styles of writing.
In Python, for example, writing something the pythonic way can often matter more than the logic itself:
# Version 1: build the list with an explicit loop and append.
arr = []
for i in range(100000):
    arr.append(i)

# Version 2: the same logic as a list comprehension, typically noticeably faster.
arr = [i for i in range(100000)]
So for scripting languages, it is important to verify the performance of data structures not only by looking at the logic but also by looking at how the logic is written.
- Big O notation
Big O describes the relationship between the running time of an algorithm and its input data. Consider the following function:
// Sums all elements of nums by visiting each element exactly once.
public static int sum(int[] nums) {
    int sum = 0;
    for (int num : nums) {
        sum += num;
    }
    return sum;
}
The time complexity is O(n), where n is the number of elements in nums: the running time of this function grows linearly with the number of elements.
Why do we call it O(n)? Because we ignore constant factors. Big O describes the asymptotic time complexity of an algorithm as n approaches infinity, so constant factors and lower-order terms can be dropped.
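As a quick illustration, the function below (a hypothetical helper written for this note, not from any library) performs roughly n*(n-1)/2 comparisons in the nested loops plus n iterations of the outer loop; dropping the 1/2 constant and the lower-order n term, we simply call it O(n^2).

// Returns true if nums contains any duplicate value.
// The nested loops make about n*(n-1)/2 comparisons -> O(n^2).
public static boolean hasDuplicate(int[] nums) {
    for (int i = 0; i < nums.length; i++) {
        for (int j = i + 1; j < nums.length; j++) {
            if (nums[i] == nums[j]) {
                return true;
            }
        }
    }
    return false;
}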
In the field of algorithmic analysis, we usually look at the worst case.
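For example, in the linear search sketched below (again an illustrative helper, not part of the original article), the best case finds the target at index 0 in O(1), but in the worst case, when the target is absent, every one of the n elements is examined, so we describe the search as O(n).

// Linear search: returns the index of target in nums, or -1 if not found.
// Worst case (target absent) touches every element -> O(n).
public static int indexOf(int[] nums, int target) {
    for (int i = 0; i < nums.length; i++) {
        if (nums[i] == target) {
            return i;
        }
    }
    return -1;
}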
Common time complexities, ordered from least to most time consumed:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!)
Okay, that's it for today. Next time we'll start with arrays.