An Introduction to Algorithms
Introduction
What is an algorithm?
An algorithm is a set of instructions for completing a task.
Chapter 1 of Grokking Algorithms points out that any piece of code can be considered an algorithm, but the algorithms worth studying are usually either fast or solve interesting problems.
This is similar to a formula I was taught long ago:
Program = algorithms + data structures
A program is built from data structures and algorithms, glued together with the expressions of a programming language.
Algorithms and data structures are designed to solve real problems. In this article I'll introduce binary search, selection sort, arrays, linked lists, and recursion to take you into the world of algorithms.
Binary search
Why binary search?
It's like looking up a word in a dictionary: we don't go through it page by page,
because checking entries one by one is far too slow.
Binary search was designed to solve exactly this problem of slow lookups.
The code implementation is as follows:
```js
// Binary search on a sorted array: return the index of item, or null if it is absent
const binarySearch = (list, item) => {
  let low = 0 // low, high, and mid track the search range
  let high = list.length - 1
  let mid
  while (low <= high) {
    mid = Math.floor((low + high) / 2) // check the middle element
    if (list[mid] === item) {
      return mid // found it
    }
    if (list[mid] < item) {
      low = mid + 1 // the guess was too low: search the upper half
    }
    if (list[mid] > item) {
      high = mid - 1 // the guess was too high: search the lower half
    }
  }
  return null // the item is not in the list
}
```
Test it out:
```js
binarySearch([1, 2, 3, 4, 5], -2) // null
binarySearch([1, 2, 3, 4, 5], 2)  // 1
binarySearch([1, 2, 3, 4, 5], 5)  // 4
```
Running time
Simple search (linear time): O(n)
Binary search (logarithmic time): O(log n)
For a list of 100 numbers, binary search needs at most 7 steps.
For a list of 4 billion numbers, binary search needs at most 32 steps.
Clearly, picking the right algorithm can greatly improve a program's efficiency.
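As a quick sanity check of those two numbers: the maximum number of guesses binary search needs for a list of n items is ceil(log2(n)). A minimal sketch (the maxGuesses helper is my own illustration, not from the book):

```js
// Maximum number of binary-search guesses for a sorted list of n items
const maxGuesses = (n) => Math.ceil(Math.log2(n))

console.log(maxGuesses(100))           // 7
console.log(maxGuesses(4_000_000_000)) // 32
```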
Big O notation
We mentioned running time above; it is expressed in big O notation, which tells you how fast an algorithm is.
- The speed of an algorithm is measured not in seconds but in how the number of operations grows with the input size n.
- When we talk about the speed of an algorithm, we mean how quickly its running time increases as the input grows.
- The running time of an algorithm is expressed in big O notation.
- Big O notation describes the worst-case running time.
Common big O runtime
- O(log n): logarithmic time; such algorithms include binary search.
- O(n): linear time; such algorithms include simple search.
- O(n * log n): such algorithms include quicksort.
- O(n^2): such algorithms include selection sort.
- O(n!): such algorithms include a brute-force solution to the traveling salesman problem.
Arrays and linked lists
How memory works
To introduce arrays and linked lists, you first need to understand how memory works.
When you go swimming, you deposit your belongings in a locker and later open its door with the key matching its number.
This is similar to how computer memory works: a computer is like a collection of lockers, and each locker has an address.
There are two basic ways to store multiple pieces of data — arrays and linked lists.
The difference between arrays and linked lists
An array occupies a contiguous block of memory. The address of each element can be calculated from its index offset relative to the head of the array, so any element can be located directly by its index.
The nodes of a linked list, by contrast, may be scattered anywhere in memory; each node uses a pointer (next) to record the position of the next node.
In a linked list, each node consists of two parts: a data field and a pointer field. In JavaScript, a linked list is typically implemented as nested objects:

```js
{
  val: 1,
  next: {
    val: 2,
    next: null // ...or the next node
  }
}
```
To access any element in the list, we must start from the head node and follow next one step at a time until we reach the target node. To make sure the head node is reachable, we often keep a head pointer that points to the start of the list.
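For instance, a minimal traversal sketch (printList is my own helper, assuming nodes shaped like the object above):

```js
// Walk the list from the head node, following next until there are no more nodes
const printList = (head) => {
  let node = head
  while (node !== null) {
    console.log(node.val)
    node = node.next
  }
}
```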
To sum it up with a picture:
Creation, insertion, and deletion of linked list nodes
create
Create a linked list of 1 -> 2
```js
function ListNode(val, next) {
  this.val = (val === undefined ? 0 : val)
  this.next = (next === undefined ? null : next)
}

const node1 = new ListNode(1)
node1.next = new ListNode(2)
```
This creates a linked list whose first node (data field 1) points to a second node (data field 2).
insert
1 -> 2 becomes 1 -> 3 -> 2
```js
const node3 = new ListNode(3)
node3.next = node1.next // node3 points to the node holding 2
node1.next = node3      // node1 points to node3
```
delete
1 -> 2 -> 3 becomes 1 -> 3
```js
node1.next = node1.next.next // skip the node holding 2
```
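Putting the three operations together (a runnable sketch that reuses the ListNode constructor above):

```js
const node1 = new ListNode(1)
node1.next = new ListNode(2) // create: 1 -> 2

const node3 = new ListNode(3)
node3.next = node1.next
node1.next = node3           // insert: 1 -> 3 -> 2

node1.next = node1.next.next // delete the 3: 1 -> 2
```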
Array element type problem
The book points out:
All elements in the same array must be of the same type (all ints, all doubles, and so on).
But not necessarily in JS.
JS is special. If we define only one type of element in an array, for example:
```js
const arr = [1, 2, 3, 4]
```
It’s a purely numeric array, so it does correspond to contiguous memory.
But if we define different types of elements:
```js
const arr = ['haha', 1, { a: 1 }]
```
The memory it corresponds to is no longer contiguous. At this point a JS array loses the characteristics of a true array: under the hood, its storage is allocated via a hash map and implemented with objects.
Remember: a JS array is not necessarily a real array.
Zhihu: Explore the underlying implementation of “array” under JS V8 engine
Add, delete, and access linked lists and arrays
Array access
Because an array corresponds to a contiguous run of addresses in memory, the address of each element can be computed from its index offset from the head of the array, so access takes O(1) time.
Adding and deleting arrays
Since an array occupies contiguous memory, inserting an element requires shifting all subsequent elements back one position; likewise, deleting an element shifts all subsequent elements forward one position. Both operations therefore take O(n) time.
Access to linked lists
Because a linked list can only be traversed node by node via next, access takes O(n) time.
Adding and deleting linked lists
No matter how many nodes n the list has, as long as we know the target position for the insertion/deletion, we only need to change the pointers of the target node and its predecessor/successor, so insertion and deletion take O(1) time.
To summarize:
| Operation | Array | Linked list |
| --- | --- | --- |
| Access | O(1) | O(n) |
| Insert | O(n) | O(1) |
| Delete | O(n) | O(1) |
Selection sort
Now that we have arrays to hold our elements, let’s talk about sorting them.
Let me introduce one of the simplest sorting algorithms: selection sort.
As the name implies, each pass selects the largest (or smallest) remaining item from the list and appends it to a new list; once every item has been selected, the new list is sorted.
As an optimization, we can swap elements and sort within the original list instead of allocating a new one. The code implementation is as follows:
```js
const findSort = (list) => {
  let minIndex
  let temp
  const len = list.length // cache the array length
  for (let i = 0; i < len; i++) {
    minIndex = i // assume the current element is the smallest
    for (let j = i + 1; j < len; j++) {
      if (list[j] < list[minIndex]) {
        minIndex = j // found a smaller element: record its index
      }
    }
    if (minIndex !== i) { // if the smallest element is not the current one, swap them
      temp = list[i]
      list[i] = list[minIndex]
      list[minIndex] = temp
    }
  }
  return list
}
```
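A quick check (the test input is my own example):

```js
findSort([5, 3, 6, 2, 10]) // [2, 3, 5, 6, 10]
```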
Time complexity: O(n^2).
In practice there are many sorting algorithms. For example, JS arrays come with a built-in sort method, and whatever it uses under the hood is certainly not selection sort, because selection sort is too slow.
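For reference, Array.prototype.sort is a real built-in; note that without a comparator it compares elements as strings, so numbers need a numeric comparator:

```js
[2, 10, 5].sort((a, b) => a - b) // [2, 5, 10]
[2, 10, 5].sort()                // [10, 2, 5] (compared as strings)
```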
Other sorting algorithms will be introduced in later articles
Recursion
Why do we have recursion?
Computers don't think the way people do. A person can take in many things at a glance; a computer cannot, but it is very good at doing one simple task at a time, extremely fast.
So many problems can be broken down into small, repetitive tasks and handed to a computer.
Doing the same single task over and over is called a loop.
A function calling itself is a programming technique called recursion.
Loops and recursion
Many problems can be solved with either a loop or recursion. Loops generally perform better, while recursive programs are easier to understand.
Take the Fibonacci sequence. The recursive version:
```js
const fib = (n) => {
  if (n === 0) {
    return 0
  }
  if (n === 1) {
    return 1
  }
  return fib(n - 1) + fib(n - 2)
}
```
The loop version:

```js
const fib = (n) => {
  const arr = [0, 1]
  for (let i = 2; i <= n; i++) {
    arr[i] = arr[i - 1] + arr[i - 2]
  }
  return arr[n]
}
```
Here recursion is much worse than the loop: at around n = 40 the recursive version takes a noticeable while to finish, whereas the loop returns almost instantly.
That's because as the recursion deepens, a huge amount of work is computed over and over again.
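You can see the double counting by tallying how many times the naive recursive fib is invoked (fibCounted is my own instrumented variant, not from the original):

```js
let calls = 0
const fibCounted = (n) => {
  calls++ // count every invocation
  if (n <= 1) return n
  return fibCounted(n - 1) + fibCounted(n - 2)
}

fibCounted(20)
console.log(calls) // 21891 invocations just to compute fib(20)
```

Subproblems like fib(18) are recomputed many times instead of being reused, which is exactly where the time goes.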
Baseline conditions and recursive conditions
When writing a recursive function, you must tell it when to stop recursing, or it may run forever.
Because of this, every recursive function has two parts: the baseline condition (the termination condition) and the recursive condition.
The recursive condition is the branch where the function calls itself.
The baseline condition is the branch where the function no longer calls itself.
Take this example:

```js
const countDown = (i) => {
  console.log('i :>> ', i)
  if (i <= 1) { // baseline condition: stop recursing
    return false
  } else {
    countDown(i - 1) // recursive condition: call itself with a smaller value
  }
}
```
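Calling it:

```js
countDown(3)
// i :>>  3
// i :>>  2
// i :>>  1
```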
Stack and call stack
The stack
A stack is a simple data structure with only two operations, push and pop: the last item pushed in is the first item popped out (LIFO).
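In JS, a plain array already behaves like a stack, which is enough for a quick sketch:

```js
const stack = []
stack.push(1) // stack: [1]
stack.push(2) // stack: [1, 2]
console.log(stack.pop()) // 2 (the last item pushed is the first popped)
console.log(stack.pop()) // 1
```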
The call stack
When synchronous code runs, each function call is pushed onto the call stack, executed, and then popped off.
```js
const second = () => {
  console.log('Hello there!')
}

const first = () => {
  console.log('Hi there!')
  second()
  console.log('The End')
}

first()
```
Recursive call stack
Recursive functions also use call stacks.
Taking the Fibonacci function above as an example, when n = 2 the recursive calls play out on the stack as follows:
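The original post shows this as a picture; here is a rough text trace of the same idea (my sketch):

```js
// fib(2) is pushed onto the call stack
//   fib(1) is pushed -> baseline condition, returns 1, pops off
//   fib(0) is pushed -> baseline condition, returns 0, pops off
// fib(2) resumes     -> returns fib(1) + fib(0) = 1, pops off
```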
At runtime, each recursive call pushes a new frame for the function onto the stack, on top of the frames of the earlier calls, until a call hits the baseline condition and returns, popping off the stack. Its return value then lets the previous call finish and pop off in turn, and so on until every remaining frame on the stack has been executed.
While using the stack is convenient, it comes at a cost: storing detailed information can take up a lot of memory. Each function call takes up a certain amount of memory, and if the stack is high, it means that the computer stores a lot of information about function calls.
So be careful with recursion: it's easy to run out of memory.
Summary
This article briefly introduced some algorithm basics: binary search, big O notation, selection sort, arrays, linked lists, recursion, and the stack.
These are really my own algorithm study notes. After several years working in front end, my algorithm skills are weaker than a student's, which I'm rather ashamed of.
I will keep publishing articles in this algorithm series.