Introduction
A higher-order function (HOF) is a function that takes other functions as arguments and/or returns a function as its result.
JavaScript functions are first-class, which means a function can be:
- named by a variable
- passed as an argument to another function
- returned as the result of a function
In fact, we already use higher-order functions all the time without noticing, e.g. map, reduce, filter…
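As a tiny sketch of what "first-class" means in practice (the names double and adder below are made up purely for illustration):

```js
// a function named by a variable
const double = (n) => n * 2

// functions passed as arguments to map / filter / reduce
const doubled = [1, 2, 3].map(double)                    // [2, 4, 6]
const evens = [1, 2, 3, 4].filter((n) => n % 2 === 0)    // [2, 4]
const sum = [1, 2, 3].reduce((acc, n) => acc + n, 0)     // 6

// a function returned as the result of a function
function adder(a) {
  return (b) => a + b
}
const addTwo = adder(2)
console.log(doubled, evens, sum, addTwo(3)) // [ 2, 4, 6 ] [ 2, 4 ] 6 5
```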
Here, let's walk through one example in detail…
An example of finding square roots
Sister: Square root? Math.sqrt.
Me: Well, the point is to do it without Math.sqrt.
Sister: What are you trying to say?
Me: Little sister, I want to…
Sister: Old man, you're older than me and you still call me "little sister" every day. No way, you'll have to buy me milk tea first.
Me: Er… let's learn sqrt together first, then we'll talk…
Newton iteration
Let's start with a little math: Newton's iterative method (also known as the Newton-Raphson method).
Assume f(x) = 0 has a root near x0. Starting from x0, we repeatedly apply the iteration

x_(n+1) = x_n - f(x_n) / f'(x_n)

to compute x1, x2, x3…, which approach the root of the equation arbitrarily closely (if the iteration converges).
[GIF: animation of Newton's iteration converging to the root]
For those interested, see: "How to explain Newton's iteration method in an easy-to-understand way".
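Here is a quick throwaway sketch of the idea in code (newtonStep, f, and df are made-up names for illustration only; they are not used later in the article):

```js
// one step of Newton's iteration: x -> x - f(x) / f'(x)
function newtonStep(f, df, x) {
  return x - f(x) / df(x)
}

// approximate a root of f(x) = x^2 - 2, starting from x = 1
let x = 1
for (let i = 0; i < 5; i++) {
  x = newtonStep((v) => v * v - 2, (v) => 2 * v, x)
}
console.log(x) // ≈ 1.4142135623730951, essentially Math.sqrt(2)
```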
The first sqrt
What we want to compute is the square root of x, i.e. y = √x.
Squaring both sides (and keeping only the positive root), this is the same as solving f(y) = y² - x = (y - √x)(y + √x) = 0.
Take the derivative of f(y) = y² - x with respect to y: f'(y) = 2y.
Plugging these into Newton's iteration formula gives y_(n+1) = y_n - (y_n² - x) / (2·y_n) = (y_n + x / y_n) / 2; in other words, the next guess is the average of the current guess and the quotient x / guess.
Let's trace this for the square root of 2:
| Guess | Quotient (x / guess) | Average |
| --- | --- | --- |
| 1 | 2 / 1 = 2 | (2 + 1) / 2 = 1.5 |
| 1.5 | 2 / 1.5 = 1.33333 | (1.33333 + 1.5) / 2 = 1.4167 |
| 1.4167 | 2 / 1.4167 = 1.4118 | (1.4167 + 1.4118) / 2 = 1.4142 |
| 1.4142 | … | … |
Writing code!
```js
function sqrt(x, guess = 1.0) {
  // if guess is good enough, stop the calculation
  if (goodEnough(guess, x)) {
    return guess
  } else {
    return sqrt(x, improve(guess, x))
  }
}

// improve the guess value
function improve(guess, x) {
  return average(guess, x / guess)
}

// here we consider guess good enough when |guess^2 - x| < 0.001
function goodEnough(guess, x) {
  return abs(square(guess) - x) < 0.001
}

function average(num1, num2) {
  return (num1 + num2) / 2
}

function square(num) {
  return num * num
}

function abs(num) {
  if (num >= 0) {
    return num
  } else {
    return 0 - num
  }
}

let r = sqrt(9)
// 3.00009155413138
console.log(r)
```
The second sqrt
A little more math first: the fixed point of a function. A number x is called a fixed point of f if it satisfies the equation f(x) = x.
Idea: for some functions, starting from an initial guess and applying f over and over, i.e. f(x), f(f(x)), f(f(f(x)))…, until the value changes very little, we can find one of the function's fixed points.
The fixedPoint function below takes a function as an argument, so it is a higher-order function.
```js
const tolerance = 0.0001

function abs(num) {
  if (num >= 0) {
    return num
  } else {
    return 0 - num
  }
}

function closeEnough(v1, v2) {
  return abs(v1 - v2) < tolerance
}

function tryGuess(f, guess) {
  let next = f(guess)
  if (closeEnough(guess, next)) {
    return next
  } else {
    return tryGuess(f, next)
  }
}

function fixedPoint(f, firstGuess = 1.0) {
  return tryGuess(f, firstGuess)
}

// let's find the fixed point of the cosine function
// r1 is 0.7390822985224023
let r1 = fixedPoint(Math.cos)

// check it by feeding the result back into cos
// r2 is 0.7390870426953322
let r2 = Math.cos(0.7390822985224023)
```
Now let's reformulate computing the square root as a fixed-point search. Since y = √x means finding a y such that y² = x, which is equivalent to y = x / y, we just need to find a fixed point of the function y => x / y.
Unfortunately, this search does not converge. Consider an initial guess y1: the next guess is y2 = x / y1, and the one after that is y3 = x / y2 = y1, so the guesses simply oscillate back and forth. To help convergence we can apply the technique of "average damping": make the guess that follows y be (1/2)(y + x / y), i.e. look for the fixed point of y = (1/2)(y + x / y).
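To see the oscillation concretely, here is a quick throwaway snippet (not part of the final solution) that keeps applying y => x / y for x = 2:

```js
// the naive iteration of y -> x / y just bounces between two values
let x = 2
let y = 1
for (let i = 0; i < 6; i++) {
  y = x / y
  console.log(y) // prints 2, 1, 2, 1, 2, 1
}
```

With average damping the search does converge: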
```js
function sqrt(x) {
  return fixedPoint((y) => {
    return average(y, x / y)
  })
}
```
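A quick check (assuming the average and fixedPoint functions defined above are in scope):

```js
console.log(sqrt(2)) // ≈ 1.41421356..., close to Math.sqrt(2)
```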
Average damping is a generally useful technique, so let's define it as a function in its own right.
It takes one function as an argument and returns another function
```js
// averageDamp takes a function f and returns a new function
// whose value at x is the average of x and f(x)
function averageDamp(f) {
  return (x) => {
    return average(x, f(x))
  }
}
```
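To get a feel for what averageDamp returns (using the square and average helpers from the first sqrt section):

```js
// averageDamp(square) is the function x -> average(x, x * x)
let damped = averageDamp(square)
console.log(damped(10)) // average(10, 100) = 55
```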
Redo the square root with averageDamp
```js
function sqrt(n) {
  return fixedPoint(averageDamp((y) => {
    return n / y
  }))
}
```
The third sqrt
Now let's go back to Newton's iteration. If g(x) is differentiable, then a solution of g(x) = 0 is a fixed point of f(x) = x - g(x) / g'(x): at a root r we have g(r) = 0, so f(r) = r - 0 / g'(r) = r.
Now let's express Newton's method as a function. First we define the derivative, approximating it numerically with a small dx:
```js
const dx = 0.0001

// numerical approximation of the derivative: (g(x + dx) - g(x)) / dx
function deriv(g) {
  return (x) => {
    return (g(x + dx) - g(x)) / dx
  }
}
```
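For example, the derivative of x³ at 5 is exactly 3·5² = 75, and the numerical version comes close (quick check):

```js
let cube = (x) => x * x * x
console.log(deriv(cube)(5)) // ≈ 75.0015, slightly off because dx is finite
```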
Newton’s method can now be expressed as a process for finding fixed points
```js
// Newton's transform: x -> x - g(x) / g'(x)
function newtonTransform(g) {
  return (x) => {
    return x - g(x) / deriv(g)(x)
  }
}

function newtonMethod(g, guess = 1.0) {
  return fixedPoint(newtonTransform(g), guess)
}
```
So sqrt can be defined as:
```js
function sqrt(x) {
  return newtonMethod((y) => {
    return square(y) - x
  })
}
```
Abstraction
In fact, we have now described square roots in two ways: one uses fixed-point search, the other uses Newton's method; and Newton's method itself is also a fixed-point computation.
So let's abstract that common pattern into a function as well:
```js
function fixedPointOfTransform(g, transform, guess = 1.0) {
  return fixedPoint(transform(g), guess)
}
```
Then the second sqrt can be rewritten as:
```js
function sqrt(x) {
  return fixedPointOfTransform((y) => {
    return x / y
  }, averageDamp)
}
```
And the third sqrt can be rewritten as:
```js
function sqrt(x) {
  return fixedPointOfTransform((y) => {
    return square(y) - x
  }, newtonTransform)
}
```
Summary
Me: Come on, let's go buy that milk tea.
Sister: No thanks, my head hurts…
Today, through one example, we learned what higher-order functions are and how to use them for abstraction.
Supplement: this example is taken from a classic computer science textbook from the last century… I'll leave that as a little mystery. You can also play down the math (you don't have to follow the derivations) and just focus on the programming. The example still leaves things for the reader to ponder. For instance, the last section was called "Abstraction", but extracting "average damping" and "Newton's method" earlier was already abstraction, right? So exactly when to abstract, and whether to abstract as aggressively as possible, are questions worth thinking about.