Buried pits
For those of you who don't know, JavaScript (ECMAScript) was developed in 10 days by the guy pictured above, Brendan Eich. So, R&D colleagues, the next time your PM asks you to ship yet another project in 10 days and you think he's crazy, spare a thought for this old-timer.
There is an interview with him on YouTube where he talks about his career, and reportedly this piece of work helped get him promoted to CTO of Mozilla. So writing a few more technical topics into your OKRs is not a bad idea.
This is where the code monkey and the great god part ways: hand either of them a 10-day task like this, and the code monkey dies while the great god pulls it off. But anything built that fast leaves pits behind. The real difference is that the great god leaves a puddle, while the code monkey leaves a sinkhole. Here is a list of the pits:
- JS was designed from the very beginning as an interpreted language, because its target users were "non-professional programmers & designers," and the great god thought it would be cruel to make them learn about compilers. The language was pitched at roughly the level of the "Learn XXX in 21 Days" series of the time.
- And there are no static types, because learning about types (at the time) meant learning how the CPU works, which meant learning how a computer is put together. It would have been cruel to make "non-professional programmers and designers" understand that 1 and 1.0 go through different hardware: the integer unit versus the FPU.
- Classes, generics, default parameters, operator overloading, and so on, all standard features of mature programming languages, were simply missing.
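To make the "no types" pit concrete, here is a minimal JavaScript sketch (my own illustration, not from the original article): JS has exactly one number type, a 64-bit double, and a variable happily changes type without complaint.

```javascript
// In JS there is only one number type: a 64-bit IEEE-754 double.
// So the integer 1 and the float 1.0 are literally the same value.
console.log(1 === 1.0); // true

// Variables carry no type: the same binding can hold a number,
// then a string, and the engine will not complain.
let x = 42;      // x holds a number
x = "forty-two"; // now x holds a string
console.log(typeof x); // "string"
```

This is exactly the flexibility that "non-professional programmers" enjoyed, and exactly what would later give the JIT a headache.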
If the project had stopped there, this wouldn't be a sad story. The great god hit his KPI in 10 days, the boss was pleased, a raise and a promotion followed, and from then on he no longer had to lick the yogurt lid clean; he could dip his meals in white sugar and brown sugar both. The problem is that the market's reaction somewhat exceeded the expectations of both the great god and his leader. JavaScript became popular everywhere, no small feat in the 1990s when there was no Douyin to go viral on. The major browser vendors added JS support one after another, and before long it was the de facto standard language of the browser, killing off VBScript along the way.
Suddenly, without warning, a language for "non-professional programmers" had become one of the most important languages on the Internet, used to build projects that would previously have been unthinkable. And now things got awkward: the little puddle the great god had left behind, never filled in, was trampled by the whole Internet into a sinkhole. As a result, the history of JS over the following years is a blood-and-tears history of pit filling, one that also let a crowd of professional pit-filling tech celebrities earn a fortune.
Of the many sinkholes that appeared, the biggest is the front-end performance pit. Come along, fellow tourists, let us take a tour around it…
Performance pit filling: WW1
To be fair, this isn't really his fault: JavaScript was never designed to be fast when he wrote its first line in 1995 (when I was in second grade), and as an interpreted language it was never going to be. For the first decade or so, its slowness didn't matter much (15 seconds or 30 seconds… hmm…). Then the browser wars heated up and web applications rapidly grew rich and complex. If you built a flash-sale system on such a slow foundation, your one-second sale could easily become a ten-second sale, which obviously nobody could tolerate. So a bunch of smart programmers inside Google came up with an idea: sure, JavaScript is a scripting language, but nothing stops us from compiling it on the sly. Don't tell the "non-professional programmers and designers"; just before a piece of code is about to run, quietly compile it.
So in the V8 engine (shipped with Chrome in 2008), Google introduced JIT (Just-In-Time compilation). With this killer feature, JavaScript execution speed instantly increased 20-40x. Here is a borrowed chart illustrating this milestone:
Node.js was born at this point in history; some people even wrote operating systems that run in the browser. Google established itself as the browser leader, and still leads the way in front-end performance.
But demand in life grows without end, and the performance headroom the JIT bought was soon squeezed dry. This was easy to anticipate: the JIT is an engineering workaround, not a fix at the root; it lays a small bridge over the pit rather than filling it in. Let's analyze the JIT's problems:
- The JIT compiles based on runtime analysis, and JavaScript is weakly typed, so most of the time the JIT is playing a guessing game: it classifies code by how often it runs (warm code vs. hot code) and then speculates about the types it will see. Meanwhile, for the vast majority of front-end programmers, a variable that can hold both numbers and strings is a great development experience. Here's an everyday example:
function add (a, b) { return a + b; }
var c = add(1, 2);

After add has been called with numbers enough times, the JIT bets that a and b will always be numbers and compiles a specialized machine-code version, conceptually something like:

function add (int a, int b) { return a + b; } // pseudocode: the type-specialized version

And then along comes:

var d = add("hello", "world");

What now? The machine code that was just compiled has turned into garbage…
When this happens the JIT has to throw its work away and start over (deoptimization), and this is where the JIT's performance fluctuations come from; in extreme cases the gains are smaller than the recompilation overhead. In fact, most code never gets JIT-optimized at all: the engine just runs the bytecode directly, because JIT compilation itself takes time. Unless a function is called many times, it is not worth compiling to machine code, since compiling it may cost more than simply running the bytecode.
To sum up, the performance ceiling of the JIT is lower than one might expect; the 20-50x boost sounds impressive mostly because JS was so slow to begin with.
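The type-stability problem can be sketched in a few lines of plain JavaScript (my own illustration; the deoptimization itself happens invisibly inside the engine, so the sketch only shows the two call shapes involved):

```javascript
// The same source-level function looks very different to the engine
// depending on the argument types observed at its call sites.
function add(a, b) { return a + b; }

// Monomorphic use: this call site only ever sees numbers, so an engine
// like V8 can speculate "a and b are numbers" once the function is hot,
// and emit fast machine code for exactly that shape.
let sum = 0;
for (let i = 0; i < 10000; i++) {
  sum = add(sum, 1);
}
console.log(sum); // 10000

// A single call with a new type shape invalidates that speculation:
// the engine must deoptimize and fall back to the generic path.
console.log(add("hello", "world")); // "helloworld"
```

The language happily accepts both calls; the cost of that flexibility is paid inside the engine.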
Performance pit filling: WW2
Through WW1 we found that the ceiling the JIT ran into comes from type uncertainty, plus some language features that are troublesome to JIT (exceptions, for…in, and so on). The problem itself points at the solution: build something that nails down the types. Following this line of thinking, two implementation paths emerged:
- One is represented by Google's approach, and the basic idea is: let me make another language, strongly typed this time. Front-end programmers, stop thinking loose typing is cool; you will specify the type, full stop.
So in 2011 Google shipped a technology in Chrome called NaCl, short for Native Client. The principle: the browser provides a sandbox, and you write your program in C/C++ to run safely inside it. Read back over that and the idea sounds strange, but it is effective: if JS performance is no good, then have the browser support something else that performs. It's your own browser, after all; you can do what you want with it.
As shown in the figure below, the structure of a standard NaCl application is very similar to that of an ordinary JavaScript web application. The NaCl module handles the complex data processing and computation, while JavaScript handles the interaction logic between the application and the user. NaCl instances and JavaScript code communicate with each other through a subscribe/publish (message-passing) model.
The ideal is plump, but the reality is bony enough to make you shiver. One big problem with this technology: the toolchain is complex and platform-dependent, and even though you can experiment with it in your own browser, it is, embarrassingly, not a standard.
To use NaCl you first have to compile your code locally, producing a separate build for each target architecture (x86_32 / x86_64 / ARM, and so on). Second, because of this heavy platform dependency, modules compiled with NaCl could only be distributed through Chrome's app store. And if you wanted to take an existing C/C++ code base, compile it to NaCl, and use it in the browser, you also had to rewrite that code against a library called Pepper. Pepper provides a set of wrapper types and APIs for interacting with the browser, such as PP_Bool; these APIs and special types are what let traditional C/C++ code integrate with the sandbox environment of a web browser.
But the experimenting didn't stop there; Google hadn't hit the wall yet, so it followed up with NaCl 2.0 (let's call it that), named PNaCl, where the P stands for Portable. PNaCl has a different lifecycle: as the figure below shows, PNaCl compiles C/C++ source into an intermediate representation instead of putting architecture-specific code directly into the module. That intermediate code is then translated into platform-specific code when the browser actually loads the PNaCl module. So for PNaCl modules, distribution becomes simpler; there is no need to worry about portability.
But even for a technology like PNaCl, where "portability" was no longer an issue, there were still mountains to climb. For example: C/C++ code still had to be rewritten against Pepper, the standard was closed, it was supported only by Chrome, and so on.
In short, both NaCl and PNaCl are things of the past. If you visit NaCl/PNaCl's official documentation site at Google today, you'll find a statement to this effect: Wasm is the next-generation technology that will take over and carry on the mission Google gave to NaCl/PNaCl.
- The other path is Firefox's asm.js, which defines a strict subset of JS (whereas TS is a strict superset), and uses annotation-like coercion idioms to pin down the types of variables:
function asm (stdin, foreign, heap) {
"use asm";
function add (x, y) {
x = x|0; // the variable x stores a value of type int;
y = y|0; // y stores a value of type int;
var addend = 1.0, sum = 0.0; // The addend and sum variables hold "double precision floating point" values by default;
sum = sum + x + y;
return +sum; // The function returns a "double-precision floating point" type;
}
return { add: add };
}
Since asm.js is a subset of JS, on older browsers asm.js code simply runs as normal JS code, which guarantees browser compatibility.
When the JavaScript engine detects that the conditions are met, it AOT-compiles the marked asm.js code into machine code ahead of time and caches it. When the engine executes that asm.js code (even the first time), it uses the previously stored machine-code version directly, so engine performance improves dramatically. Here's how asm.js compares with JIT and native performance:
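Because an asm.js module is plain JavaScript, you can link and call it directly. Here is a minimal usage sketch (my own, slightly simplified from the module above; the heap buffer size is an arbitrary assumption, since this add function never touches the heap):

```javascript
// The asm.js module, reproduced so this sketch is self-contained.
function asm (stdlib, foreign, heap) {
  "use asm";
  function add (x, y) {
    x = x|0;        // x holds an int
    y = y|0;        // y holds an int
    var sum = 0.0;  // sum holds a double
    sum = sum + x + y;
    return +sum;    // the function returns a double
  }
  return { add: add };
}

// Linking: an asm.js module takes (stdlib, foreign imports, heap buffer).
// If the engine's asm.js validation fails, this still runs as ordinary JS.
var mod = asm(globalThis, {}, new ArrayBuffer(0x10000));
console.log(mod.add(1, 2)); // 3
```

That graceful fallback to ordinary JS execution is exactly the compatibility story described above.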
The birth of WebAssembly
It's May 2015. Ben Titzer on the Chrome team is working on a new prototype for V8, and Andreas Rossberg, another team member, is working on a bytecode format for that prototype.
In fact, this prototype and its corresponding bytecode format are what became the WAT human-readable text format and the binary bytecode format that Wasm has today. Inside Google at the time, the two parts were provisionally called ml-proto and the V8-native prototype. As the V8 team refined and polished them, they became part of the early Wasm standard.
At the same time, there was an internal tool called sexpr-wasm used to convert between the two formats. As Wasm became standardized, it evolved into part of the common Wasm debugging toolkit known as WABT.
The Chrome V8 team had been involved in developing both PNaCl and asm.js, and drew on the strengths and weaknesses of those two technologies when designing and implementing Wasm. That experience helped Wasm steer around the potholes its predecessors had fallen into, and ultimately let Wasm arrive in front of people in much better shape.
On December 5, 2019, the World Wide Web Consortium (W3C) announced that the WebAssembly core specification had become an official standard, making it the fourth web language, after HTML, CSS, and JavaScript, whose code runs natively in the browser.
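As a small taste of that standard from the JavaScript side, here is a sketch (my own, not from the article): the hand-assembled binary of a Wasm module exporting an add(i32, i32) -> i32 function, instantiated synchronously through the standard WebAssembly JS API.

```javascript
// Hand-assembled binary for: (module
//   (func (export "add") (param i32 i32) (result i32)
//     local.get 0  local.get 1  i32.add))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d,                                // magic: "\0asm"
  0x01, 0x00, 0x00, 0x00,                                // version: 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,  // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                                // function section: func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,  // export section: "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                          // code section: 1 body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                    // local.get 0; local.get 1; i32.add; end
]);

const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // 5
```

In real projects the bytes come from a compiler toolchain rather than being written by hand; the point here is just that the binary format and its JS API are now standard in every major browser and in Node.js.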
Summary: that covers these chapters of front-end performance history, perhaps not too rigorously. In the next article we'll dig into the principles of WebAssembly; for the rest, tune in next time to hear the breakdown. Bye-bye!
References:
- Introduction to WebAssembly, by Yu Hang
- Parts of A Cartoon Intro to WebAssembly