Translated by: Crazy Geek
Original article: medium.com/the-node-j….
Introduction
The rise of JavaScript has brought many changes, and the landscape of Web development today is radically different. A few years ago it would have been hard to imagine running JavaScript on a server.
Before delving into Node.js, you might want to understand the benefits of using JavaScript across the stack, which unifies the language and the data format (JSON) and lets you reuse developer resources in the best possible way. Incorporating Node.js into the technology stack is a key advantage of this approach.
Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. It's worth noting that Ryan Dahl, the creator of Node.js, was aiming to build websites with real-time push capabilities, "inspired by applications like Gmail." To that end, Node.js provides a tool for non-blocking, event-driven I/O.
In a word: Node.js shines in real-time Web applications that use WebSockets push technology. After more than 20 years of stateless Web applications built on the stateless request-response pattern, we finally have Web applications with real-time, two-way connections, where both the client and the server can initiate communication and exchange data freely.
This is in stark contrast to the typical Web request-response pattern, where communication is always initiated by the client. And it is all based on the open Web stack (HTML, CSS, and JS) running over the standard port 80.
One could argue that we've been doing this for years in the form of Flash and Java applets, but in reality those were just sandboxed environments using the Web as a transport protocol to deliver data to the client. In addition, they ran in isolation, often on non-standard ports, which might have required extra permissions.
With these advantages, Node.js now plays a critical role in the technology stacks of many high-profile companies that depend on its unique benefits. The Node.js Foundation has consolidated nearly all the best arguments into a short presentation on why enterprises should consider Node.js, which can be found on the Foundation's case studies page.
In this article, I'll discuss not only how these advantages come about, but also why you might want to use Node.js, using some classic Web application models as examples.
How does it work?
The main idea behind Node.js is to use non-blocking, event-driven I/O to be lightweight and efficient for data-intensive, real-time applications running across distributed devices.
It’s a mouthful.
This means that Node.js is not a silver-bullet new platform that will dominate the Web development world. Rather, it is a platform that fills a specific need, and understanding this is absolutely essential. You definitely don't want to use Node.js for CPU-intensive operations; in fact, using it for heavy computation will annul nearly all of its advantages. Where Node.js really shines is in building fast, scalable network applications, because it can handle a huge number of concurrent connections with high throughput, which equates to high scalability.
How it works under the hood is quite interesting. Whereas traditional Web-serving techniques create a new thread for each connection (request), eating up system RAM and eventually maxing out at the amount of memory available, Node.js operates on a single thread, using non-blocking I/O calls, which allows it to support tens of thousands of concurrent connections held in the event loop.
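To make the contrast concrete, here is a minimal sketch comparing a blocking and a non-blocking read with Node's built-in fs module (the file path is just a placeholder). While the synchronous call holds the single thread, the asynchronous one hands the work to the OS and frees the thread for other connections:

```js
const fs = require('fs');

// Blocking: the thread waits here; no other request can be served meanwhile.
const data = fs.readFileSync('/path/to/large-file.json', 'utf8');
console.log('sync read finished');

// Non-blocking: the read is handed off to the OS and the callback runs when it
// completes, so the single thread stays free to handle other connections.
fs.readFile('/path/to/large-file.json', 'utf8', (err, contents) => {
  if (err) {
    console.error('read failed:', err);
    return;
  }
  console.log('async read finished, %d characters', contents.length);
});

console.log('this line runs before the async read completes');
```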
A quick calculation: assuming each thread potentially requires 2 MB of memory, a system with 8 GB of RAM could theoretically handle a maximum of about 4,000 concurrent connections (a figure taken from Michael Abernethy's "Just what is Node.js?", published on IBM developerWorks in 2011; unfortunately, the article is no longer available), and that's before counting the cost of context switching between threads. That is the scenario you typically face with traditional Web-serving techniques. By avoiding all of that, Node.js achieves scalability levels of over 1M concurrent connections, and over 600k concurrent WebSocket connections.
Of course, there is a catch to writing Node.js applications: all client requests share a single thread. First, heavy computation can block Node's single thread and cause problems for every client (more on that later), because incoming requests are held up until the computation completes. Second, developers need to be very careful not to let an exception bubble up to the core (topmost) Node.js event loop, which would terminate the Node.js instance (effectively crashing the program).
To avoid exceptions bubbling up to the top, a common technique is to pass errors back to the caller as callback parameters (rather than throwing them, as in other environments). Even if some unhandled exception does bubble up to the top, tools exist to monitor the Node.js process and perform the necessary recovery from a crash (although you probably won't be able to restore the current state of a user session); the most common of these is the Forever module.
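A minimal sketch of that error-first callback convention, using a hypothetical readConfig helper and config path purely for illustration:

```js
const fs = require('fs');

// Error-first callback convention: the error (if any) is the first argument,
// so failures are passed back to the caller instead of being thrown.
function readConfig(path, callback) {
  fs.readFile(path, 'utf8', (err, raw) => {
    if (err) return callback(err);      // hand the error back, don't throw
    let config;
    try {
      config = JSON.parse(raw);         // parse errors are caught locally...
    } catch (parseErr) {
      return callback(parseErr);        // ...and also passed back to the caller
    }
    callback(null, config);
  });
}

// './config.json' is a hypothetical file used only for this example.
readConfig('./config.json', (err, config) => {
  if (err) {
    console.error('could not load config:', err.message);
    return;
  }
  console.log('loaded config:', config);
});
```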
NPM: Node package manager
One thing that should never be overlooked when discussing Node.js is support for package management using the built-in NPM tool, which is installed in every Node.js environment by default. The concept of an NPM module is very similar to Ruby Gems: a set of reusable components with version and dependency management that can be easily installed through an online repository.
A complete list of packaged modules can be found on the NPM web site or accessed using the NPM CLI tool that automatically installs with Node.js. The module ecosystem is open to everyone, and anyone can publish their own modules, which will appear in the NPM repository. For an introduction to NPM, see the Beginner’s Guide and the section on publishing modules in the NPM publishing tutorial.
Some useful NPM modules are:
- Express — Express.js, a Sinatra-inspired Node.js Web development framework, is the de facto standard for most Node.js applications today.
- Hapi — a modular and very easy to use configuration-centric framework for building Web and service applications
- Connect — an extensible HTTP server framework for Node.js that provides a collection of high-performance "plugins" known as middleware; it serves as the foundation for Express.
- Socket.IO and SockJS — two of the most common server-side WebSockets components today.
- Pug (formerly Jade) – one of the popular HAML-inspired templating engines, and the default option in Express.js.
- mongodb and mongojs — MongoDB wrappers that provide APIs for MongoDB object databases in Node.js.
- redis — a Redis client library.
- Forever – probably the most common utility for ensuring that a given Node script runs continuously, keeping your Node.js process up in production in the face of unexpected failures.
- Bluebird — a full-featured Promises/A+ implementation with exceptionally good performance.
- Moment – A lightweight JavaScript date library for parsing, validating, manipulating, and formatting dates.
The list goes on and on. There are tons of useful packages available to everyone.
When should Node.js be used
Online chat
Live chat is the most typical real-time, multi-user application, and it is where Node.js is at its best: a lightweight, high-traffic, data-intensive (but low-processing/computation) application that runs across distributed devices. It is also a great case study because it is simple, yet it covers most of the paradigms you would use in a typical Node.js application.
Let’s try to picture how it works.
In the simplest scenario, we have a chat room on our website where people can exchange messages on a one-to-many (actually, to everyone) basis.
On the server side, we have a simple Express.js application that implements two things: 1) a GET request handler that serves the web page containing the message board and a "Send" button to initialize new message input, and 2) a WebSocket server that listens for new messages sent by WebSocket clients.
On the client side, we have an HTML page with a couple of handlers set up: one for the "Send" button click event, which picks up the input message and sends it down the WebSocket, and another that listens for new incoming messages on the WebSocket client (that is, messages sent by other users, which the server now wants the client to display).
When one of the clients posts a message, the following happens:
- The browser catches the "Send" button click through a JavaScript handler, picks up the value from the input field (that is, the message text), and emits a WebSocket message using the WebSocket client connected to our server (initialized when the web page loaded).
- The server-side component of the WebSocket connection receives the message and forwards it to all the other connected clients using a broadcast.
- All clients receive the new message via the WebSocket client component running in their web page, and then update the page by appending the new message to it.
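A minimal server-side sketch of this flow, assuming Express plus the ws package from npm and a hypothetical chat.html page (this is an illustration of the pattern, not the exact code behind the example above):

```js
const express = require('express');
const WebSocket = require('ws');   // the "ws" package from npm
const path = require('path');

const app = express();
// 1) GET handler that serves the chat page (message board + "Send" button).
app.get('/', (req, res) => res.sendFile(path.join(__dirname, 'chat.html')));

const server = app.listen(3000, () => console.log('listening on :3000'));

// 2) WebSocket server attached to the same HTTP server.
const wss = new WebSocket.Server({ server });

wss.on('connection', (socket) => {
  socket.on('message', (message) => {
    // Forward the incoming message to every other connected client.
    wss.clients.forEach((client) => {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    });
  });
});
```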
This is the simplest example. For a more robust solution, you might use a simple cache based on Redis. In an even more advanced solution, a message queue could handle message routing, and a more reliable delivery mechanism could be implemented, for example one that stores messages while a connection is lost or a client is offline. But regardless of the improvements you make, Node.js will operate on the same basic principles: reacting to events, handling many concurrent connections, and keeping the user experience fluid.
An API on top of an object database
While Node.js really shines with real-time applications, it is also a natural fit for exposing data from object databases such as MongoDB. Data stored as JSON lets Node.js work with objects that match the stored data, without impedance mismatch and without data conversion.
For example, if you're using Rails, you would convert from JSON to binary models, and then expose them back as JSON over HTTP when the data is consumed by React.js, Angular.js, or even plain jQuery AJAX calls. With Node.js, you can simply expose your JSON objects to the client through a REST API. Additionally, you don't need to worry about converting between JSON and anything else when reading from or writing to the database (if you're using MongoDB). In sum, using a uniform data serialization format across the client, server, and database avoids the hassle of multiple conversions.
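A minimal sketch of such an API, assuming a local MongoDB instance, the official mongodb driver, and Express (the database and collection names are just placeholders); the documents returned by the query go straight to the client as JSON:

```js
const express = require('express');
const { MongoClient } = require('mongodb');   // the official "mongodb" driver

const app = express();
const client = new MongoClient('mongodb://localhost:27017'); // assumed local instance

async function main() {
  await client.connect();
  const users = client.db('demo').collection('users');

  // The documents are already JSON-like, so they go out to the client
  // with no ORM mapping or extra serialization step in between.
  app.get('/api/users', async (req, res) => {
    res.json(await users.find({}).toArray());
  });

  app.listen(3000);
}

main().catch(console.error);
```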
Queued inputs
If you're receiving a high volume of concurrent data, your database can become a bottleneck. As noted above, Node.js can easily handle the concurrent connections themselves. But because database access is (in this case) a blocking operation, we run into trouble. The solution is to acknowledge the client's behavior before the data is actually written to the database.
With that approach, the system stays responsive under heavy load, which is especially useful when the client doesn't need firm confirmation that the data was written successfully. Typical examples include logging or writing user-tracking data, which is processed in batches, and operations that don't need to be reflected instantly, where eventual consistency (so often used in the NoSQL world) is acceptable (such as updating a "Likes" count on Facebook).
Data gets queued through some kind of caching or message-queuing infrastructure (for example, RabbitMQ or ZeroMQ) and is digested by a separate database batch-write process, or by computation-intensive back-end services written on a platform better suited to such tasks. Similar behavior can be implemented in other languages and frameworks, but not on the same hardware with the same high, sustained throughput.
In short: with Node, you can push database writes off to the side and deal with them later, proceeding as if they had already succeeded.
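A minimal sketch of the idea, using an in-memory array as a stand-in for a real broker such as RabbitMQ and a hypothetical writeBatchToDatabase helper: the client gets an immediate acknowledgment, and a separate loop flushes the queue in batches:

```js
const express = require('express');

const app = express();
app.use(express.json());

const queue = [];   // stand-in for a real broker such as RabbitMQ or Redis

// Acknowledge the client immediately; the actual write happens later.
app.post('/track', (req, res) => {
  queue.push(req.body);
  res.status(202).json({ status: 'queued' });   // 202 Accepted: "we'll get to it"
});

// A separate batch-writer drains the queue periodically.
setInterval(async () => {
  if (queue.length === 0) return;
  const batch = queue.splice(0, queue.length);
  await writeBatchToDatabase(batch);
  console.log('flushed %d events', batch.length);
}, 5000);

// Hypothetical bulk-insert helper; swap in your real database call.
async function writeBatchToDatabase(batch) {
  console.log('pretending to bulk-insert %d rows', batch.length);
}

app.listen(3000);
```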
Data streaming
On more traditional Web platforms, HTTP requests and responses are treated as isolated events, when in fact they are streams. This observation can be exploited in Node.js to build some cool features. For example, files can be processed while they are still being uploaded: because the data arrives through a stream, we can process it in real time. This works for real-time audio or video encoding, and for proxying between different data sources (see the next section).
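A minimal sketch of streaming an upload, hashing it chunk by chunk with Node's built-in http and crypto modules instead of buffering the whole file first (the route and the choice of hashing are just illustrative):

```js
const http = require('http');
const crypto = require('crypto');

// The request object is a readable stream, so an upload can be processed
// chunk by chunk while it is still arriving, instead of being buffered whole.
http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/upload') {
    const hash = crypto.createHash('sha256');
    let bytes = 0;

    req.on('data', (chunk) => {
      bytes += chunk.length;
      hash.update(chunk);        // e.g. hashing, transcoding, or piping onward
    });

    req.on('end', () => {
      res.end(JSON.stringify({ bytes, sha256: hash.digest('hex') }));
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(3000);
```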
Proxy
Node.js is easily employed as a server-side proxy, where it can handle a large number of simultaneous connections in a non-blocking manner. This is especially useful for proxying different services with different response times, or for collecting data from multiple sources.
An example: a server-side application that communicates with third-party resources, pulls in data from different sources, or stores assets such as images and videos to third-party cloud services.
Although dedicated proxy servers do exist, Node can be helpful if your proxying infrastructure is non-existent or if you need a solution for local development.
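A minimal sketch of a pass-through proxy built only on Node's http module, with a hypothetical upstream service on localhost:8080; both the request and the response are piped as streams, so nothing is buffered in memory:

```js
const http = require('http');

// Assumed backend service; adjust host/port to your own setup.
const UPSTREAM = { host: 'localhost', port: 8080 };

// A tiny pass-through proxy: the incoming request is piped to the upstream
// service and its response is piped straight back to the client.
http.createServer((clientReq, clientRes) => {
  const upstreamReq = http.request(
    {
      ...UPSTREAM,
      path: clientReq.url,
      method: clientReq.method,
      headers: clientReq.headers,
    },
    (upstreamRes) => {
      clientRes.writeHead(upstreamRes.statusCode, upstreamRes.headers);
      upstreamRes.pipe(clientRes);
    }
  );
  clientReq.pipe(upstreamReq);
}).listen(3000);
```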
Brokerage trading software
Let's get back to the application level. Another example that could easily be replaced with a real-time Web solution is brokers' trading software, which is used to track stock prices, perform calculations and technical analysis, and create graphs and charts.
Switching to a real-time, Web-based solution would let brokers easily change workstations or work locations. Soon, we might start seeing them on the beach in Florida…
Application monitoring dashboard
Another common use case where Node-with-WebSockets fits perfectly: tracking website visitors and visualizing their interactions in real time. You can gather real-time stats from your users, or even open a communication channel for targeted interaction with a visitor when they reach a specific point in your funnel. For one example of this, see CANDDi.
Imagine how you could improve your business if you could see what your visitors were doing in real time. This can now be done using Node.js real-time bidirectional sockets.
System monitoring dashboard
Now for the infrastructure side of things. Consider, for example, a SaaS provider that wants to offer its users a service-monitoring page (such as the GitHub status page). With the Node.js event loop, we can create a powerful Web-based dashboard that checks the services' statuses asynchronously and pushes data to clients using WebSockets. Both internal and public service statuses can be reported live and in real time with this technology.
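A minimal sketch of that idea, assuming the ws package and a couple of hypothetical health-check URLs: the server polls each service asynchronously and broadcasts the results to every connected dashboard client:

```js
const WebSocket = require('ws');
const https = require('https');

const wss = new WebSocket.Server({ port: 8081 });

// Hypothetical list of services to watch; replace with your own endpoints.
const services = ['https://status.example.com/api', 'https://example.com/health'];

// Poll each service asynchronously and push the result to every dashboard client.
setInterval(() => {
  services.forEach((url) => {
    https.get(url, (res) => {
      broadcast({ url, up: res.statusCode === 200, checkedAt: Date.now() });
      res.resume();   // drain the response body
    }).on('error', () => broadcast({ url, up: false, checkedAt: Date.now() }));
  });
}, 10000);

function broadcast(update) {
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(JSON.stringify(update));
  });
}
```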
Note: Do not try to build hard real-time systems (that is, systems that require consistent response times) in Node.js. For that type of application, Erlang might be a better choice.
When can you use Node.js
Server-side Web applications
Node.js with Express.js can also be used to create classic Web applications on the server side. There are pros and cons to this approach. Here are some things to consider:
Advantages:
- If your application doesn't have any CPU-intensive computation, you can build it in JavaScript top to bottom, even down to the database level if you use a JSON-storage object database like MongoDB. This eases development significantly.
- Crawlers receive a fully rendered HTML response, which is far more SEO-friendly than a single-page application or a WebSockets app running on top of Node.js.
Disadvantages:
- Any CPU-intensive computation will prevent Node.js from responding, so a threaded platform is a better approach.
- It’s still very difficult to use Node.js with a relational database (see more details below). If you want to work with a relational database, choose another environment such as Rails, Django, or ASP.Net MVC.
An alternative for dealing with CPU-intensive computation is to create a highly scalable, MQ-backed environment with back-end processing, keeping Node as a front-facing "clerk" that handles client requests asynchronously.
When should you not use Node.js
Server-side Web applications with relational databases
For example, comparing Node.js + Express.js with Ruby on Rails, the latter is clearly more appropriate when it comes to relational data access.
Relational database tools for Node.js are still fairly immature compared to the competition. Rails, on the other hand, provides data-access setup out of the box, plus database schema migration support and other Gems. Rails and similar frameworks have mature, proven implementations of the Active Record or Data Mapper data-access layers, and good luck trying to replicate those capabilities in pure JavaScript.
However, if you really prefer to do everything in JS, check out Sequelize and Node ORM2.
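For a taste of what that looks like, here is a minimal sketch assuming Sequelize's v6-style API and an in-memory SQLite database for brevity (the model and data are made up for the example):

```js
const { Sequelize, DataTypes } = require('sequelize');

// In-memory SQLite for the sketch (requires the sqlite3 package);
// in practice this would point at Postgres, MySQL, etc.
const sequelize = new Sequelize('sqlite::memory:');

const User = sequelize.define('User', {
  name: DataTypes.STRING,
  email: DataTypes.STRING,
});

async function main() {
  await sequelize.sync();                                     // create the table
  await User.create({ name: 'Ada', email: 'ada@example.com' });
  const users = await User.findAll({ where: { name: 'Ada' } });
  console.log(users.map((u) => u.toJSON()));
}

main().catch(console.error);
```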
It is perfectly fine, and not uncommon, to use Node.js only as a public-facing interface while a Rails back end handles access to the relational database.
Heavy server side computation and processing
When it comes to heavy computation, Node.js is not the best platform around. You definitely don't want to build a Fibonacci computation server in Node.js. In general, any CPU-intensive operation annuls all the throughput benefits Node offers through its event-driven, non-blocking I/O model, because any incoming requests will be blocked while the thread is occupied with number crunching.
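To see why, here is a deliberately bad sketch: a naive recursive Fibonacci endpoint. While one request is being computed, every other connection to this server simply waits:

```js
const http = require('http');

// Naive recursive Fibonacci: purely CPU-bound, nothing to hand off to the OS.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

http.createServer((req, res) => {
  // While fib(42) runs (easily a second or more), the single thread is busy,
  // so every other incoming request sits in the queue until it finishes.
  res.end(String(fib(42)));
}).listen(3000);
```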
As mentioned earlier, Node.js is single-threaded and uses only a single CPU core. To add concurrency on multi-core servers, the Node core team has already done some work in the form of the cluster module. You can also easily run several Node.js server instances behind a reverse proxy such as Nginx.
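A minimal sketch of the cluster module, forking one worker per CPU core and restarting workers that die (on older Node versions the check is cluster.isMaster rather than isPrimary):

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {
  // Fork one worker per CPU core; the primary process only manages the workers.
  for (let i = 0; i < os.cpus().length; i++) cluster.fork();

  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, starting a new one`);
    cluster.fork();   // simple self-healing
  });
} else {
  // Each worker runs its own event loop and shares the listening port.
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}
```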
Even with clustering, you should still offload all heavy computation to background processes written in a more suitable environment, and have them communicate via a message-queue server such as RabbitMQ.
Even though your background processing might initially run on the same server, such an approach has the potential for very high scalability: those background-processing services can easily be distributed to separate worker servers without any need to reconfigure the load on the front-facing Web servers.
Of course, you can use the same approach on other platforms too, but with Node.js you get the high reqs/sec throughput we have talked about, since each request is a small task handled very quickly and efficiently.
Conclusion
We have discussed Node.js from theory to practice, beginning with its goals and ambitions and ending with its sweet spots and pitfalls. When people run into problems with Node, it almost always boils down to the fact that blocking operations are the root of all evil, and 99% of those problems stem directly from misusing Node.
Remember: do not use Node.js to solve compute-scaling problems. It was designed to solve the I/O-scaling problem, and it does that really well.
So, if your application doesn't contain any CPU-intensive operations and doesn't access any blocking resources, you can exploit the benefits of Node.js and enjoy fast, scalable Web applications.