Node.js is the leading tool for creating server applications in JavaScript, the world’s most popular programming language. Offering the functionality of both a web server and an application server, Node.js is considered a key tool for all kinds of microservices-based development and delivery.

Node.js can replace or augment Java or .NET for back-end application development.

Node.js is single-threaded and uses non-blocking I/O, allowing it to scale and support tens of thousands of parallel operations. It shares these architectural features with NGINX and addresses the same C10K problem, supporting more than 10,000 concurrent connections, that NGINX was invented to solve. Node.js is well known for high performance and developer productivity.

So what could go wrong?

Node.js has a number of weak points and vulnerabilities that can leave Node-based systems prone to poor performance or even crashes. This is especially true when a Node.js-based web application experiences rapid traffic growth.

In addition, Node.js is a great tool for creating and running the application logic that produces the core, variable content of your web pages. But it is not as strong at serving static content, such as images and JavaScript files, or at balancing load across multiple servers.

To use Node.js most effectively, you need to cache static content, proxy and balance load across multiple application servers, and manage port contention between clients, Node.js, and helpers such as a Socket.IO server. NGINX can be used for all of these tasks, making it a great tool for Node.js performance optimization.

Use these techniques to improve Node.js application performance:

  1. Implement a reverse proxy server
  2. Cache static files
  3. Implement Node.js load balancing
  4. Proxy WebSocket connections
  5. Implement SSL/TLS and HTTP/2

Note: A quick fix for Node.js application performance is to modify your Node.js configuration to take full advantage of modern multicore servers. You can also read another article on how to have Node.js spawn separate child processes.
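As a rough illustration of that note (not the configuration from the referenced article), Node's built-in cluster module can fork one worker process per CPU core. The file name and port below are placeholders:

// cluster-example.js: a minimal sketch of running one Node.js worker per CPU core
var cluster = require('cluster');
var http = require('http');
var os = require('os');

if (cluster.isMaster) {
  // Master process: fork one worker for each CPU core
  os.cpus().forEach(function () {
    cluster.fork();
  });
  // Restart a worker if it exits
  cluster.on('exit', function (worker) {
    console.log('Worker ' + worker.process.pid + ' exited; starting a replacement');
    cluster.fork();
  });
} else {
  // Each worker runs its own HTTP server; they share port 3000
  http.createServer(function (req, res) {
    res.end('Handled by worker ' + process.pid + '\n');
  }).listen(3000);
}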

1. Implement a reverse proxy server

At NGINX, Inc., we always get a little apprehensive when we see application servers exposed directly to incoming Internet traffic at the core of high-performance websites. This includes many WordPress-based sites, as well as Node.js sites.

Node.js, designed for scalability, is easier to scale than most application servers, and its web server side can handle a lot of traffic reasonably well. But web serving is not the raison d'être of Node.js; it wasn't built for that purpose.

If you have a high-traffic website, the first step toward improving application performance is to put a reverse proxy server in front of your Node.js server. This shields the Node.js server from direct exposure to Internet traffic and gives you the flexibility to use multiple application servers, to balance load across them, and to cache content.

[Figure: NGINX as a reverse proxy server in front of a Node.js application server]

Placing NGINX as a reverse proxy server in front of an existing server setup is a core NGINX use case that has been implemented by tens of millions of websites around the world.

There are some specific advantages to using NGINX as a reverse proxy server for Node.js, including:

  • Simplified privilege and port allocation
  • More efficient serving of static images (see tip 2)
  • Graceful handling of Node.js crashes
  • Mitigation of DoS attacks

Note: These tutorials show you how to use NGINX as a reverse proxy server in Ubuntu 14.04 or CentOS environments, and they give a useful overview of putting NGINX in front of Node.js.
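To make tip 1 concrete, here is a minimal sketch of NGINX acting as a reverse proxy in front of a single Node.js server. The domain name, port, and upstream address are placeholders, not values from the tutorials mentioned above:

server {
    listen 80;
    server_name example.com;                              # placeholder domain

    location / {
        proxy_pass http://127.0.0.1:3000;                 # the Node.js application server
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}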

2. Cache static files

As traffic to a Node.js-based site grows, the servers start to feel the strain. At that point, there are two things you want to do:

  1. Get the most out of the Node.js server.
  2. Make it easy to add application servers and balance load across them.

This is actually very easy to do if you implemented NGINX as a reverse proxy server from the start, as described in tip 1. That makes it easy to add caching, load balancing (if you have multiple Node.js servers), and more.

Modulus, an application container platform, has a very useful article on boosting Node.js application performance with NGINX. With Node.js doing all the work on its own, the author's site could serve an average of about 900 requests per second. With NGINX as a reverse proxy server serving static content, the same site served more than 1600 requests per second: a nearly two-fold improvement in performance.

The performance boost gives you time to take additional steps to keep up with traffic growth, such as reviewing (or improving) your website design, optimizing your application code, and deploying more application servers.

Here is the NGINX configuration used for the Modulus example site:

server {
    listen 80;
    server_name static-test-47242.onmodulus.net;
    root /mnt/app;
    index index.html index.htm;

    location /static/ {
        try_files $uri $uri/ =404;
    }

    location /api/ {
        proxy_pass http://node-test-45750.onmodulus.net;
    }
}

There may be content in an NGINX location block that you don't want to cache; for example, you wouldn't normally want to cache the admin interface of a blogging platform. Here is configuration code that excludes the Ghost admin interface from caching:

location ~ ^/(?:ghost|signout) {
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header Host $http_host;
    proxy_pass http://ghost_upstream;
    add_header Cache-Control "no-cache, private, no-store, must-revalidate, max-stale=0, post-check=0, pre-check=0";
}

Caching static files on the NGINX server can significantly reduce the load on the Node.js application server, allowing it to achieve better performance.
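If the static files live on the Node.js server rather than on the NGINX host, NGINX can also cache the proxied responses itself. The sketch below illustrates that approach; it is not the Modulus configuration, and the cache path, the STATIC zone name, and the upstream address are placeholders:

# These directives belong in the http {} context of nginx.conf
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=STATIC:10m inactive=60m;

server {
    listen 80;

    location /static/ {
        proxy_cache STATIC;                   # cache zone defined by proxy_cache_path above
        proxy_cache_valid 200 30m;            # keep successful responses for 30 minutes
        proxy_cache_use_stale error timeout;  # serve stale entries if the Node.js server is unreachable
        proxy_pass http://127.0.0.1:3000;     # the Node.js application server
    }
}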


3. Implement Node.js load balancing

The key to high performance for a Node.js application is to run multiple application servers and balance load across all of them.

Node.js load balancing can be particularly tricky because Node.js allows a high level of interaction between JavaScript code running in a web browser and JavaScript code running on the Node.js application server, with JSON objects as the medium for data exchange. This implies that a given client session runs continuously on a particular application server, and session persistence is inherently difficult to achieve with multiple application servers.

One of the main advantages of the Internet and the web is their high degree of statelessness, including the ability for any server with access to a requested file to satisfy a client request. Node.js subverts this statelessness and works best in a stateful environment, where the same server consistently responds to requests from any particular client.

This need is best met through NGINX Plus, rather than open source NGINX software. The two versions of NGINX are similar, but one major difference is their support for load-balancing algorithms.

Open source NGINX supports stateless load balancing with three methods:

  • Round robin. A new request goes to the next server in the list.
  • Least connections. A new request goes to the server with the fewest active connections.
  • IP hash. A new request goes to the server assigned by a hash of the client's IP address.

Only one of these methods, IP hash, reliably sends a given client's requests to the same server, which benefits Node.js applications. However, IP hash can easily cause one server to receive a disproportionate share of requests at the expense of other servers, as described in this blog post about load-balancing techniques. The statefulness this method supports comes at the cost of potentially suboptimal allocation of requests across server resources.
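As a sketch of what this looks like in open source NGINX (the server addresses are placeholders), the upstream block below uses IP hash; replacing the ip_hash line with least_conn would switch to the stateless least connections method:

upstream node_app {
    ip_hash;                          # route a given client IP to the same Node.js server
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
}

server {
    listen 80;

    location / {
        proxy_pass http://node_app;
    }
}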

NGINX Plus, unlike open source NGINX, also supports session persistence. With session persistence in use, the same server reliably receives all requests from a given client. The advantages of Node.js, with its stateful communication between client and server, and of NGINX Plus, with its advanced load-balancing capabilities, are both maximized.

So, you can use either NGINX or NGINX Plus to balance load across multiple Node.js servers. Only NGINX Plus, however, lets you maximize load-balancing performance while retaining the Node.js-friendly statefulness. The application health checks and monitoring capabilities built into NGINX Plus are also useful here.

NGINX Plus also supports session draining, which allows an application server to gracefully complete in-flight sessions after a request to take the server out of service.
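For comparison, session persistence in NGINX Plus is configured with the sticky directive. A minimal sketch (the cookie name and server addresses are placeholders, and this requires a commercial NGINX Plus subscription) looks like this:

upstream node_app {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    sticky cookie srv_id expires=1h path=/;   # NGINX Plus only: pin each client to one server via a cookie
}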

4. Proxy WebSocket connections

HTTP, in all of its versions, is designed for "pull" communication, in which the client requests files from the server. WebSocket is a tool that enables "push" and "push/pull" communication, in which the server can proactively send files that the client has not requested.

The WebSocket protocol makes it easier to support a more robust interaction between client and server, while reducing the amount of data transferred and minimizing wait times. When required, a full-duplex transport connection can be implemented, which means that both the client and the server can initiate and receive requests as needed.

The WebSocket protocol has a JavaScript interface, so it melds naturally with Node.js as an application server, and, for web applications with modest transaction volumes, as a web server as well. As transaction volume grows, it makes sense to insert NGINX or NGINX Plus between clients and the Node.js web server, and between multiple application servers.

Node.js is often used in conjunction with Socket.IO, a WebSocket API that is popular in Node.js applications. This can cause port 80 (for HTTP) or port 443 (for HTTPS) to become quite crowded, and the solution is to proxy to the Socket.IO server. You can use NGINX as the proxy server, as described above, and also gain other functionality such as static file caching, load balancing, and more.

[Figure: NGINX proxying WebSocket (Socket.IO) traffic between clients and Node.js]

Here is the server.js code for a Node application that listens on port 5000. It acts as a proxy server (rather than a web server) and routes requests to the proper port:

var io = require('socket.io').listen(5000);

io.sockets.on('connection', function (socket) {
  socket.on('set nickname', function (name) {
    socket.set('nickname', name, function () {
      socket.emit('ready');
    });
  });

  socket.on('msg', function () {
    socket.get('nickname', function (err, name) {
      console.log('Chat message by ', name);
    });
  });
});

On the client side, the corresponding initialization is:

var socket = io(); // your initialization code here
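To place NGINX in front of a Socket.IO server, the proxied connection has to be upgraded to WebSocket. The following is a minimal sketch, not the configuration from the blog post referenced below; the upstream name io_nodes and the ports are placeholders:

# These directives belong in the http {} context of nginx.conf
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

upstream io_nodes {
    server 127.0.0.1:5000;                        # the Socket.IO server shown above
}

server {
    listen 80;

    location / {
        proxy_http_version 1.1;                   # WebSocket requires HTTP/1.1
        proxy_set_header Upgrade $http_upgrade;   # pass the Upgrade header through
        proxy_set_header Connection $connection_upgrade;
        proxy_set_header Host $host;
        proxy_pass http://io_nodes;
    }
}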

For a thorough introduction, including the NGINX configuration, see this blog post. For more insight into potential architecture and infrastructure choices for web applications, see this blog post.

5. Implement SSL/TLS and HTTP/2

More and more sites are using SSL/TLS to protect all user interactions on the site. You can decide if and when to make this move, but if you choose to do so, NGINX supports this transition in two ways:

  1. If you set up NGINX as a reverse proxy, you can terminate SSL/TLS connections from clients in NGINX; the NGINX reverse proxy server and the Node.js application server then exchange unencrypted requests and content with each other.
  2. Early indications are that the performance penalty of using SSL/TLS can be largely or completely offset by adopting HTTP/2, the new version of the HTTP protocol. NGINX supports HTTP/2, and you can terminate HTTP/2 along with SSL/TLS without making any changes to the Node.js application server.

As you take these implementation steps, you also need to update URLs in your Node.js application files, establish and optimize secure connections in your NGINX configuration, and use SPDY or HTTP/2 as desired. Adding HTTP/2 support means that browser versions that support HTTP/2 communicate with your application using the new protocol, while older browsers continue to use HTTP/1.x.

[Figure: NGINX terminating SSL/TLS and HTTP/2 (SPDY) in front of Node.js]

The following configuration code is for a Ghost blog that uses SPDY, and it includes advanced features such as OCSP stapling. For details on using NGINX for SSL termination, including the OCSP stapling option, see here. For an overview of the same topic, see here.

Only minor changes to this configuration are needed to move from SPDY to HTTP/2, either now or when SPDY support goes away in early 2016.

server {
    server_name domain.com;
    listen 443 ssl spdy;
    spdy_headers_comp 6;
    spdy_keepalive_timeout 300;
    keepalive_timeout 300;
    ssl_certificate_key /etc/nginx/ssl/domain.key;
    ssl_certificate /etc/nginx/ssl/domain.crt;
    ssl_session_cache shared:SSL:10m;
    ssl_session_timeout 24h;
    ssl_buffer_size 1400;
    ssl_stapling on;
    ssl_stapling_verify on;
    ssl_trusted_certificate /etc/nginx/ssl/trust.crt;
    resolver 8.8.8.8 8.8.4.4 valid=300s;
    add_header Strict-Transport-Security 'max-age=31536000; includeSubDomains';
    add_header X-Cache $upstream_cache_status;

    location / {
        proxy_cache STATIC;
        proxy_cache_valid 200 30m;
        proxy_cache_valid 404 1m;
        proxy_pass http://ghost_upstream;
        proxy_ignore_headers X-Accel-Expires Expires Cache-Control;
        proxy_ignore_headers Set-Cookie;
        proxy_hide_header Set-Cookie;
        proxy_hide_header X-powered-by;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
        proxy_set_header Host $http_host;
        expires 10m;
    }

    location /content/images {
        alias /path/to/ghost/content/images;
        access_log off;
        expires max;
    }

    location /assets {
        alias /path/to/ghost/themes/uno-master/assets;
        access_log off;
        expires max;
    }

    location /public {
        alias /path/to/ghost/built/public;
        access_log off;
        expires max;
    }

    location /ghost/scripts {
        alias /path/to/ghost/core/built/scripts;
        access_log off;
        expires max;
    }

    location ~ ^/(?:ghost|signout) {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Host $http_host;
        proxy_pass http://ghost_upstream;
        add_header Cache-Control "no-cache, private, no-store, must-revalidate, max-stale=0, post-check=0, pre-check=0";
        proxy_set_header X-Forwarded-Proto https;
    }
}
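As noted above, only a minor change is needed to move this kind of configuration from SPDY to HTTP/2 on NGINX versions with native HTTP/2 support (1.9.5 and later). The sketch below shows the idea; the certificate paths, domain, and upstream port are placeholders, not values from the Ghost configuration above:

server {
    listen 443 ssl http2;                         # terminate TLS and speak HTTP/2 to capable browsers
    server_name example.com;

    ssl_certificate     /etc/nginx/ssl/example.crt;
    ssl_certificate_key /etc/nginx/ssl/example.key;

    location / {
        proxy_pass http://127.0.0.1:3000;         # unencrypted HTTP to the Node.js application server
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}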

Conclusion

This article describes some of the most important performance improvements you can make to a Node.js application. It focuses on adding NGINX alongside Node.js: using NGINX as a reverse proxy server, caching static files, load balancing, proxying WebSocket connections, and terminating the SSL/TLS and HTTP/2 protocols.

The combination of NGINX and Node.js is widely seen as a way to create new microservices-based applications or to add flexibility and performance to existing SOA-based applications that use Java or Microsoft .NET. This article will help you optimize your Node.js application and make the partnership between Node.js and NGINX work for you.
