The emergence of Node delighted front-end JavaScript engineers and let them stride toward the back end, even if they are sometimes questioned or misunderstood. But what does that matter?

This article is the code companion to "one station to the end — front-end basic network", but it also stands alone as a from-zero Node learning note. First you need to install the Node environment; any tutorial will do for that. This article takes the same approach as the functional programming article: use it first, and fill in the theory later if the opportunity arises. In truth there is little theory to fill in for the API itself; if there is time we can write about Node's asynchronous queue and DC algorithm, and anything you don't understand along the way you can look up in the documentation. As usual, here is what this article covers.

The code is on Github

  • Set up a TCP server with Node
  • Set up an HTTP server with Node
  • Use Node's file module (fs) to read files, and write them in stream mode
  • Use the url module to implement Node routing
  • Use the path module to determine the file type
  • Compress files with gzip
  • Implement the browser cache protocol
  • Handle cross-origin requests with Node
  • Set up an HTTPS Node server
  • Set up an HTTP2 Node server

1 Node Creates a TCP Server

const net = require('net');

let server = net.createServer((socket) => {
   socket.on('data', function (res) {
      console.log(res.toString());
   });
});

server.listen({
   host: 'localhost',
   port: 8080
});
  • The first thing you need to know is that Node is built around modules: you load them with require
  • net is Node's TCP networking API; we use it first to create a TCP server
  • After requiring the net module, we create a server with the createServer method
  • The 'data' event fires when data is received, and the received data is passed to the callback as a parameter
  • The data is a binary Buffer, so we call toString() to convert it to a string
  • We then have the TCP server listen on localhost:8080
  • We run node tcp1.js in the terminal and the server starts
  • Now we visit localhost:8080 in the browser
  • When the server receives data, the 'data' event fires
  • We see the request headers in the terminal
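A quick sketch of what the 'data' handler does with the Buffer it receives (the request bytes here are a made-up sample, not captured from a real browser):

```javascript
// The 'data' event hands us a Buffer; toString() decodes it (UTF-8 by default).
const chunk = Buffer.from('GET / HTTP/1.1\r\nHost: localhost:8080\r\n\r\n');

const text = chunk.toString();
const firstLine = text.split('\r\n')[0]; // the request line
console.log(firstLine);
```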

Here's a look at Node's event mechanism:

// The events module provides one class: events.EventEmitter.
// EventEmitter encapsulates event emitting and event listening.
var EventEmitter = require('events').EventEmitter;
var socket = new EventEmitter();

// We bind the 'data' event to the socket object.
// If more than one listener is bound, they are called in sequence.
socket.on('data', function (res) {
    console.log(res);
});

socket.on('data', function (res) {
    console.log(res + '111');
});

// We fire the event with the emit method after one second.
// When emitting, we can pass parameters along.
setTimeout(function () {
    socket.emit('data', 'hello');
}, 1000);

We will see the following information on the console.

Now glance at the bottom left corner of the browser: it is still waiting for a response, because we haven't returned any data yet. So let's send some data back, and we know it must conform to the HTTP message format.

We call socket.write(responseDataTpl) to return data in HTTP format:

let responseDataTpl = `HTTP/1.1 200 OK
Connection: keep-alive
Date: ${new Date()}
Content-Length: 12
Content-Type: text/plain

Hello world!
`;
  • We run node 01-tcp02.js
  • In the browser, we see Hello world! returned

Question: since writing an HTTP response in this fixed format by hand is tedious, why not encapsulate it?
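Here is one way such an encapsulation could look: a hypothetical makeResponse helper (not from the reference code) that assembles the status line, headers, and body with CRLF separators, ready for socket.write():

```javascript
// Hypothetical helper: builds a minimal HTTP/1.1 response string.
// The layout follows the responseDataTpl template used above.
function makeResponse(body, contentType = 'text/plain') {
  const lines = [
    'HTTP/1.1 200 OK',
    'Connection: keep-alive',
    `Content-Type: ${contentType}`,
    `Content-Length: ${Buffer.byteLength(body)}`,
    '', // blank line separates headers from body
    body
  ];
  return lines.join('\r\n');
}

const res = makeResponse('Hello world!');
```

This is exactly what the http module does for us in the next section, so we won't need the helper for long.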

2 Node Creates an HTTP Server

2.1 Creating an HTTP Server

const http = require('http');

const server = http.createServer((req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('hello world'); // Send response data
});

server.on('clientError', (err, socket) => {
    socket.end('HTTP/1.1 400 Bad Request\r\n\r\n');
});

server.listen(10080);
  • We require Node's http module and listen on port 10080 (default address localhost)
  • We create an HTTP service that returns a 200 status code on success
  • res.end('hello world') sends the response data 'hello world'
  • We run node 02-http-simple.js in the terminal and the server starts
  • Now we visit localhost:10080 in the browser
  • And we see the browser display 'hello world'

Q: What if I want to return a file instead of a string?

2.2 Node File Module (FS)

Node's file module is very powerful: you can read, create, delete, update, and search files. We'll start with how to read. There are two kinds of reads: synchronous and asynchronous.

const fs = require('fs'); 
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });

  // Synchronous read
  // let data = fs.readFileSync('index.html');
  // res.write(data);
  // res.end(); // Send response data

  // Asynchronous read
  fs.readFile('index.html', function (err, data) {
    res.write(data);
    res.end(); // Send response data
  });
});

server.on('clientError', (err, socket) => {
    socket.end('HTTP/1.1 400 Bad Request\r\n\r\n');
});

server.listen(8088);
  • const fs = require('fs') imports the file module
  • In the synchronous version, we read first and then perform the write and send calls
  • In the asynchronous version, we perform write and send inside the asynchronous read callback

Problem: whether synchronous or asynchronous, we must read the whole file before writing it out, so for a very large file the memory pressure is heavy. Is there a way to read and write at the same time?

2.3 Node Streams

A stream is an abstract interface that lets you read and write a file chunk by chunk, so you don't have to hold it all in memory. Let's see how that works.

const fs = require('fs');
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  // let resStream = fs.createReadStream('index.html');
  // resStream.pipe(res);
  fs.createReadStream('index.html').pipe(res);
});

server.on('clientError', (err, socket) => {
    socket.end('HTTP/1.1 400 Bad Request\r\n\r\n');
});

server.listen(10080)
  • Create a readable stream with fs.createReadStream('index.html')
  • resStream.pipe(res) pipes the read into the response, writing the response body for us
  • Notice we didn't call res.end() in the code above; by default the destination is ended automatically when the data transfer completes
  • Finally, streams support chaining, so the whole thing fits in one line of code

Problem: with memory out of the way, you'll notice one of the images in our index.html fails to load. The reason is simple: no matter what request arrives, we always respond the same way. So how do we distinguish between different requests?

2.4 Node Routing

We know that application-layer protocols use URLs to indicate the location of files, so a key part of differentiating requests is differentiating paths. For path handling, Node provides a url module; let's take a look.

const fs = require('fs');
const http = require('http');
const url = require('url');

const server = http.createServer((req, res) => {
  // pathname is the part of the address after the port number
  let pathname = url.parse(req.url).pathname;
  if (pathname === '/') pathname = '/index.html';
  let resPath = '.' + pathname;

  // Check whether the path exists
  if (!fs.existsSync(resPath)) {
    res.writeHead(404, { 'Content-Type': 'text/html' });
    return res.end('<h1>404 Not Found</h1>');
  }

  // If it exists, return the file at that path to the page
  res.writeHead(200, { 'Content-Type': 'text/html' });
  fs.createReadStream(resPath).pipe(res);
});

server.on('clientError', (err, socket) => {
    socket.end('HTTP/1.1 400 Bad Request\r\n\r\n');
});

server.listen(10080) 
  • We require the url module to help us handle paths
  • url.parse(req.url) parses the path into an object containing the path properties we commonly use
  • One of those properties is pathname, the part of the URL between the port number and the query string, which is the path being accessed
  • If the site is visited with no path, we default to /index.html
  • For relative-path access we prefix it with a '.'
  • We then use the file module's existsSync method to check whether the file exists on the server
  • If not, we return 404 to say the file was not found; if it exists, we return the file

Question: now we can see our beautiful image in the browser, but we learned from HTTP that Content-Type describes the file type, and an image's type is not 'text/html'. The browser was smart enough to display it for us anyway, but we should still correct this mistake.

2.5 Determining the File Type with the path Module

We know we simply need to set 'Content-Type' to match the file type.

function getFileType(resPath) {
  const EXT_FILE_TYPES = {
    'default': 'text/html',
    '.js': 'text/javascript',
    '.css': 'text/css',
    '.json': 'application/json',
    '.jpeg': 'image/jpeg',
    '.jpg': 'image/jpeg',
    '.png': 'image/png'
    // ...
  };
  let path = require('path');
  let mime_type = EXT_FILE_TYPES[path.extname(resPath)] || EXT_FILE_TYPES['default'];
  return mime_type;
}
  • We define a getFileType function listing common file extensions and their Content-Type values
  • We use the path module's extname method to extract the extension
  • Then we look it up in the object we defined, falling back to a default value if it isn't found

Isn't it tedious to go back to the terminal every time you modify a Node file? Here's one more tip: hot restarting.

sudo npm install supervisor -g

supervisor 02-http-fs-url.js 
  • Install the Supervisor globally
  • Start the file with supervisor instead of Node
  • This way you don't have to restart the server by hand every time you modify a Node file

Problem: our picture is only a bit over 100 KB; if it were a really large image we could compress it before transmitting it.

2.6 Compressing Files with Gzip

(1) We first read the Accept-Encoding header from the request. If it is absent, we set it to ''.

let acceptEncoding = req.headers['accept-encoding'];
if (!acceptEncoding) {
  acceptEncoding = '';
}

(2) We then use a regular expression to check whether acceptEncoding allows gzip compression. Let's write one here.

if (/\bgzip\b/.test(acceptEncoding)) {
  // Perform compression, and tell the browser in the response header which format was used
} else {
  // Do not compress
}

(3) We need the zlib module to compress the file. Here we use gzip by calling createGzip, inserting the compression step into the pipe before writing to the response body.

const zlib = require('zlib');

let raw = fs.createReadStream(resPath);
raw.pipe(zlib.createGzip()).pipe(res);

(4) Finally, we must tell the browser in the response header which format the file was compressed into.

'Content-Encoding': 'gzip'

Then we start two servers in two terminals, one with gzip compression and one without:

  • http://localhost:8088/home.html
  • http://localhost:10080/home.html

In the home file I put a 5 MB photo I took with my camera at the Summer Palace.

You can open multiple browser windows, visit the two URLs, and test a few times; you'll find the gzip-compressed one is noticeably slower.

The reason is simple: the server and browser are on the same computer, so transfer is very fast and the compression and decompression time is magnified. This also tells us that not every scenario suits file compression.

  • If your browser's network panel doesn't show the Time column, you can enable it from the column header
  • You can turn on Disable cache while testing

2.7 Implementing the Browser Cache Protocol

In this section, which holds nothing new on the Node side, we'll implement HTTP's browser caching protocol. We also don't need compression, so nothing from the previous section's gzip code is included.

** (1) Strong caching ** We give resources a one-week expiry in the response header. Reference code: cache.js

Cache-Control: max-age=604800

  • We can see that on the second refresh, the resources are fetched from the browser cache
  • If you don't want them fetched from the cache, force-refresh or enable Disable cache
  • Cache-Control: no-cache
  • On a normal refresh the resource files still come from the cache; the Cache-Control: no-cache you see on them was carried over from the last forced refresh
  • If you open a new window and visit the same page, it is again fetched from the cache
  • This is why, during development, you sometimes change a JS file and see no effect, yet opening the page in another window shows the latest file
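For reference, the 604800 in the header is just the number of seconds in one week, and the freshness test a cache performs under max-age can be sketched as a tiny hypothetical helper (isFresh is not from the reference code):

```javascript
// Where 604800 comes from: seconds in one week.
const ONE_WEEK = 7 * 24 * 60 * 60;

// Hypothetical helper: is a cached response still fresh under max-age?
function isFresh(storedAtMs, nowMs, maxAgeSeconds) {
  return (nowMs - storedAtMs) / 1000 < maxAgeSeconds;
}

const now = Date.now();
```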

** (2) Weak caching ** Reference code: cache2.js

The ETag must be a double-quoted string, which we then write into the response header:

let etagStr = '"dajuan"'; // etag should be a double-quoted string
res.writeHead(200, {
  'Content-Type': getFileType(resPath),
  'etag': etagStr
});

On a revisit, we check whether the request's If-None-Match header matches the current etagStr. If it matches, we return 304 directly without sending the file again. When the browser sees 304, it knows to pull the file from its cache.

let etagStr = '"dajuan"'; // etag should be quoted
if (req.headers['if-none-match'] === etagStr) {
  res.writeHead(304, {
    'Content-Type': getFileType(resPath),
    'etag': etagStr
  });
  return res.end();
}
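The 304 decision above boils down to a comparison that we can isolate in a hypothetical pure function (cacheStatus is a made-up name for illustration):

```javascript
// Compare the request's If-None-Match header with the current ETag:
// a match means the browser's copy is still good, so 304; otherwise 200.
function cacheStatus(reqHeaders, etagStr) {
  return reqHeaders['if-none-match'] === etagStr ? 304 : 200;
}
```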

2.8 Node Handles POST and GET Requests

(1) We first write a form page that submits via GET and via POST respectively, each jumping to form_result.html, which contains the line hello, name

  //form.html
  <form action="form_result.html" method="get">
       <p> get: <input type="text" name="name" /></p>
       <input type="submit" value="Submit" />
  </form>
  <form action="form_result.html" method="post">
       <p> post: <input type="text" name="name" /></p>
       <input type="submit" value="Submit" />
  </form>

  //form_result.html
  <div>hello, name</div>

(2) Handling the GET method. Reference code: method.js

let pathAll = url.parse(req.url);
let getArgument = pathAll.query; // Retrieve the query string, e.g. name=XXX
if (pathname === '/form_result.html' && getArgument !== undefined) {
  let text = fs.readFileSync('form_result.html').toString().replace(/name/, getArgument);
  fs.writeFileSync('form_result.html', text);
}
  • We know url.parse() parses URLs, and its query property holds the GET parameters
  • When the target path is '/form_result.html' and getArgument has a value
  • We use the file module to synchronously read the contents of 'form_result.html'
  • After converting to a string, we replace name in the page with the name=XXX parameter

At this point a GET-submitted form is handled, but POST parameters do not travel in the URL, so this has no effect on POST.

(3) Handling the POST method. Reference code: method2.js

  req.on('data',(data)=>{
    let text = fs.readFileSync('form_result.html').toString().replace(/name/, 'post'+ data)
    fs.writeFileSync('form_result.html',text) 
  })
  • For POST we listen for the 'data' event on the request, which fires when the request message carries a body
  • So when the 'data' event fires, we perform the same replacement
  • A GET request carries no body, so this handler does not respond to it
  • From what we learned about events, we can bind multiple listeners to 'data', and every POST request will inevitably fire them; this is a side effect on the server

A question to leave you with: we are handling the files synchronously here; what happens if we switch to asynchronous handling?

2.9 Node Handles Cross-Origin Requests

Reference code: cors.js cors2.js

  if (req.headers['origin']) {
    res.writeHead(200, {
      'Access-Control-Allow-Origin': 'http://localhost:5000',
      'Content-Type': 'text/html'
    });
    return fs.createReadStream(resPath).pipe(res);
  }

  • We start two services locally
  • One on port 5000, the other on port 9088
  • We open cors.html on port 5000
  • In that HTML, we make an Ajax request for data.json on port 9088
  • That is a cross-origin request; because we allow port 5000 access, the data is returned
  • If we leave the header out, or don't list port 5000, you'll see that we don't get the data
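The whitelist check can be generalized into a small hypothetical helper (the original code hard-codes a single origin; the WHITELIST array and corsHeader name here are assumptions for illustration):

```javascript
// Given the request's Origin header, return the CORS header to set,
// or null to refuse the cross-origin request.
const WHITELIST = ['http://localhost:5000'];

function corsHeader(origin) {
  return WHITELIST.includes(origin)
    ? { 'Access-Control-Allow-Origin': origin }
    : null;
}
```

Echoing back the matched origin rather than '*' keeps the whitelist meaningful while supporting several allowed origins.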

Note: there are still some minor issues. First, the error is only raised on my first visit; a mismatched port no longer triggers it afterwards. I wonder whether the browser whitelisted the server address. Second, why don't I see the two requests the book describes? Even when no data comes back the first time, no second request is made. Still, the cross-origin effect was achieved.

3 HTTPS and HTTP2

3.1 Setting up an HTTPS Node Server

Now that we know how this works, let's generate the certificate and private key in the terminal.

// (1) Generate the server private key
openssl genrsa -out server.key 1024
// (2) Generate the public key
openssl rsa -in server.key -pubout -out server.pem
// Act as the CA authority to issue a certificate to your own server.
// The CA also needs its own private key, CSR file (certificate signing request file), and certificate.
// (3)
openssl genrsa -out ca.key 1024                            // Generate the CA private key
openssl req -new -key ca.key -out ca.csr                   // Generate the CA CSR file
openssl x509 -req -in ca.csr -signkey ca.key -out ca.crt   // Generate the CA certificate
// (4) Generate the server's certificate signing request file
openssl req -new -key server.key -out server.csr
// Issue the server certificate with the CA
openssl x509 -req -CA ca.crt -CAkey ca.key -CAcreateserial -in server.csr -out server.crt

Note: fill in the information however you like, but mind the format shown in the prompts, babies…

const https = require('https');
const fs = require('fs');

const options = {
  key: fs.readFileSync('./key/server.key'),
  cert: fs.readFileSync('./key/server.crt')
};

https.createServer(options, (req, res) => {
  res.writeHead(200);
  res.end('hello world\n');
}).listen(8000);

  • We require the https module and fill in our certificate and private key
  • The rest of the code should look familiar by now

Server access: https://localhost:8000/

  • Now we can request the page over HTTPS
  • Of course it tells us it's not secure; just proceed anyway
  • Why does it warn us it's unsafe? We filled in the certificate fields ourselves, so who knows what we typed. Ha ha ha

3.2 Setting up an HTTP2 Node Server

Node's http2 module is an experimental API. If your Node version is old, upgrade it first; mine is v8.11.3.

const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('./key/server.key'),
  cert: fs.readFileSync('./key/server.crt')
});

server.on('error', (err) => console.error(err));

server.on('stream', (stream, headers) => {
  // stream is a Duplex
  stream.respond({
    'content-type': 'text/html',
    ':status': 200
  });
  stream.end('<h1>Hello World</h1>');
});

server.listen(8443);
  • We again use the private key and certificate created for HTTPS
  • We create the http2 service
  • HTTP2 is built around streams, so we write the response headers and return the response body through the stream
  • We visit https://localhost:8443/ in the browser

Now we have simple HTTP2 access working.