Today, I will learn about Web application development, how to debug during the development process, and how to deploy online
Web Application Development
The HTTP module
We can build the simplest HTTP service using the Node.js built-in http module
const http = require("http");
http
  .createServer((req, res) => {
    res.end("Hello YK!!!");
  })
  .listen(3000, () => {
    console.log("App running at http://127.0.0.1:3000/");
  });
Console output
Web access (i.e. send a GET request)
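For example, assuming the server above is running locally on port 3000, a quick check from the command line:

curl http://127.0.0.1:3000/
# Hello YK!!!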
Koa
Frameworks such as Express and Koa are used in real development, and today I am going to talk about Koa
Introduction
Koa – The next generation Web development framework based on the Node.js platform
Install: npm i koa
Koa does not bundle any middleware into its core; it simply provides a lightweight, elegant suite of methods that makes writing web applications easy
const Koa = require("koa");
const app = new Koa();
app.use(async (ctx) => {
  ctx.body = "Hello YK!!!! by Koa";
});
app.listen(3000, () => {
  console.log("App running at http://127.0.0.1:3000/");
});
Koa execution process
- Service start
  - Instantiate application
  - Register middleware
  - Create the server and listen on a port
- Accept/process the request
  - Get the req and res objects
  - req -> request, res -> response
  - Encapsulate request & response -> context
  - Execute the middleware
  - Set the output content on ctx.body
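To make the flow concrete, here is a minimal sketch (my own illustration, not the Koa source) showing that app.listen is roughly shorthand for creating an http server with app.callback(), which builds the context and runs the middleware chain for each request:

const http = require("http");
const Koa = require("koa");
const app = new Koa();

app.use(async (ctx) => {
  ctx.body = "Hello YK!!!";
});

// app.listen(3000) is roughly equivalent to:
http.createServer(app.callback()).listen(3000);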
The Koa source code is quite approachable: it consists of only four files (application.js, context.js, request.js, response.js), so it is worth reading
Middleware
A Koa application is an object containing a set of middleware functions that are organized and executed according to the Onion model
Middleware execution order: output 1, 2, 3, 2, 1
const Koa = require("koa");
const app = new Koa();
app.use(async (ctx, next) => {
  console.log("1-1");
  await next();
  console.log("1-2");
});
app.use(async (ctx, next) => {
  console.log("2-1");
  await next();
  console.log("2-2");
});
app.use(async (ctx, next) => {
  console.log("****3****");
  ctx.body = "Sequential demonstration of middleware";
});
app.listen(3000, () => {
  console.log("App running at http://127.0.0.1:3000/");
});
Output 1, 2, 3, 2, 1
Let’s briefly implement middleware
A simple middleware implementation
const fn1 = async (ctx, next) => {
  console.log("before fn1");
  ctx.name = "YKjun";
  await next();
  console.log("after fn1");
};
const fn2 = async (ctx, next) => {
  console.log("before fn2");
  ctx.age = 18;
  await next();
  console.log("after fn2");
};
const fn3 = async (ctx, next) => {
  console.log(ctx);
  console.log("in fn3...");
};
const compose = (middlewares, ctx) => {
  const dispatch = (i) => {
    const fn = middlewares[i];
    if (!fn) return Promise.resolve(); // past the last middleware, stop
    return Promise.resolve(
      fn(ctx, () => {
        return dispatch(i + 1);
      })
    );
  };
  return dispatch(0);
};
compose([fn1, fn2, fn3], {});
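Running this sketch should log before fn1, before fn2, the ctx object ({ name: 'YKjun', age: 18 }), in fn3..., after fn2, after fn1, which is exactly the onion-model order.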
Based on the middleware principle, how can we measure the execution time of a handler?
const Koa = require("koa");
const app = new Koa();

// Logger middleware
app.use(async (ctx, next) => {
  await next();
  const rt = ctx.response.get("X-Response-Time");
  if (ctx.url !== "/favicon.ico") {
    console.log(`${ctx.method} - ${ctx.url} - ${rt}`);
  }
});

// X-Response-Time middleware
app.use(async (ctx, next) => {
  const start = Date.now();
  await next();
  const ms = Date.now() - start;
  ctx.set("X-Response-Time", `${ms}ms`);
});

app.use(async (ctx) => {
  let sum = 0;
  for (let i = 0; i < 1e9; i++) {
    sum += i;
  }
  ctx.body = `sum=${sum}`;
});

app.listen(3000, () => {
  console.log("App running at http://127.0.0.1:3000/");
});
You can see that execution flows from top to bottom and then back from bottom to top
Common middleware
- koa-router: route parsing
- koa-body: request body parsing
- koa-logger: logging
- koa-views: template rendering
- koa2-cors: cross-origin (CORS) handling
- koa-session: session handling
- koa-helmet: security protection
Koa middleware varies widely in quality, so it needs to be chosen carefully and combined efficiently.
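As a quick illustration, a minimal sketch of wiring up koa-router (the routes and handlers are my own example, not from the original article):

const Koa = require("koa");
const Router = require("koa-router");

const app = new Koa();
const router = new Router();

// GET /users/:id returns the path parameter
router.get("/users/:id", async (ctx) => {
  ctx.body = { id: ctx.params.id };
});

// Mount the routing middleware
app.use(router.routes()).use(router.allowedMethods());

app.listen(3000);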
Node.js frameworks built on Koa
Open source: ThinkJS/Egg…
Internal: Turbo, Era, Gulu…
What do they provide?
- Extensions to Koa's Response/Request/Context/Application objects
- A library of common Koa middleware
- Support for company-internal services
- Process management
- Scaffolding
- .
Debugging
Breakpoint debugging
Use Node's inspect flag to enable breakpoint debugging of a given file, then attach Chrome DevTools via chrome://inspect
node --inspect=0.0.0.0:9229 bootstrap.js
You can also install the ndb package
npm install ndb -g
ndb node bootstrap.js
Debugging code in VS Code (my preferred approach)
Create a launch configuration file (.vscode/launch.json) before using it
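A minimal sketch of such a configuration, assuming the entry file is bootstrap.js:

{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Debug bootstrap.js",
      "program": "${workspaceFolder}/bootstrap.js"
    }
  ]
}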
Then set breakpoints and debug
Log debugging
- Report logs via an SDK
- toutiao.fe.app.2021-07-28.log
Online deployment
Node.js maintains the single-threaded nature of JavaScript in browsers (one thread per process)
Node.js is a single-threaded model, but its event-driven, asynchronous, non-blocking mode can be applied to high concurrency scenarios, while avoiding the resource overhead caused by thread creation and context switching between threads.
Disadvantages:
- Unable to utilize multi-core CPUs
- An error can cause the entire application to exit, so robustness is poor
- Heavy computation occupies the CPU and blocks everything else
Utilizing multi-core CPUs
const http = require("http");
http
  .createServer((req, res) => {
    for (let i = 0; i < 1e7; i++) {} // simulate CPU-bound work on a single core
    res.end(`handled by process ${process.pid}`);
  })
  .listen(8080);
Implement the simplest HTTP Server
This service runs on a single CPU core
How can we make use of multiple CPU cores?
Node.js provides the cluster and child_process modules
const cluster = require("cluster");
const os = require("os");

if (cluster.isMaster) {
  const cpuLen = os.cpus().length;
  console.log("cpus=", cpuLen);
  for (let i = 0; i < cpuLen; i++) {
    cluster.fork();
  }
} else {
  // Each worker runs the single-core HTTP server from the example above
  require("./single-core-http.js");
}
Performance comparison
ab -c 200 -t 10 http://localhost:8080/ (test for 10 seconds with 200 concurrent connections)
Single process 1828
Multiple processes 6938
Multi-process robustness
A single process easily exits when something goes wrong, so let's look at the robustness of multiple processes. The cluster module emits the following events:
- fork: triggered after a worker process has been forked.
- online: after being forked, the worker sends an online message to the master; the master triggers this event when it receives the message.
- listening: after the worker calls listen(), it sends a listening message to the master; the master triggers this event when it receives the message.
- disconnect: triggered when the IPC channel between the master process and a worker is disconnected.
- exit: triggered when a worker process exits.
- setup: triggered when cluster.setupMaster() is called.
const http = require("http");
const numCPUs = require("os").cpus().length;
const cluster = require("cluster");

if (cluster.isMaster) {
  console.log("Master process id is", process.pid);
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  cluster.on("exit", function (worker, code, signal) {
    console.log("worker process died, id", worker.process.pid);
    cluster.fork(); // Fork a new process immediately after one exits
  });
} else {
  const server = http.createServer();
  server.on("request", (req, res) => {
    res.writeHead(200);
    console.log(process.pid);
    res.end("hello world\n");
  });
  server.listen(8080);
}
Process daemons
We don't have to manage processes by hand every time; we can use a management tool to implement a process daemon.
Node.js process management tools provide:
- Multiple processes
- Automatic restart
- Load balancing
- Log viewing
- Performance monitoring
PM2 is used for process management and monitoring
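As a rough sketch of typical PM2 usage (the entry file name is just an example):

npm install pm2 -g
pm2 start bootstrap.js -i max --name my-app   # cluster mode, one worker per CPU core
pm2 list                                      # view running processes
pm2 logs my-app                               # view logs
pm2 monit                                     # CPU / memory monitoring
pm2 restart my-app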
Complex calculations
What happens when Node.js runs a complex, CPU-heavy computation?
const http = require("http");

const complexCalc = () => {
  console.time("Calculation time");
  let sum = 0;
  for (let i = 0; i < 1e10; i++) {
    sum += i;
  }
  console.timeEnd("Calculation time");
  return sum;
};

const server = http.createServer();
server.on("request", (req, res) => {
  if (req.url === "/compute") {
    const sum = complexCalc();
    res.end(`sum is ${sum} \n`);
  } else {
    res.end("success");
  }
});

server.listen(8000);
For example, let’s do a loop here to simulate a complex calculation
The calculation takes more than ten seconds
A request to 127.0.0.1:8000/ping will not return "success" until the complex calculation finishes
A complex computation occupies the CPU for a long time, and simple tasks have to wait for it to complete before they can run
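You can reproduce the blocking from the command line (paths as in the code above; /ping is just an example of a non-compute URL):

curl http://127.0.0.1:8000/compute &   # takes 10+ seconds and pegs one CPU core
curl http://127.0.0.1:8000/ping        # "success" only appears after the computation ends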
Consider using multiple processes: run the complex calculation in a child process so that it does not occupy the main process, then send the result back to the parent process to obtain it
Multiple processes and inter-process communication
The complex-computation child process
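A minimal sketch of what this child process might look like (the filename compute.js and the message format are my own assumptions):

// compute.js: runs in the child process
process.on("message", (msg) => {
  if (msg === "start") {
    let sum = 0;
    for (let i = 0; i < 1e10; i++) {
      sum += i;
    }
    process.send(sum); // send the result back to the parent process
  }
});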
The main process
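And a corresponding sketch of the main process, which forks the child, forwards the request, and stays free to answer other requests (again an illustration, not the article's original code):

const http = require("http");
const { fork } = require("child_process");

const server = http.createServer((req, res) => {
  if (req.url === "/compute") {
    const worker = fork("./compute.js"); // hypothetical child process file
    worker.send("start");
    worker.on("message", (sum) => {
      res.end(`sum is ${sum} \n`);
      worker.kill();
    });
  } else {
    res.end("success"); // answered immediately, even during a computation
  }
});

server.listen(8000);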
Front-end versus back-end development
Finally, let’s compare the focus of front-end and back-end development
Front-end development
- Dealing with browsers, compatibility issues
- UI specification and componentization
- Loading speed, execution performance, rendering performance
- Crash monitoring and white screen monitoring
- Security vulnerabilities such as XSS and CSRF
Server-side development
- Database, Redis and other storage services
- Server management and disaster recovery
- Performance and memory leaks
- Service monitoring, error monitoring, traffic monitoring, alarm
- SQL injection, directory traversal and other security vulnerabilities