The Front-End Early Chat conference is co-hosted with Juejin. Add Codingdreamer to the conference tech group and get a head start at the new starting line.
Session 29 | Front-end data visualization: a high-intensity, one-shot look at how the front end plays with visualization. Live all day from 7:00 to 17:00, 9 speakers and 9 hours of knowledge (Alibaba Cloud / Ant / Qiwu / Xiaomi, etc.). Registration 👉:
All past sessions are fully recorded; an annual ticket unlocks them all at once.
The main text follows.
If you don't want to build your own database and back-office editing and management features, can you take a shortcut by using Yuque as a cloud database?
For the front end, the biggest charm of Node.js (hereafter Node) is arguably that it can start an HTTP service and serve a website, which helps front-end engineers build plenty of fun projects. Let's look at how to serve:
Starting a service with Node
The two most essential objects in Node's networking are the request and the response (req and res). The web services we provide all boil down to doing various kinds of processing on top of req and res, and native Node provides this capability. With native Node, however, we have to do all the dirty work ourselves, whether that is determining the request type, parsing the URL, or returning the right status code. The Node frameworks on the market are likewise built as encapsulations over req and res:
const http = require('http')
const server = http.createServer((req, res) => {
  // Perform various business processes based on the req request type and parameters
  res.statusCode = 200
  res.setHeader('Content-Type', 'text/plain')
  res.end('Hello ZaoZaoLiao')
})
server.listen(3000, '127.0.0.1', () => {})
Starting a service with Express
The veteran Express framework (now much lighter than it was a few years ago) does a lot of this for you. Request types and routing, for example, no longer need hand-rolling: you can pick it up and immediately return different content based on the path the user visits.
mkdir iexpress && cd iexpress && npm init --yes && npm i express -S
const express = require('express')
const app = express()
app.get('/', (req, res) => {
  res.send('early')
})
app.get('/hi', (req, res) => {
  res.send('early chat')
})
app.listen(3001)
Starting a service with Koa
Express is great, but it is too heavy, and its early callback-based design left teams stuck in callback hell for years. Koa is smaller and more elegant and supports async/await (although the generators of Koa 1 were ugly). It keeps only the purest parts: context handling, stream handling, cookie handling, and so on.
Of course, the most attractive thing about Koa is its onion model: a request goes in layer by layer and comes back out layer by layer. If the onion's core is the business we want to handle, then each layer of skin can be seen as a piece of peripheral processing, and the skins a request passes through on the way in and out are Koa's middleware. In theory we can scale an application out to 30 or 40 middleware handling security, caching, logging, page rendering... and let the application grow as fat as needed. But middleware should be added and removed according to actual needs, not piled on: more middleware means more uncertainty (especially third-party middleware) and a performance cost, since code quality across the community is uneven, the whole may become hard to control, and every extra middleware is another execution step.
In any case, Koa gives us much more fine-grained control over requests on the way in and out, which brings developers many benefits:
mkdir ikoa && cd ikoa && npm init --yes && npm i koa -S
const Koa = require('koa')
const app = new Koa()
const indent = (n) => new Array(n).join(' ')
const mid1 = () => async (ctx, next) => {
  ctx.body = '<h3>Request => enters layer 1 middleware</h3>'
  await next()
  ctx.body += '<h3>Response <= exits layer 1 middleware</h3>'
}
const mid2 = () => async (ctx, next) => {
  ctx.body += `<h2>${indent(4)}Request => enters layer 2 middleware</h2>`
  await next()
  ctx.body += `<h2>${indent(4)}Response <= exits layer 2 middleware</h2>`
}
app.use(mid1())
app.use(mid2())
app.use(async (ctx, next) => {
  ctx.body += `<h1>${indent(12)}:: Handle core business ::</h1>`
})
app.listen(2333)
Starting a service with Egg
Koa is small and beautiful and can integrate plenty of middleware, but a complex enterprise-grade application needs stricter constraints, both in the design of the functional model (embodied in the directory structure) and in how capabilities are integrated into the framework itself (embodied in how modules are written, the interfaces they expose to each other, and how they are invoked). It needs conventions plus convenient extension points, and this is where Egg comes in. Egg practices "convention over configuration", unifying application development around agreed conventions. Beyond further abstracting and reworking service/controller/loader/context and friends, it also provides a powerful plugin capability. As the official documentation puts it, a plugin can contain:
- Extend: extends the context of base objects, providing various utility methods and properties.
- Middleware: adds one or more middleware providing pre- and post-processing logic for requests.
- Config: the plugin's default configuration items for each environment.
A well-scoped, domain-specific plugin can deliver great functionality with highly maintainable code, and since plugins ship sensible default configuration for each environment, we can usually use them with hardly any configuration changes.
mkdir iegg && cd iegg && npm init egg --type=simple && npm i && npm run dev
// app/controller/home.js
const Controller = require('egg').Controller

class HomeController extends Controller {
  async index() {
    const { ctx } = this
    ctx.body = 'hi, egg'
  }
}
module.exports = HomeController

// app/router.js
module.exports = app => {
  const { router, controller } = app
  router.get('/', controller.home.index)
}

// config/config.default.js
module.exports = appInfo => {
  const config = (exports = {})
  config.keys = appInfo.name + '_1598512467216_9757'
  config.middleware = []
  const userConfig = {
    // myAppName: 'egg',
  }
  return {
    ...config,
    ...userConfig,
  }
}

// config/plugin.js
module.exports = {
  // static: {
  //   enable: true,
  // }
}
For more background you can read the official "Egg and Koa" documentation; it is quite good.
Start a simple blog service locally
Egg is a wrapper over Koa, and more business-specific enterprise frameworks can in turn be built on top of Egg. Let's bring the focus back to Koa: combined with the Yuque open API, we can build a local blog service with Koa. No local database is installed; the data comes from Yuque, and Pug can serve as the template engine. The directory can be laid out like this:
.
├── README.md
├── app
│   ├── controllers          # controllers
│   │   ├── article.js       # article page logic
│   │   └── home.js          # home/about/joinus/contact pages
│   ├── router
│   │   └── routes.js        # route table
│   ├── tasks                # jobs that talk to third-party services
│   │   └── yuque.js         # Yuque logic: fetch doc list, fetch doc detail, save docs
│   └── views                # pages
│       ├── includes
│       ├── layout.pug
│       └── pages
├── config                   # service configuration
│   └── config.js
└── public                   # static assets (e.g. pc-banner images)
    └── pages                # locally cached article HTML
These modules can be installed:
- axios: a promise-based HTTP client for browsers and Node.js
- koa: a web framework for Node.js
- koa-static: static file serving middleware for Koa
- koa-router: routing middleware for Koa
- koa-views: template rendering middleware for Koa
- moment: a JavaScript date handling library
Fetching the Yuque data can be handled like this:
const fs = require('fs')
const { resolve } = require('path')
const axios = require('axios')

// Get the configuration
const config = require('../../config/config')
const { repoId, api, token } = config.yuque

// Save the article locally
const saveYuque = (article, html) => {
  // Check whether the pages directory exists.
  // If it does not (first run of the service), create it; otherwise writes
  // below would fail and the local cache would never become available.
  // If the path exists, save the blog post directly under it.
  const path = __dirname.substring(0, __dirname.length - 9) + 'public/pages'
  if (!fs.existsSync(path)) {
    fs.mkdirSync(path)
  }
  const file = resolve(__dirname, `../../public/pages/${article.id}.html`)
  if (!fs.existsSync(file)) {
    fs.writeFile(file, html, err => {
      if (err) console.log(err)
      console.log(`${article.title} has been written locally`)
    })
  }
}

// Wrap a unified request
const _request = async (pathname) => {
  const url = api + pathname
  return axios.get(url, {
    headers: { 'X-Auth-Token': token }
  }).then(res => {
    return res.data.data
  }).catch(err => {
    console.log(err)
  })
}

// Get all docs under the repoId specified in the config file
const getDocList = async () => {
  try {
    const res = await _request(`/repos/${repoId}/docs`)
    return res
  } catch (err) {
    console.log('Failed to get list of articles:', err)
    return []
  }
}

// Get the content of a given doc under the repoId in the config file
const getDocDetail = async (docId) => {
  try {
    const res = await _request(`/repos/${repoId}/docs/${docId}?raw=1`)
    return res
  } catch (err) {
    console.log('Failed to get article content:', err)
    return {}
  }
}

module.exports = {
  // getYuqueUser,
  getDocDetail,
  getDocList,
  saveYuque
}
The router can register several blog pages:
// Pages
const Home = require('../controllers/home')
const Article = require('../controllers/article')

module.exports = router => {
  // The site's home page
  // router.get(url, controller)
  router.get('/', Home.homePage)
  router.get('/about', Home.about)
  router.get('/joinus', Home.joinus)
  router.get('/contact', Home.contact)
  router.get('/article/:_id', Article.detail)
}
Several of these pages are handled by the home controller:
// Method for getting all docs under the repoId specified in the config file
const { getDocList } = require('../tasks/yuque')
const { teamName } = require('../../config/config')

// Each controller returns a page to the client for a given path
exports.homePage = async ctx => {
  const articles = await getDocList()
  // render(pug template, variables the template needs)
  ctx.body = await ctx.render('pages/index', {
    title: 'home',
    teamName,
    articles
  })
}
exports.about = async ctx => {
  ctx.body = await ctx.render('pages/about', {
    teamName
  })
}
exports.joinus = async ctx => {
  ctx.body = await ctx.render('pages/joinus', {
    teamName
  })
}
exports.contact = async ctx => {
  ctx.body = await ctx.render('pages/contact', {
    teamName
  })
}
The article controller can be handled like this:
const fs = require('fs')
const { resolve } = require('path')
const { getDocDetail, saveYuque } = require('../tasks/yuque')
const config = require('../../config/config')
const { root } = config

const streamEnd = fd => new Promise((resolve, reject) => {
  fd.on('end', () => resolve())
  fd.on('finish', () => resolve())
  fd.on('error', reject)
})

// View the article details
exports.detail = async ctx => {
  const _id = ctx.params._id
  const fileName = resolve(root, `${_id}.html`)
  const fileExists = fs.existsSync(fileName)
  // Check whether the resource has been cached locally
  if (fileExists) {
    console.log('Hit article cache, returning directly')
    // Pipe the file stream into Koa's res, taking over returning the stream
    ctx.type = 'text/html; charset=utf-8'
    ctx.status = 200
    const rs = fs.createReadStream(fileName).pipe(ctx.res)
    await streamEnd(rs)
  } else {
    console.log('Missed article cache, pulling again')
    // Not cached yet, so fetch it from the Yuque API
    const article = await getDocDetail(_id)
    const body = article.body_html.replace('\n', ' ')
    // Render the freshly fetched article data on the server
    const html = await ctx.render('pages/detail', {
      body,
      article,
      siteTitle: article.title
    })
    // Write a copy to the local file cache
    saveYuque(article, html)
    ctx.body = html
  }
}
The flow is simple, but if you bring this to an interview and the interviewer asks how you handle caching, this form certainly will not pass. Many boundary conditions and risk points need to be considered: resource checks, permissions, validity, type and security checks, traffic decisions, and so on. The caching part is often a focus of the examination, so pay extra attention to it; the pseudo-code below is only an introduction:
// Use If-Modified-Since or ETag to determine cache validity
const fStat = fs.statSync(filePath)
const modified = req.headers['if-modified-since']
const expectedModified = new Date(fStat.mtime).toGMTString()
if (modified && modified === expectedModified) {
  res.statusCode = 304
  res.setHeader('Content-Type', mimeType[ext])
  res.setHeader('Cache-Control', 'max-age=3600')
  res.setHeader('Last-Modified', new Date(expectedModified).toGMTString())
  return
}
// Set the response headers
res.statusCode = 200
res.setHeader('Content-Type', mimeType[ext])
res.setHeader('Cache-Control', 'max-age=3600')
res.setHeader('Content-Encoding', 'gzip')
res.setHeader('Last-Modified', new Date(expectedModified).toGMTString())
// gzip-compress the file and pipe it back
const stream = fs.createReadStream(filePath, {
  flags: 'r'
})
stream.on('error', () => {
  res.writeHead(404)
  res.end()
})
stream.pipe(zlib.createGzip()).pipe(res)
Front-End Early Chat publishes learning articles for beginners from time to time; feel free to follow this account to keep up with what's new.
Final effect (screenshot)