Background

With the client and admin back-end front-end projects complete, the web application's front end was fully built. What remained was a back-end service to provide APIs to those front-end applications. After about a month of development it is basically complete, and after deployment it is now running online.

Technology stack

This project is a lightweight Node.js server application with Koa + MongoDB at its core. The interfaces are designed in a RESTful style.

The main middleware used are koa-compress, koa-parameter, koa-connect-history-api-fallback, koa-static, and koa-mount. See the official Koa repositories for specific usage.

Database operations: mongoose

Interface authentication: jsonwebtoken

User password encryption: bcryptjs

Upload storage: koa-multer

Route distribution: koa-router

Request body parsing: koa-bodyparser

API documentation: konglingwen94.github.io/elm-seller-…

The development process

Database design

This project uses MongoDB for data storage because it has natural advantages for front-end developers and is easy to get started with. As I am still at an early stage in server-side work, my experience in database design is limited, so what follows is shared for reference only.

MongoDB stores data in BSON, which converts readily to and from the JSON objects that front-end JavaScript works with. This lowers the difficulty for beginners designing collection (table) fields: we can design them around the data the front-end pages need to display. Using Mongoose as the database modeling layer, we describe the fields of a MongoDB collection in code as a schema; once Mongoose compiles the schema into a model, documents can be stored in the actual database collection.

Take this project's food list as an example. Here is the declared Mongoose schema:

```javascript
{
  name: String,
  price: Number,
  oldPrice: Number,
  description: String,
  sellCount: Number,
  rating: Number,
  info: String,
  menuID: ObjectId,
  image: String,
  online: { type: Boolean, default: true },
}
```

After compilation through Mongoose's model method, the fields stored in the database look like this:

Database storage fields

| Field | Type | Description |
| ----- | ---- | ----------- |
| menuID | ObjectId | Category ID |
| name | String | Product title |
| info | String | Product information |
| description | String | Product description |
| image | String | Product cover image |
| online | Boolean | Whether published |
| oldPrice | Number | Original price |
| price | Number | Current price |
| sellCount | Number | Units sold |

View the full model file here

Interface construction

From receiving a request to completing the response, a complete API endpoint passes through several stages of server-side logic: exposing the interface address, verifying interface permissions, validating request parameters, querying the database, and returning the response. To follow a layered design for the server's business logic, each processing stage can be split into its own module, and the related modules are then assembled into the complete project. This modular design greatly improves the maintainability and readability of the project. Here is what it looks like as a directory structure:

```
├── model          // database models
│   ├── administrator.js
│   ├── seller.js
│   ├── rating.js
│   ├── category.js
│   └── food.js
├── helper         // auxiliary files and scripts
│   ├── middleware.js
│   └── utils.js
├── controller     // controllers
│   ├── administrator.js
│   ├── seller.js
│   ├── rating.js
│   ├── category.js
│   └── food.js
├── config         // configuration
└── router         // route distribution
```

The model folder stores the database table models; which fields each table stores can be seen at a glance in this directory. The helper directory holds auxiliary project files and scripts. Its middleware.js file contains all of the project's middleware: following the layering principle, I moved some of the server-side interface handling into middleware, chiefly interface permission verification and request parameter validation. The controller directory stores the business logic of the interfaces, which we also call controllers; querying the database and returning the response happens in this module. The router directory is where all of the project's interfaces are distributed: a given controller can be wired to one or more route addresses, so controllers are reused without writing repetitive business code.

Permission verification and login (including registration)

Permission verification is essential for any back-end project serving multiple users. This project checks the Authorization request header to determine the permissions of each request. To simplify handling, I pulled this logic out into a middleware, which makes the authentication of each interface easy to manage and read. Permission verification uses the third-party package jsonwebtoken to generate the signed token. When a user logs in, the server generates a token and returns it to the front end, which stores it according to its running environment; each subsequent request that needs it carries the token back to the server. The server then returns different authentication results according to the configured verification rules. That is the overall flow of interface permission verification in this project.

Let's walk through the business logic using the user login interface from the controller:

```javascript
// Only the business logic code is shown here
async login(ctx) {
  const { username, password } = ctx.request.body;

  let result = await AdministratorModel.findOne({ username });

  // Create a new user if there is no result
  if (!result) {
    // Encrypt the password
    const hashPass = await bcrypt.hash(password, 10);

    const newUser = await AdministratorModel.create({ password: hashPass, username });

    const token = jwt.sign({ username, role: newUser.role, level: newUser.level }, secretKey, {
      expiresIn,
    });

    return (ctx.body = { admin: omit(newUser.toObject(), ["password"]), token });
  }

  if (!bcrypt.compareSync(password, result.password)) {
    ctx.status = 400;
    return (ctx.body = { message: "Password error" });
  }

  const user = result.toObject();
  const token = jwt.sign(user, secretKey, { expiresIn });

  ctx.body = { admin: omit(user, ["password"]), token };
},
```

To support the admin account being created on first login, the login interface also includes the business logic of user registration. After parameter parsing and validation (handled by middleware in other modules), the valid parameters passed from the front end are obtained by destructuring, and different service logic runs depending on the database query result. Once the created user information is obtained, a token is generated with the jsonwebtoken signature; this token is the only authentication credential other interfaces use to verify the user's login status.

Once a token has been generated through the login interface, we can authenticate any other interface that needs access control. Below is the middleware that determines the user's login status by verifying whether the token is valid:

```javascript
// The process of importing other modules is omitted here
module.exports = {
  adminRequired() {
    return async (ctx, next) => {
      let token = ctx.headers["authorization"];

      if (!token) {
        ctx.status = 400;
        return (ctx.body = { message: "No token passed" });
      }

      token = token.split(" ")[1];

      try {
        var decodeToken = jwt.verify(token, secretKey, { expiresIn });
      } catch (error) {
        ctx.status = 403;
        if (error.name === "TokenExpiredError") {
          return (ctx.body = { message: "Expired token" });
        }
        return (ctx.body = { message: "Invalid token" });
      }

      ctx.state.adminInfo = decodeToken;
      await next();
    };
  },
};
```

When a request enters this middleware, the token is extracted from the Authorization request header. Once the token value is obtained, it is checked with the verification function provided by jsonwebtoken, and different status codes and error messages are returned depending on the result; the specific error types can be found in the package's repository, so I won't detail them here. When verification passes, the token's signed payload is decoded. If business logic elsewhere behind the authenticated interface needs this information, we can mount it on the per-request namespace Koa provides (ctx.state), which makes it easy for local logic to access.

Note: the token scheme must follow the convention agreed between the front-end and back-end developers. This project uses the Bearer ${token} format for the Authorization request header.
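That convention can be sketched as a small helper (the function name is illustrative, not from the project): the header is split on a space, and the token is only accepted when the agreed Bearer scheme is present.

```javascript
// Parse an Authorization header of the agreed form "Bearer <token>"
function extractBearerToken(header) {
  if (!header) return null;
  const [scheme, token] = header.split(" ");
  if (scheme !== "Bearer" || !token) return null;
  return token;
}

console.log(extractBearerToken("Bearer abc.def.ghi")); // abc.def.ghi
console.log(extractBearerToken("abc.def.ghi")); // null
```

If the front end ever sends the bare token without the scheme, the split yields no second part, which is why agreeing on the format up front matters.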

To conform to Koa's middleware export design, the middleware file exports closures: what is actually applied to an interface is the function the closure returns. The benefit of this design is that we can pass arguments when the middleware factory is called, and the actual middleware can branch its logic based on those external arguments. Used uniformly in the routing table, it looks like this:

```javascript
// Some code is omitted
const Router = require("koa-router");

const router = new Router({ prefix: "/api" });
const middleware = require("../helper/middleware");

router.post("/admin/foods", middleware.adminRequired(), FoodController.createOne);
```
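The factory pattern can be exercised without Koa at all, since a Koa middleware is just an async function of (ctx, next). Here is a minimal sketch with a hand-built ctx object; the role-checking factory is an invented example of the kind of parameter such a closure can accept, not the project's actual middleware:

```javascript
// A middleware factory: calling it returns the actual middleware closure
function requireRole(role) {
  return async (ctx, next) => {
    if (ctx.state.adminInfo.role !== role) {
      ctx.status = 403;
      ctx.body = { message: "Forbidden" };
      return;
    }
    await next();
  };
}

// Simulate a request with a fake ctx and a next() that marks the handler as reached
async function simulate(role) {
  const ctx = { state: { adminInfo: { role } }, status: 200, body: null };
  let reachedHandler = false;
  await requireRole("admin")(ctx, async () => {
    reachedHandler = true;
  });
  return { ctx, reachedHandler };
}

simulate("admin").then((r) => console.log(r.reachedHandler)); // true
simulate("guest").then((r) => console.log(r.ctx.status)); // 403
```

Because the factory closes over its arguments, each route can get a differently configured copy of the same middleware without duplicating code.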

Database paging query function

For most front-end projects, paged data is a very common requirement; on the server side, the corresponding logic is a filtered database query. Mongoose's filter query APIs make this easy to accomplish, but a problem appears once we use it in many places. A front-end request path looks something like /api/foods?page=1&size=20, and we need to further validate and parse the query string before applying it as database query parameters. Since many interfaces need this and doing it inline is cumbersome, it is better to split the query parameter parsing into its own module, which is easier to use and maintain. Here is the wrapped code:

```javascript
// helper/utils.js -- `defaults` merges option objects left to right (from lodash; import omitted in the original)
const { defaults } = require("lodash");

function resolvePagination(pagination = {}) {
  const defaultOptions = { page: 1, size: 10 };

  pagination.page = parseInt(pagination.page, 10);
  pagination.size = parseInt(pagination.size, 10);

  if (Number.isNaN(pagination.page) || pagination.page <= 0) {
    pagination.page = defaultOptions.page;
  }
  if (Number.isNaN(pagination.size) || pagination.size <= 0) {
    pagination.size = defaultOptions.size;
  }

  const { page, size } = pagination;
  return { page, size };
}

function resolveFilterOptions(filter = {}) {
  // Sort by creation time, newest first, unless the caller overrides it
  const sort = defaults({}, filter.sort, { createdAt: -1 });

  const { page, size } = resolvePagination({
    page: filter.page,
    size: filter.size,
  });

  return {
    limit: size,
    skip: (page - 1) * size,
    sort,
  };
}

module.exports = { resolvePagination, resolveFilterOptions };
```

First, the resolvePagination function parses valid paging parameters from the query; the resolveFilterOptions function then turns them into query options in the form Mongoose's data filtering operations expect. Imported as a module, they are applied to an actual database query like this:

```javascript
// The code snippet is from the project's controller directory (model imports omitted)
const { resolveFilterOptions, resolvePagination } = require("../helper/utils");

module.exports = {
  async queryListByOpts(ctx) {
    const { page, size } = resolvePagination({ page: ctx.query.page, size: ctx.query.size });

    const { skip, limit, sort } = resolveFilterOptions({ page, size });

    const total = await FoodModel.countDocuments();

    const results = await FoodModel.find().populate("category").sort(sort).skip(skip).limit(limit);

    ctx.body = {
      data: results,
      total,
      pagination: { page, size },
    };
  },
};
```

The resolvePagination function is responsible for parsing valid paging options, and resolveFilterOptions translates them into the parameter format of a concrete Mongoose query. By separating this generic logic from the business code, we strengthen the modular structure of the code, increase code reuse, and improve the project's development efficiency.
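To make the skip/limit arithmetic concrete, here is a self-contained restatement of the parsing logic (collapsed into one illustrative function so the snippet runs on its own):

```javascript
// Self-contained restatement of the paging math, for illustration only
function toFilterOptions(query) {
  let page = parseInt(query.page, 10);
  let size = parseInt(query.size, 10);
  if (Number.isNaN(page) || page <= 0) page = 1;
  if (Number.isNaN(size) || size <= 0) size = 10;
  // Skip past the previous pages, then take one page of results
  return { skip: (page - 1) * size, limit: size };
}

console.log(toFilterOptions({ page: "3", size: "20" })); // { skip: 40, limit: 20 }
console.log(toFilterOptions({ page: "0", size: "abc" })); // { skip: 0, limit: 10 }
```

The second call shows why the validation matters: malformed or out-of-range query strings fall back to the defaults instead of producing a negative skip.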

Deployment and operation of applications

This project uses GitHub Actions continuous integration to deploy automatically to a cloud server. With the CI service, manually building, testing, and releasing the project is no longer needed, which reduces the risk of manual mistakes. The configuration file is as follows:

```yaml
name: Deploy files
on: [push]
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: copy file via ssh key
        uses: appleboy/scp-action@master
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USERNAME }}
          key: ${{ secrets.SERVER_SSH_KEY }}
          port: ${{ secrets.SERVER_PORT }}
          source: "*"
          target: "/var/www/elm-seller-server"

      - name: executing remote ssh commands using ssh key
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USERNAME }}
          key: ${{ secrets.SERVER_SSH_KEY }}
          port: "22"
          script: |
            cd /var/www/elm-seller-server
            npm install
            npm start
```

As the configuration file shows, the environment variables used for deployment are encrypted; for example, ${{ secrets.SERVER_HOST }} is an environment variable whose real value must be configured under the Secrets option in the GitHub repository's Settings panel. A GitHub Actions deployment is triggered whenever the locally git-managed repository is pushed to the remote. You can also place multiple configuration files ending in .yml under the workflows folder, one per Actions task. In this project I used two continuous integration tasks, because the project's documentation also needs to be updated and published in time. The GitHub Actions marketplace provides common integration task templates to choose from.

I chose pm2 to manage the application; the configuration file for application startup is here. pm2 is a Node application management tool with which we can easily view, restart, delete, stop, and start applications.
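The project's actual startup configuration is linked above; purely as a sketch of what a pm2 ecosystem file generally looks like (the app name, entry script, and options here are assumptions, not the project's file):

```javascript
// Illustrative ecosystem.config.js; started with `pm2 start ecosystem.config.js`
const config = {
  apps: [
    {
      name: "elm-seller-server", // process name shown by `pm2 list`; assumed here
      script: "./app.js",        // entry file; assumed here
      instances: 1,
      env: {
        NODE_ENV: "production",
      },
    },
  ],
};

// A real ecosystem file would end with: module.exports = config;
```

Keeping the options in a file rather than on the command line makes restarts reproducible, which fits the automated deployment above.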

API documentation

Writing documentation is an indispensable part of a back-end project. It lets you review the project from design through development, spot problems early, and fix bugs in time. The project documents are README files written in Markdown syntax, all under the project's docs directory. They are previewed and published using VuePress as the build tool; see the official VuePress documentation for details.

Documentation address: konglingwen94.github.io/elm-seller-…

Development tools and environments

vscode mac node mongodb git github postman ssh

Summary

From planning the project requirements and designing the database tables, through separating the concerns of the API logic, to finally deploying it successfully and finishing the documentation, I have now been through the complete development process of a server-side project, and I share the experience accumulated along the way here.

As a developer, hitting difficulties during a project is entirely normal, especially when debugging and facing all kinds of dazzling error messages, and the server-side Node environment is not as convenient to debug as a browser client. Don't be afraid of errors: work step by step to identify the cause. If an error message isn't informative, use third-party tooling to debug. In this project I used nodemon, which hot-reloads the application and can also be started with the debug flag to open a debugging panel like the browser developer tools: the console panel shows the error thrown by the application, and the Source panel shows the stack of the failing code. With these tools, a little patience, and some thought, the problem can eventually be solved.

Support

Thank you all for your likes and follows. If you are interested in this project, feel free to talk with me in the comments below!

If you have good suggestions or find bugs in this project, please open an issue in the project repository. Stars and follows are also welcome. Thank you!

Repository address: github.com/konglingwen…

Documentation address: konglingwen94.github.io/elm-seller-…