MongoDB Atlas is a database in the cloud, which eliminates the need to build and maintain a database yourself; through the web UI it provides, you can spin up a cluster within 5 minutes. Node.js is a JavaScript runtime in which functions are first-class citizens, and with it we can quickly build a service. Serverless means “serverless architecture”; from a technical point of view it is a combination of FaaS and BaaS, where FaaS (Function as a Service) is a platform for running functions.

So what can we do with all of this? In this article we will combine Serverless, MongoDB Atlas Cloud, and Node.js to quickly build a REST API. Whether you are a front-end or back-end engineer, you can follow along as long as you know some basic JavaScript syntax.

What can you learn from this article?

MongoDB Atlas cloud

What is MongoDB Atlas Cloud?

Let’s get this out of the way: MongoDB Atlas Cloud is a database that runs in the cloud. There is nothing to install or configure and no Mongo service running on our own machine; all we need is a URL to access the database, and it offers a nice UI that is easy to use. Most importantly for beginners, it is free to use up to a 512 MB limit, which is sufficient for small projects.

Create cluster

Now follow my lead and quickly create a MongoDB cluster in 5 minutes.

  1. Registration: www.mongodb.com/cloud/atlas…
  2. The following page is displayed. Click the Build a Cluster button to create a cluster

Connecting to the cluster

How do you get the cluster connection string once the cluster is created? Follow these three steps to complete the process.

  1. To connect to the cluster, add your IP address to the whitelist and create a MongoDB user. After completing these two steps, select Choose a Connection Method to go to the next step

  2. Select the second option, “Connect Your Application”

  3. For the driver version, use the default Node.js 3.0 or later. Copy the connection string; it will be used in the project below

Connect to the database from a Serverless Function

What is Serverless?

Serverless means “serverless architecture”, but this does not mean that servers are no longer needed; rather, the management of those servers is handled by the cloud computing platform. Users do not need to care about server configuration, monitoring, or resource status, and can focus on the business logic.

In the figure below, Microservices are further subdivided into Functions as a Service (FaaS), which is even more fine-grained than Microservices.

Photo credit: Stackify

For a primer on Serverless, see my other hands-on article on quickly getting started with Serverless Functions using Node.js: Getting Started Practice Guide

1. Project creation and plug-in installation

Create a project and install the mongodb driver and the serverless-offline plugin.

$ serverless create --template hello-world --path mongodb-serverless-conn-test
$ npm init
$ npm i mongodb -S 
$ npm i serverless-offline --save-dev

2. Create the db.js file in the root directory of the project

The database connection string is the one copied from MongoDB Atlas Cloud above; replace <user> and <password> with your own username and password. In the following code, the initialize function takes two arguments, dbName and dbCollectionName, and initializes a connection.

// db.js
const MongoClient = require("mongodb").MongoClient;
const dbConnectionUrl = 'mongodb+srv://<user>:<password>@cluster0-on1ek.mongodb.net/test?retryWrites=true&w=majority';

async function initialize(dbName, dbCollectionName) {
    try {
        const dbInstance = await MongoClient.connect(dbConnectionUrl);
        const dbObject = dbInstance.db(dbName);
        const dbCollection = dbObject.collection(dbCollectionName);

        console.log("[MongoDB connection] SUCCESS");
        return dbCollection;
    } catch (err) {
        console.log(`[MongoDB connection] ERROR: ${err}`);

        throw err;
    }
}

module.exports = {
    initialize,
};

3. Modify handler.js

We want to test the MongoDB connection. Below is a Serverless Function in which we initialize a connection and call find() to query the collection data.

// handler.js
'use strict';
const db = require('./db');

module.exports.find = async (event, context) => {
  const response = {
    statusCode: 200,
  };

  try {
    const dbCollection = await db.initialize('study', 'books2');
    const body = await dbCollection.find().toArray();

    response.body = JSON.stringify({
      code: 0,
      message: 'SUCCESS',
      data: body,
    });

    return response;
  } catch (err) {
    response.body = JSON.stringify({
      code: err.code || 1000,
      message: err.message || 'Unknown error',
    });

    return response;
  }
};
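One note before testing: the hello-world template generates its own serverless.yml, which does not expose a find route. That file is not shown for this first demo, so the following is only a minimal sketch of what it could look like; the provider, runtime, and serverless-offline plugin entries are assumptions, mirroring the full configuration shown later in this article.

# serverless.yml (sketch for the first demo, assumed values)
service: mongodb-serverless-conn-test

provider:
  name: aws
  runtime: nodejs12.x

plugins:
  - serverless-offline

functions:
  find:
    handler: handler.find   # the find function exported from handler.js
    events:
      - http:
          path: find
          method: get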

4. Test

# Start local debugging
$ serverless offline

# Interface test
$ curl http://localhost:3000/find
Serverless: GET /find (λ: find)
[MongoDB connection] SUCCESS
[]

Everything seems to be OK, which proves that our cluster creation and connection were successful. However, sometimes you may encounter the following error:

Error: querySrv ENODATA _mongodb._tcp.cluster0-on1ek.mongodb.net

This is exactly the problem I encountered when connecting to MongoDB Atlas. I mention it here in the hope that it helps, because it took me a long time searching Google and Stack Overflow… From the error message I roughly confirmed that the problem might be the network or DNS. After changing the DNS there was still no result; later I switched networks and the problem was solved. If you have an answer, please feel free to discuss it with me. I also recommend checking whether the connection string and the MongoDB Atlas whitelist are set correctly.

Two problems

The approach in the example above is simple, but it is bad, and it causes two problems:

1. The business logic is tightly coupled to FaaS and BaaS, which is not conducive to unit testing and platform migration: the business logic is written entirely inside the find function of the handler.js file. On the one hand, the event and context objects of the find function are provided by the FaaS platform; on the other hand, db belongs to the back-end service. The result is that the business logic is tightly coupled to both FaaS and BaaS.

2. It is not conducive to context reuse: Serverless is event-driven. When the first request arrives, the platform downloads the code, starts the container, boots the runtime environment, and executes the code; this process is called a cold start. The execution context is then frozen for some time after the function call. In the example above, every function execution initializes the database connection, which is a time-consuming operation. We can move this logic out of the function and use context reuse for further optimization at the development level, as sketched below.
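To make the idea concrete, here is a minimal sketch of context reuse applied to the first demo's db.js. This is a hypothetical refactor, not the code used later in this article (the full refactor with Mongoose follows in the next section); the connection is cached at module scope so warm invocations reuse it instead of reconnecting:

// db.js (sketch): cache the connection at module scope
const MongoClient = require("mongodb").MongoClient;
const dbConnectionUrl = 'mongodb+srv://<user>:<password>@cluster0-on1ek.mongodb.net/test?retryWrites=true&w=majority';

let cachedDb = null; // survives between invocations while the container stays warm

async function initialize(dbName, dbCollectionName) {
    if (cachedDb === null) {
        // Only the cold start pays the connection cost
        const dbInstance = await MongoClient.connect(dbConnectionUrl);
        cachedDb = dbInstance.db(dbName);
        console.log("[MongoDB connection] SUCCESS (cold start)");
    }

    return cachedDb.collection(dbCollectionName);
}

module.exports = { initialize };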

Serverless REST API development best practices

With the points raised above in mind, this section refactors the business logic above into a set of REST API development best practices.

What is a REST API?

API design should follow the single-responsibility principle and be clear, reasonable, and easy for others to understand and use. REST is one such design principle; it is also an architectural style for transferring and interacting with resources between client and server.

In this section, the GET, POST, PUT, and DELETE verbs are used to fetch, create, update, and delete resources respectively.

For a deeper understanding of RESTful architecture, see Ruan Yifeng's blog post “Understanding RESTful Architecture”: www.ruanyifeng.com/blog/2011/0…

REST API programming

Here is the REST API plan we are going to implement, with four CRUD operations:

CRUD     API Route      Description
POST     /books         Add a book
GET      /books         Get the list of all books
PUT      /books/:id     Update the book with the specified id
DELETE   /books/:id     Delete the book with the specified id

Directory planning

A good project is inseparable from good directory planning; of course, you can also organize it your own way.

mongodb-serverless-node-rest-api
├── package.json
├── .env
├── serverless.yml
├── app
│   ├── handler.js
│   ├── controller
│   │   └── books.js
│   ├── model
│   │   ├── db.js
│   │   └── books.js
│   └── utils
│       └── message.js
└── test
    └── controller
        └── books.test.js

Project creation and plugin installation

This time, instead of using the MongoDB driver directly, I used Mongoose to operate on MongoDB.

$ serverless create --template hello-world --path mongodb-serverless-node-rest-api
$ npm init
$ npm i dotenv mongoose -S
$ npm i serverless-offline --save-dev

Create the .env configuration file

Put the configuration into a separate .env file for unified management.

DB_URL=mongodb+srv://admin:<password>@cluster0-on1ek.mongodb.net/test?retryWrites=true&w=majority
DB_NAME=study1
DB_BOOKS_COLLECTION=books

Create a Model

app/model/db.js

const mongoose = require('mongoose');
mongoose.connect(process.env.DB_URL, {
    dbName: process.env.DB_NAME,
});
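Optionally, you could also listen for connection events in db.js to make local debugging easier. This is not part of the original code, just a small sketch:

// Optional (sketch): surface the connection status in the logs
mongoose.connection.on('connected', () => console.log('[mongoose] connected'));
mongoose.connection.on('error', (err) => console.error(`[mongoose] connection error: ${err}`));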

app/model/books.js

Everything in Mongoose starts with a Schema. Each schema maps to a MongoDB collection and defines the shape of the documents within that collection.

const mongoose = require('mongoose');
const BooksSchema = new mongoose.Schema({
    name: String,
    id: { type: Number, index: true, unique: true },
    createdAt: { type: Date, default: Date.now },
});

module.exports = mongoose.models.Books || mongoose.model('Books', BooksSchema, process.env.DB_BOOKS_COLLECTION);

The following problem may occur when using Mongoose to create the model and the function is invoked several times while running under serverless-offline:

OverwriteModelError: Cannot overwrite `Books` model once compiled.

This error is caused by compiling the same model again after it has already been defined. The offending code looks like this:

module.exports = mongoose.model('Books', BooksSchema, process.env.DB_BOOKS_COLLECTION); 

One way to solve this problem is to make sure the model is only compiled once, as shown below. Another way is to add --skipCacheInvalidation when running serverless offline to skip require cache invalidation. For details, see serverless-offline/issues/258.

module.exports = mongoose.models.Books || mongoose.model('Books', BooksSchema, process.env.DB_BOOKS_COLLECTION);

Write the business logic: Books

The business logic lives in the Books class and does not depend directly on any external service; this.BooksModel can be replaced with mock data during testing. This separates the business logic from FaaS and BaaS.
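The controller below relies on a small app/utils/message.js helper to wrap responses. Its code is not listed in this article; based on how it is called (message.success(data) and message.error(code, message)) and on the response shape used in the first demo, a minimal sketch might look like the following. This is an assumption, not necessarily the repository's actual implementation:

// app/utils/message.js (sketch): unify the HTTP response format
exports.success = (data) => ({
    statusCode: 200,
    body: JSON.stringify({ code: 0, message: 'SUCCESS', data }),
});

exports.error = (code, message) => ({
    statusCode: 200,
    body: JSON.stringify({ code, message }),
});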

app/controller/books.js

const message = require('../utils/message');

class Books {
    constructor(BooksModel) {
        this.BooksModel = BooksModel;
    }

    /**
     * create
     * @param {*} event
     */
    async create(event) {
        const params = JSON.parse(event.body);

        try {
            const result = await this.BooksModel.create({
                name: params.name,
                id: params.id,
            });

            return message.success(result);
        } catch (err) {
            console.error(err);

            return message.error(err.code, err.message);
        }
    }

    /**
     * update
     * @param {*} event
     */
    async update(event) {
        try {
            const result = await this.BooksModel.findOneAndUpdate({ id: event.pathParameters.id }, {
                $set: JSON.parse(event.body),
            }, {
                upsert: true, // note: the option name is `upsert`, not `$upsert`
                new: true,
            });

            return message.success(result);
        } catch (err) {
            console.error(err);

            return message.error(err.code, err.message);
        }
    }

    /**
     * find
     */
    async find() {
        try {
            const result = await this.BooksModel.find();

            return message.success(result);
        } catch (err) {
            console.error(err);

            return message.error(err.code, err.message);
        }
    }

    /**
     * deleteOne
     * @param {*} event
     */
    async deleteOne(event) {
        try {
            const result = await this.BooksModel.deleteOne({
                id: event.pathParameters.id
            });

            if (result.deletedCount === 0) {
                return message.error(1010, 'Data not found! Probably deleted!');
            }

            return message.success(result);
        } catch (err) {
            console.error(err);

            return message.error(err.code, err.message);
        }
    }
}

module.exports = Books;
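Because BooksModel is injected through the constructor, the Books controller can be unit-tested without a real database or FaaS platform. The test/controller/books.test.js from the directory plan is not shown in this article; a rough sketch, assuming Jest and the message.js response shape sketched above, could be:

// test/controller/books.test.js (sketch, assuming Jest)
const Books = require('../../app/controller/books');

test('find returns the data provided by the injected model', async () => {
    // A minimal fake model: only the method used by Books#find is needed
    const fakeModel = {
        find: async () => [{ id: 1, name: 'fake book' }],
    };

    const books = new Books(fakeModel);
    const response = await books.find();
    const body = JSON.parse(response.body);

    expect(body.code).toBe(0);
    expect(body.data).toHaveLength(1);
});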

Write a handler

In the handler, the event and context parameters are provided by the FaaS platform, and the business logic lives in books.js under controller. The advantage of this is that if we want to migrate from one platform to another, we only need to change how Books is called in handler.js, and the business logic is unaffected.

Try to keep this kind of connection initialization outside the function body, so that such a time-consuming operation does not run on every invocation. By relying on execution context reuse, the database connection is initialized when the runtime starts up and executes the module code; this is why handler.js requires './model/db' at the top.

app/handler.js

require('dotenv').config();
require('./model/db');

const BooksModel = require('./model/books');
const BooksController = require('./controller/books');
const booksController = new BooksController(BooksModel);

module.exports = {
    create: event => booksController.create(event),

    update: event => booksController.update(event),

    find: () => booksController.find(),

    deleteOne: event => booksController.deleteOne(event),
};

Serverless configuration file

The serverless-offline plugin is used for local debugging, and the handler file paths and routing rules are defined under functions. Note that serverless.yml uses books/{id} as the path-parameter syntax for the routes.

service: mongodb-serverless-node-rest-api

provider:
  name: aws
  runtime: nodejs12.x

plugins:
  - serverless-offline

functions:
  create:
    handler: app/handler.create
    events:
      - http:
          path: books
          method: post
  update:
    handler: app/handler.update
    events:
      - http:
          path: books/{id}
          method: put
  find:
    handler: app/handler.find
    events:
      - http:
          path: books
          method: get

  deleteOne:
    handler: app/handler.deleteOne
    events:
      - http:
          path: books/{id}
          method: delete

Deployment

Deployment is the final step in completing our REST API, and it is as simple as running serverless deploy, which deploys our service to the cloud.

$ serverless deploy
Serverless: Packaging service...
Serverless: Excluding development dependencies...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Uploading service mongodb-serverless-node-rest-api.zip file to S3 (2.17 MB)...
Serverless: Stack update finished...
...
endpoints:
  POST - https://******.execute-api.us-east-1.amazonaws.com/dev/books
  PUT - https://******.execute-api.us-east-1.amazonaws.com/dev/books/{id}
  GET - https://******.execute-api.us-east-1.amazonaws.com/dev/books
  GET - https://******.execute-api.us-east-1.amazonaws.com/dev/books/{id}
  DELETE - https://******.execute-api.us-east-1.amazonaws.com/dev/books/{id}

The endpoints section lists the service's API addresses. Now you can debug them in Postman.
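If you prefer the command line to Postman, the endpoints can also be exercised with curl. The host below is the masked address from the deploy output (replace ****** with your own API Gateway ID), and the book data is just an illustrative example:

# Add a book
$ curl -X POST https://******.execute-api.us-east-1.amazonaws.com/dev/books \
    -H "Content-Type: application/json" \
    -d '{"id": 1, "name": "Node.js"}'

# Get the list of all books
$ curl https://******.execute-api.us-east-1.amazonaws.com/dev/books

# Update book 1
$ curl -X PUT https://******.execute-api.us-east-1.amazonaws.com/dev/books/1 \
    -H "Content-Type: application/json" \
    -d '{"name": "Node.js in Action"}'

# Delete book 1
$ curl -X DELETE https://******.execute-api.us-east-1.amazonaws.com/dev/books/1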

For the complete code, see the source repository:

https://github.com/Q-Angelo/project-training/tree/master/serverless/mongodb-serverless-node-rest-api

Conclusion

Serverless is a brand-new technology system that reduces the cost of server-side development. Node.js is lightweight and friendly to front-end developers, but front-end developers are relatively unfamiliar with server operation and maintenance. Using Serverless lets developers hand off a whole series of tasks such as server operation, maintenance, and environment setup, and spend more time on business development. In this article, MongoDB Atlas Cloud is used for data storage, eliminating the need to build and maintain the database ourselves. Now, as long as you master some basic JavaScript syntax, you can easily complete a REST API by following this article. How nice is that? Go ahead and try it!

About the author: May Jun, Node.js developer and MOOC certified author, a post-90s youth who loves technology and loves to share. Welcome to follow the official account “Nodejs Technology Stack” and the GitHub open source project www.nodejs.red