Preface

My company currently uses ProtoBuf for data exchange between the front end and the back end. A colleague wrote the base library long ago, and I had no idea how the underlying parsing worked; whenever something went wrong I could only ask for help. As a self-respecting programmer I ought to understand the underlying implementation, so I'm recording the learning process here.

Brief introduction to Protobuf

Google Protocol Buffer (Protobuf) is a portable, efficient structured data serialization format that is platform independent, language independent and extensible, and can be used for communication protocols and data storage.

There are several advantages:

  • 1. Platform independent, language independent, extensible;
  • 2. Provides a friendly dynamic library, easy to use;
  • 3. Fast parsing speed, about 20-100 times faster than the corresponding XML;
  • 4. Serialized data is very concise and compact. Compared with XML, the amount of serialized data is about 1/3 to 1/10.

For front-end/back-end data transfer it doesn't actually matter much whether you use JSON or Protobuf; on the front end a Protobuf message still has to be decoded into a plain object before it can be used. In my opinion, the real advantages are:

  • 1. Protobuf can be used directly in a project on both the front and back ends without defining the Model;
  • 2. Protobuf can be used directly as a document for front and back end data and interfaces, greatly reducing communication costs;

Before protobuf, interfaces and fields defined in the back-end language could not be used directly by the front end, so the two sides had to maintain a separate interface document; whenever a back-end field changed, the document had to be updated and the front end notified. With protobuf, the back end defines the proto files, and those files serve directly as the documentation. The front end only needs to copy the proto files into its project. If a back-end field changes, the back end just notifies the front end to update the proto files; since the back end consumes the same files, they are rarely missing fields or out of date. In the long run, this clearly improves team efficiency.

Enough rambling; let's get down to business. What I describe here is mainly usage in a Vue project, which is my company's practice. Take it as a reference.

Train of thought

The library protobuf.js is used in the front end to process proto files.

protobuf.js provides several ways to handle proto files.

  • Parse the .proto file directly at runtime, e.g. protobuf.load("awesome.proto", function(err, root) {... })
  • Convert it to JSON or JS first and then use that, e.g. protobuf.load("awesome.json", function(err, root) {... })
  • Others

As we all know, the dist directory produced by building a vue project contains only HTML, CSS, JS, images and other such assets; there are no .proto files. So we need to use protobuf.js to convert *.proto into *.js or *.json, and then use the methods the library provides to parse the data and finally obtain a data object.

PS: In practice, converting to a JS file turns out to be more convenient: the generated js file defines a number of methods directly on the prototype chain, which is very handy. So that is the form used later on to parse the proto.

Target

When you call an API, you only need to specify the request and response models and pass the request parameters; you don't need to care how the underlying proto is parsed. The API returns a Promise object:

// src/api/student.js  defines the interface file
import request from '@/lib/request'

// school.PBStudentListReq is the defined request body model
// school.PBStudentListRsp is the defined response body model
// getStudentList is the interface name
export function getStudentList (params) {
  const req = request.create('school.PBStudentListReq', params)
  return request('getStudentList', req, 'school.PBStudentListRsp')
}

// Called in HelloWorld.vue
import { getStudentList } from '@/api/student'
export default {
  name: 'HelloWorld',
  created () {

  },
  methods: {
    _getStudentList () {
      const req = {
        limit: 20,
        offset: 0
      }
      getStudentList(req).then((res) => {
        console.log(res)
      }).catch((res) => {
        console.error(res)
      })
    }
  }
}

Preparation

1. Get the defined proto files.

Although the syntax is simple, the front end generally doesn't write proto files; they are defined and maintained by the back end. Here you can just use the demo files I have defined.

// User.proto
syntax = "proto3";
package framework;

message PBUser {
    uint64 user_id = 1;
    string name = 2;
    string mobile = 3;
}

// Class.proto
syntax = "proto3";
package school;

message PBClass {
    uint64 classId = 1;
    string name = 2;
}

// Student.proto
syntax = "proto3";
package school;

import "User.proto";
import "Class.proto";

message PBStudent {
    uint64 studentId = 1;
    framework.PBUser user = 2;
    PBClass class = 3;
    PBStudentDegree degree = 4;
}

enum PBStudentDegree {
    PRIMARY = 0;   // primary school
    MIDDLE = 1;    // middle school
    SENIOR = 2;    // high school
    COLLEGE = 3;   // college
}

message PBStudentListReq {
    uint32 offset = 1;
    uint32 limit = 2;
}

message PBStudentListRsp {
    repeated PBStudent list = 1;
}



// MessageType.proto
syntax = "proto3";
package framework;

// Common request body
message PBMessageRequest {
    uint32 type = 1;          // message type
    bytes messageData = 2;    // request data
    uint64 timestamp = 3;     // client timestamp
    string version = 4;       // API version
    string token = 14;        // token returned by the server for login verification
}

// Common response body
message PBMessageResponse {
    uint32 type = 3;          // message type
    bytes messageData = 4;    // response data
    uint32 resultCode = 6;    // result code
    string resultInfo = 7;    // result message
}

// All interfaces
enum PBMessageType {
    // student related
    getStudentList = 0;       // get the list of all students, PBStudentListReq => PBStudentListRsp
}

In fact you don't need to study proto syntax to read these files. There are two namespaces, framework and school. PBStudent references PBUser and PBClass as field types, so you can think of a PBStudent as containing a PBUser and a PBClass.

There are also fields that every request and every response must carry. Here, PBMessageRequest in MessageType.proto defines the fields of the common request body, and PBMessageResponse defines the fields of the common response body.

PBMessageType is an enumeration of the interfaces: all the back-end interfaces are listed in it, with comments specifying the concrete request and response types. In this example there is only one interface, getStudentList: its request parameter is PBStudentListReq and its response is PBStudentListRsp.

So the proto files can be used directly as the interface documentation between the front end and the back end.

Steps

1. Create a vue project

Then install axios and protobufjs.

# vue create vue-protobuf
# npm install axios protobufjs --save-dev

2. Create a proto directory under src to hold the *.proto files, and copy the proto files written above into it.

At this point, the project directory and package.json look like this:

3. Generate src/proto/proto.js from the *.proto files (the key step)

protobufjs ships with a command-line tool, pbjs, which is a real lifesaver: depending on the arguments it can bundle the proto files into either an xx.json or an xx.js file. Say we want to produce a JSON file; run this in the project root:

npx pbjs -t json src/proto/*.proto > src/proto/proto.json

This generates a proto.json file in the src/proto directory. As mentioned earlier, practice shows it is better to bundle the protos as a JS module, so here is the final command:

npx pbjs -t json-module -w commonjs -o src/proto/proto.js  src/proto/*.proto

The -w argument specifies the module wrapper for the generated JS; commonjs is used here (see the pbjs documentation for the other options). After the command runs, proto.js appears in the src/proto directory. Printing it with console.log in Chrome shows the exported root object, with methods such as load and lookup on it.

Since the proto files will keep changing, it is convenient to add the command to the scripts of package.json:

  "scripts": {
    "serve": "vue-cli-service serve",
    "build": "vue-cli-service build",
    "lint": "vue-cli-service lint",
    "proto": "pbjs -t json-module -w commonjs -o src/proto/proto.js src/proto/*.proto"
  },

After updating the proto files, you only need to run npm run proto to regenerate the latest proto.js.
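Before wiring it into the request layer, it may help to see what the generated module gives you. Here is a minimal sketch (the field values are made up; PBUser comes from the demo User.proto above) that looks up a message type, verifies a plain object against it, and round-trips it through encode/decode:

import protoRoot from '@/proto/proto'

// Look up a message type from the generated root (field names are camelCased by default)
const PBUser = protoRoot.lookupType('framework.PBUser')

const payload = { userId: 1, name: 'Tom', mobile: '13800000000' }
// verify() returns null when the plain object matches the message definition
console.log(PBUser.verify(payload))

// Round-trip: encode into a Uint8Array, then decode back into a message object
const buffer = PBUser.encode(PBUser.create(payload)).finish()
console.log(PBUser.decode(buffer))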

4. Encapsulate request.js

With the proto.js generated above, you can start encapsulating the base module that talks to the back end. First of all, we use axios to send the HTTP requests.

The whole flow is: caller invokes an interface -> request.js encodes the data into binary -> the front end actually sends the request -> the back end returns binary data -> request.js decodes the binary data -> the caller gets a data object.

request.js is effectively an encode/decode relay station. Create a request.js file under src/lib and start writing:

Since the interface exchanges binary data, we need to configure the axios instance accordingly, using the arraybuffer response type, as follows:

import axios from 'axios'
const httpService = axios.create({
  timeout: 45000,
  method: 'post',
  headers: {
    'X-Requested-With': 'XMLHttpRequest',
    'Content-Type': 'application/octet-stream'
  },
  responseType: 'arraybuffer'
})

MessageType.proto defines the interface enumeration, the common request body and the common response body agreed with the back end. Before sending, every request has to be encoded into binary. Below is the main body of request.js:

import protoRoot from '@/proto/proto'
import protobuf from 'protobufjs'

// The common request body and response body defined in MessageType.proto
const PBMessageRequest = protoRoot.lookup('framework.PBMessageRequest')
const PBMessageResponse = protoRoot.lookup('framework.PBMessageResponse')

const apiVersion = '1.0.0'
const token = 'my_token'

// Look up the numeric value of an interface name in the PBMessageType enum
function getMessageTypeValue (msgType) {
  const PBMessageType = protoRoot.lookup('framework.PBMessageType')
  const ret = PBMessageType.values[msgType]
  return ret
}

/**
 * @param {*} msgType      interface name (a key of the PBMessageType enum)
 * @param {*} requestBody  encoded request parameters
 * @param {*} responseType full name of the response model
 */
function request (msgType, requestBody, responseType) {
  const _msgType = getMessageTypeValue(msgType)
  // Assemble the common request body
  const reqData = {
    timestamp: new Date().getTime(),
    type: _msgType,
    version: apiVersion,
    messageData: requestBody,
    token: token
  }
  const req = PBMessageRequest.create(reqData)

  // Call axios to send the request, using two axios options:
  // transformRequest: encodes the request data into binary before it is sent
  // transformResponse: decodes the returned binary into a real data object
  return httpService.post('/api', req, {
    transformRequest,
    transformResponse: transformResponseFactory(responseType)
  }).then(({ data, status }) => {
    // Handle the response
    if (status !== 200) {
      const err = new Error('Server exception')
      throw err
    }
    console.log(data)
    // Hand the decoded data to the caller
    return data
  }, (err) => {
    throw err
  })
}

// Encode the request data into binary
function transformRequest (data) {
  return PBMessageRequest.encode(data).finish()
}

function isArrayBuffer (obj) {
  return Object.prototype.toString.call(obj) === '[object ArrayBuffer]'
}

function transformResponseFactory (responseType) {
  return function transformResponse (rawResponse) {
    // Make sure the response really is an ArrayBuffer
    if (rawResponse == null || !isArrayBuffer(rawResponse)) {
      return rawResponse
    }
    try {
      const buf = protobuf.util.newBuffer(rawResponse)
      // Decode the common response body
      const decodedResponse = PBMessageResponse.decode(buf)
      if (decodedResponse.messageData && responseType) {
        // Decode the business data with the specified response model
        const model = protoRoot.lookup(responseType)
        decodedResponse.messageData = model.decode(decodedResponse.messageData)
      }
      return decodedResponse
    } catch (err) {
      return err
    }
  }
}

// Attach a helper to request for encoding request parameters
request.create = function (protoName, obj) {
  const pbConstruct = protoRoot.lookup(protoName)
  return pbConstruct.encode(obj).finish()
}

// Expose the module
export default request

The complete code is in request.js. A few methods from proto.js are used here: lookup(), encode(), finish(), decode(), and so on.

5. Call request.js

We generally don't call request.js directly from .vue files; instead, all the interfaces are wrapped in one more layer, because calling request.js directly means specifying the request model, response model and other fixed values every time, which leads to redundant code.

By convention, all back-end interfaces live in the src/api directory; for example, the student-related interfaces go into src/api/student.js, which keeps them easy to manage. Write the getStudentList interface in src/api/student.js:

import request from '@/lib/request'

// school.PBStudentListReq is the defined request body model
// school.PBStudentListRsp is the defined response body model
// getStudentList is the interface name
export function getStudentList (params) {
  const req = request.create('school.PBStudentListReq', params)
  return request('getStudentList', req, 'school.PBStudentListRsp')
}

// Adding another interface works the same way
export function getStudentById (id) {
  // const req = ...
  // return request(...)
}

6. Use the interfaces in .vue files

Just import the interface you need; it returns a Promise object, which is very convenient.

<template>
  <div class="hello">
    <button @click="_getStudentList">getStudentList</button>
  </div>
</template>

<script>
import { getStudentList } from '@/api/student'
export default {
  name: 'HelloWorld',
  methods: {
    _getStudentList () {
      const req = {
        limit: 20,
        offset: 0
      }
      getStudentList(req).then((res) => {
        console.log(res)
      }).catch((res) => {
        console.error(res)
      })
    }
  },
  created () {
  }
}
</script>

<style lang="scss">

</style>

Conclusion

The whole demo code: demo.

The full workflow on the front end:

  • 1. Copy all the proto files provided by the back end into the src/proto folder
  • 2. Run npm run proto to generate proto.js
  • 3. Write the interfaces in src/api according to the interface enumeration
  • 4. Use the interfaces in .vue files

Steps 1 and 2 can be combined into an automated script that is run after each update, as sketched below.
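For reference, here is a minimal sketch of such a script (the scripts/update-proto.js file name and the ../backend-protos path are hypothetical; adjust them to wherever your back end publishes its proto files):

// scripts/update-proto.js -- hypothetical helper combining steps 1 and 2
const { execSync } = require('child_process')
const fs = require('fs')
const path = require('path')

// Assumed location of the back end's proto files; adjust to your setup
const src = path.resolve(__dirname, '../../backend-protos')
const dest = path.resolve(__dirname, '../src/proto')

// Step 1: copy every *.proto file into src/proto
fs.readdirSync(src)
  .filter((name) => name.endsWith('.proto'))
  .forEach((name) => fs.copyFileSync(path.join(src, name), path.join(dest, name)))

// Step 2: regenerate proto.js (the same command as "npm run proto")
execSync('npx pbjs -t json-module -w commonjs -o src/proto/proto.js src/proto/*.proto', {
  stdio: 'inherit',
  cwd: path.resolve(__dirname, '..')
})

Run it with node scripts/update-proto.js whenever the back end updates the protos.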

This post ran rather long and my writing isn't great; please bear with me.

In my opinion this workflow is a decent way to use proto on the front end, but it may not be the best one. If your company has a better practice, you're welcome to share it.

Follow-up

In a vue project it is better to bundle the protos as a JS module, because a production vue build contains only HTML, CSS, JS and similar assets. But in some scenarios, such as a Node environment, say an Express project, where .proto files can ship with the production code, you can use the other methods protobuf.js provides to parse the proto dynamically instead of running npm run proto.
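As a small taste of that approach, here is a minimal sketch (assuming the demo .proto files sit together in a proto/ directory next to the script, so the imports in Student.proto can be resolved) that uses protobuf.load to parse the proto at runtime:

// Dynamic parsing in a Node/Express environment where .proto files ship with the code
const protobuf = require('protobufjs')

protobuf.load('proto/Student.proto', function (err, root) {
  if (err) throw err

  const PBStudentListRsp = root.lookupType('school.PBStudentListRsp')

  // Encode a response on the server side...
  const buffer = PBStudentListRsp.encode(PBStudentListRsp.create({ list: [] })).finish()

  // ...and decode it again, exactly as the json-module version does
  console.log(PBStudentListRsp.decode(buffer))
})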

When I have time, I'll write another post about dynamically parsing proto on the Node side.