Preface
Two years ago, I was a complete beginner. To launch a website I had developed, I bought a student server and installed the BT (Pagoda) panel, the Linux server panel beloved by beginners; I believe for many people the first thing to do after buying a server is to install it. In the panel, I used the built-in script to quickly install the Nginx + MySQL + phpMyAdmin bundle. The installation was quick, but at that time I knew nothing about Nginx, and every small problem I hit while setting up the server turned into a blocker, so deploying the project took me almost a week.
As a front-end developer, I have built the server side of many small projects with Node.js. Node.js is very friendly to front-end development, since both ends can be written comfortably in JS, but once something goes wrong on the server it is easy to get stuck, and for anyone who doesn't know Linux commands that can be fatal.
Recently I tried using Apache APISIX instead of Nginx for a simple back-end deployment, configuring my APIs through a visual panel without touching the nginx.conf file, and I found it very friendly for people who don't know Nginx.
Apache APISIX introduction
Everyone is familiar with Node.js, but front-end developers presumably don't know much about Apache APISIX. Here is the official introduction:
Apache APISIX is a dynamic, real-time, high-performance API gateway that provides rich traffic management features such as load balancing, dynamic upstreams, canary releases, circuit breaking, authentication, and observability. As an API gateway, Apache APISIX not only offers many useful plugins, but also supports dynamic plugin changes, hot reloading, and the development of custom plugins.
To put it simply, Apache APISIX is an API gateway like Nginx, but it additionally integrates plugins and provides a visual console for configuring routes and plugins.
There are two main reasons to use Apache APISIX instead of Nginx:
- A visual control panel, so beginners never need to edit an Nginx configuration by hand.
- A large number of ready-made plugins; many features we used to write in back-end business code can be enabled simply by configuring a plugin.
The official documentation and repository are as follows:
🧾 Apache APISIX official documentation
📦 Apache APISIX GitHub repository
Development environment setup
The following steps are based on a CentOS 7 system.
Install Node.js
Our back-end service will run on Node.js, so first install Node.js on CentOS with the following command:
sudo yum -y install nodejs
Install Apache APISIX
We will install Apache APISIX with Docker Compose, so Docker and Docker Compose need to be installed first. You can refer to:
# CentOS Docker installation
# Docker Compose
To start Apache APISIX, you can refer to the apisix-docker GitHub repository; its example directory includes both the APISIX and APISIX Dashboard service containers:
git clone https://github.com/apache/apisix-docker
cd apisix-docker/example
docker-compose -p docker-apisix up -d
After startup, run docker ps and you can see several containers running, as shown below:
The APISIX service runs on port 9080 by default, while the APISIX Dashboard visual panel runs on port 9000. If you are on a cloud server, you can open the dashboard at the server's public IP on port 9000. If you develop on the server through VS Code Remote SSH, you can also reach it more conveniently through VS Code's port forwarding, as follows:
First, find the small radio-tower button in the VS Code status bar, as shown below:
Click it to bring up the Ports panel, then click Forward a Port.
Enter port 9000 and the mapping is set.
Once that's done, open your browser and go to localhost:9000 to reach the APISIX Dashboard page. The effect is shown below:
The default username and password are both admin; enter them to log in. The dashboard is built with the Ant Design component library, and the menu on the left lists concepts that are all explained in the official documentation.
Minimal service setup
Now that the basic environment is set up, let’s simulate our normal back-end development process and see how Node.js and Apache APISIX can be used together.
Set up a basic back-end service
On the back end I'll use the Koa.js framework to build the service. First, create the project directory:
mkdir server
cd server
touch app.js
Next, initialize package.json: run npm init and press Enter through the prompts. The directory structure is shown below:
Then run npm i koa to install Koa, and write the following code in app.js:
const Koa = require('koa');
const app = new Koa();
app.use(async ctx => {
ctx.body = 'Hello World';
});
app.listen(3100);
Start the service with node app.js, then run curl localhost:3100; if Hello World comes back, the service is up.
Create an APISIX route
After the service starts successfully, we of course should not expose port 3100 directly; instead we let APISIX proxy access to the service, just as Nginx would. There are many ways to add a route: we can use the command line or the visual web page, and the visual form is of course the most approachable and intuitive.
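For reference, the command-line way goes through the APISIX Admin API. Below is a minimal Node.js sketch (Node 18+ for built-in fetch) that creates the same route; the admin key shown is the default from the example config and the route ID is arbitrary, so treat both as assumptions to adapt:

// sketch: create the /koa route via the APISIX Admin API
const route = {
  name: "koa-server",
  uri: "/koa",
  upstream: {
    type: "roundrobin",
    nodes: { "172.17.0.1:3100": 1 },
  },
}

// PUT /apisix/admin/routes/{id} creates or replaces a route
fetch("http://127.0.0.1:9080/apisix/admin/routes/1", {
  method: "PUT",
  headers: {
    "X-API-KEY": "edd1c9f034335f136f87ad84b625c8f1", // default admin key, change it in production
    "Content-Type": "application/json",
  },
  body: JSON.stringify(route),
})
  .then((res) => res.json())
  .then(console.log)

Below, though, we'll stick with the dashboard.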
So next, open the newly launched APISIX Dashboard in the browser, click the Route menu, then click the button in the upper right corner to add a new route.
There are four steps to add a route, as shown in the following figure:
1. First, set the route information: fill in the route name as koa-server and the path as /koa, and leave everything except the mandatory fields empty.
2. Next, set the upstream service information. The upstream is just the Koa service we started on port 3100, so only fill in the host name of the target node and skip the other optional fields.
📝 Note: the host name used here is 172.17.0.1 rather than localhost, because a container in Docker cannot reach the host's ports directly and must go through the docker0 bridge. On Linux the docker0 IP is generally 172.17.0.1; on Windows use host.docker.internal, and on macOS use docker.for.mac.host.internal.
If none of these addresses can reach the upstream service, run ip addr show docker0 to look up the bridge IP.
3. In the next step we can see that there are a lot of plugins; on the left we can freely arrange and configure them. These plugins can replace many features we used to have to implement in the back-end service.
4. The last step is to preview all the configuration information. Click Submit in the corner and the route is created; back in the route list you can see the route we just made.
Now test it: run curl -i localhost:9080/koa. The response body comes back normally, indicating the route configuration is complete.
Feature development
Below are some common back-end and API gateway features that are useful even for individual developers.
Solving cross-origin requests with a reverse proxy
The first problem a deployed front-end project runs into is cross-origin requests. Most projects separate the front end and back end, and the browser's same-origin policy stops the front-end site from calling the back end directly by its domain name or IP. There are many ways around this, and the reverse proxy is probably the most widely used, so we can configure one through the gateway.
The principle of a reverse proxy is this: since the front end cannot make cross-origin calls to other domains, it instead requests a special path under its own domain; when the gateway matches that special path, it forwards the request to the back-end URL and returns the back end's response to the front end.
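From the front end's point of view, the request simply targets its own origin; a minimal sketch (the /api prefix and the endpoint path are illustrative, matching the Nginx example below):

// front-end sketch: call our own origin's /api path and let the gateway
// rewrite and forward the request to the real back-end service
fetch("/api/user/list")
  .then((res) => res.json())
  .then((data) => console.log(data))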
Nginx does this by modifying the nginx.conf file:
location /api {
    rewrite ^.+api/?(.*)$ /$1 break;
    include uwsgi_params;
    proxy_pass http://localhost:3000/;
}
By matching the special path /api in the server block and setting the forwarding target with proxy_pass, we forward our requests to that URL.
In Apache APISIX we have already configured exactly such a reverse proxy, from ip:9080/koa to ip:3100. If we want to proxy to another server, we just change the upstream configuration in the route.
If you don't want APISIX on port 9080 and would rather use port 80 for easy browser access: if you started the services with docker-compose, modify the port mapping in example/docker-compose.yml; if you started APISIX from source, modify node_listen in conf/config-default.yaml to change the APISIX service port.
User authentication with jwt-auth
When the front end sends requests to the back end, the back end needs to verify that each request is legitimate. So after the user logs in, the back end returns a token, and the front end puts that token in the request header of subsequent requests. The token is generally composed of user information, a timestamp, and a signature produced by a hash algorithm. If the back end finds the token incorrect or expired, it requires the user to log in again to get a new one.
To do this in Node.js we can use the jsonwebtoken library. Below is an implementation of token generation and verification based on it.
const jwt = require("jsonwebtoken")
const config = require("./config") // assumed: a config module providing tokenSecret, the signing secret

const auth = {
  createToken(userId) {
    // generate a token based on userId
    const payload = {
      userId,
      time: new Date().getTime(),
      timeout: 1000 * 60 * 60 * 48, // valid for 48 hours
    }
    const token = jwt.sign(payload, config.tokenSecret)
    return token
  },
  verifyToken(allowUrl) {
    return async (ctx, next) => {
      // verify the token unless the URL is whitelisted
      if (allowUrl.indexOf(ctx.request.url) === -1) {
        if (!ctx.request.header.token) {
          ctx.body = { code: 110, msg: "The token is invalid" }
          return
        }
        try {
          const token = ctx.request.header.token
          const payload = jwt.verify(token, config.tokenSecret)
          if (payload.time + payload.timeout < new Date().getTime()) {
            ctx.body = { code: 111, msg: "The token expired" }
            return
          }
          ctx.request.header.userId = payload.userId
          await next()
        } catch (err) {
          ctx.body = {
            code: 110,
            msg: "The token is invalid",
            err: err.toString(),
          }
          return
        }
      } else {
        await next()
      }
    }
  },
}

module.exports = auth
With the module above, createToken generates a token from a userId and verifyToken checks it. allowUrl takes an array acting as a whitelist, so requests such as the login endpoint can skip verification.
The whole thing is just a Koa middleware, so it is very simple to use: register it with the use method in the project entry file.
// token authentication
const { verifyToken } = require("./server/middleware/auth")
app.use(verifyToken(["/admin/login", "/user/login"]))
Of course, this is only a simple implementation; a real project has many special cases, such as deciding how to treat static resource requests, which you should handle according to your actual needs.
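For instance, a rough way to let static assets bypass verification, inserted inside verifyToken's returned middleware before the whitelist check (the /public prefix is an assumption; adjust it to your project):

// let static resources through without a token
if (ctx.request.url.startsWith("/public/")) {
  return next()
}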
Here's how to do the same thing in Apache APISIX. APISIX has a plugin called jwt-auth that makes this very easy.
1. First, add a Consumer.
2. Then enable the jwt-auth plugin for that consumer and fill in its key value.
3. Finally, enable the jwt-auth plugin on the /koa route we created earlier.
Once enabled, the jwt-auth plugin exposes an API for creating tokens at /apisix/plugin/jwt/sign. We need to pass the consumer's key to this API, and we can also attach extra data to the payload, such as a user ID.
curl -G --data-urlencode 'payload={"uid":10000,"uname":"test"}' 'http://127.0.0.1:9080/apisix/plugin/jwt/sign?key=oiloil' -i
The string in the response body is the token value:
HTTP/1.1 200 OK
Date: Thu, 10 Mar 2022 10:07:43 GMT
Content-Type: text/plain; charset=utf-8
Transfer-Encoding: chunked
Connection: keep-alive
Server: APISIX/2.12.1

eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1bmFtZSI6InRlc3QiLCJ1aWQiOjEwMDAwLCJleHAiOjE2NDY5OTMyNjMsImtleSI6Im9pbG9pbCJ9.VIRZ7Sxkm3gfUsvripE3FaqdweilrdljE-GuJjvsBoA
After creating the token, let's try the authentication. First send a request without the token:
curl http://127.0.0.1:9080/koa -i
HTTP/1.1 401 Unauthorized
...
{"message":"Missing JWT token in request"}
Then send the request again with the token in the Authorization request header:
curl localhost:9080/koa -i -H 'Authorization: eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1bmFtZSI6InRlc3QiLCJ1aWQiOjEwMDAwLCJleHAiOjE2NDY5OTMyNjMsImtleSI6Im9pbG9pbCJ9.VIRZ7Sxkm3gfUsvripE3FaqdweilrdljE-GuJjvsBoA'
HTTP/1.1 200 OK
Content-Type: text/plain; charset=utf-8
Content-Length: 11
Connection: keep-alive
Date: Thu, 10 Mar 2022 10:18:49 GMT
Server: APISIX/2.12.1

Hello World
You can see the correct data returned. The jwt-auth plugin offers far more functionality than this, and its security is much more reliable than the authentication we wrote ourselves, with no need to implement anything.
Below are the parameters it accepts, taken from the official documentation; they give a good idea of what the plugin can do.
| Name | Type | Required | Default | Valid values | Description |
|---|---|---|---|---|---|
| key | string | required | | | Different consumer objects should have different, unique values; if different consumers use the same key, a request-matching exception will occur. |
| secret | string | optional | | | Secret key for encryption. If not specified, one is generated automatically in the background. |
| public_key | string | optional | | | RSA public key; required when algorithm is RS256. |
| private_key | string | optional | | | RSA private key; required when algorithm is RS256. |
| algorithm | string | optional | "HS256" | ["HS256", "HS512", "RS256"] | The encryption algorithm. |
| exp | integer | optional | 86400 | [1,...] | Token expiration time. |
| base64_secret | boolean | optional | false | | Whether the secret is base64 encoded. |
More Apache APISIX features
Apache APISIX has many more widely applicable plugins, such as rate limiting, RBAC authentication, and logging, all of which can be configured visually. It gives people who don't know Nginx a more intuitive way to manage their APIs.
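As one example, here is a minimal Node.js sketch of enabling the limit-count plugin on our route through the Admin API, limiting each client IP to 10 requests per minute. The parameter names follow my reading of the docs and the admin key is the example default, so verify both against your APISIX version:

const patch = {
  plugins: {
    "limit-count": {
      count: 10,          // allow at most 10 requests...
      time_window: 60,    // ...per 60-second window
      key: "remote_addr", // counted per client IP
      rejected_code: 503, // status returned once the limit is hit
    },
  },
}

// PATCH merges the plugin config into route 1 created earlier
fetch("http://127.0.0.1:9080/apisix/admin/routes/1", {
  method: "PATCH",
  headers: { "X-API-KEY": "edd1c9f034335f136f87ad84b625c8f1" },
  body: JSON.stringify(patch),
})
  .then((res) => res.json())
  .then(console.log)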
The scenarios above already show the advantages of Apache APISIX for simple single-service development; in a microservice architecture or under high load, those advantages become even more obvious.
Koa usage tips
Besides introducing Apache APISIX, I'll also share some tips and useful libraries from my experience building server sides with the Koa.js framework.
The Sequelize ORM framework
Sequelize is a Node.js ORM framework. An ORM maps relational database data onto program objects; simply put, it turns each table in the database into a JS object that provides create, read, update, and delete APIs. Another advantage is that table structures can be maintained directly in code, with no need for SQL statements or tools like Navicat.
Note: MySQL 5.7 or later is required.
Installation
First install Sequelize; since we'll connect to MySQL, the mysql2 driver is needed as well:
npm i sequelize mysql2
Initialization
Then create a new database.js file and write the following code:
const { Sequelize } = require("sequelize")
const config = require("./config") // assumed: a config module holding the connection info

const db = new Sequelize(
  config.db.database, // database name
  config.db.username, // database user name
  config.db.password, // database password
  {
    host: config.db.host, // database host
    dialect: "mysql",
    dialectOptions: {
      charset: "utf8mb4",
      dateStrings: true,
      typeCast: (field, next) => { // format DATETIME fields when reading from the database
        if (field.type === "DATETIME") {
          return field.string()
        }
        return next()
      },
    },
    timezone: "+08:00", // China time zone
    define: {
      paranoid: true, // soft delete: deleted rows get a deletedAt field and are excluded from queries
      freezeTableName: true, // use the model name as the table name without pluralization
    },
    logging: false, // don't print SQL execution logs
  }
)

module.exports = db
This initializes Sequelize and exports a db object carrying some global definitions. Next, we define the structure of a table.
// User.js
const { DataTypes } = require("sequelize")
const db = require("../database")

const User = db.define("user", {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    allowNull: false, // allowNull: false is the column option Sequelize expects (notNull is only valid inside validate)
    autoIncrement: true,
  },
  name: {
    type: DataTypes.STRING,
    allowNull: false,
  },
  tel: {
    type: DataTypes.STRING,
    allowNull: false,
  },
  intro: {
    type: DataTypes.STRING,
    allowNull: false,
  },
})

module.exports = User
The code above is still very intuitive: we import the db object initialized earlier, define the user table structure with db.define, and export the User object. We can then operate on the user table through that object. Here is an example:
const userModel = require('../../model/User')

router.post("/login", async (ctx) => {
  const params = ctx.request.body // assumes a body parser such as koa-bodyparser is registered
  const [err, user] = await to(userModel.findOne({
    where: { tel: params.tel }
  }))
  if (err) {
    ctx.err('Login failed, please try again')
    return
  }
  if (!user) {
    ctx.suc('First Login')
    return
  }
  const { createToken } = require("../../middleware/auth")
  const token = createToken(user.id)
  ctx.suc('Login successful', { user, token })
})
This is a simple login endpoint: we treat the previously defined User object as userModel, query by the tel parameter with its findOne method to see whether the user exists, and return the result. Other tables work much the same way; each model object's API lets you modify or query its table directly.
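To give a fuller picture, here is a quick sketch of the other common model operations; the field values are made up, but the calls are standard Sequelize APIs:

const userModel = require('../../model/User')

async function crudExamples() {
  // create a row
  const created = await userModel.create({ name: 'Alice', tel: '13800000000', intro: 'hi' })
  // update rows matching a condition
  await userModel.update({ intro: 'hello' }, { where: { id: created.id } })
  // soft delete: with paranoid: true the row just gets a deletedAt timestamp
  await userModel.destroy({ where: { id: created.id } })
  // query many rows
  const users = await userModel.findAll({ where: { name: 'Alice' } })
  return users
}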
Join table operations
Relational databases are full of relationships between tables, such as one-to-many and one-to-one. In Sequelize, before you can run a join query you must declare the relationships between the tables, so it is worth creating a dedicated file to maintain them.
// association.js
const User = require("./model/User")
const Experiment = require("./model/Experiment")
const db = require('./database')

const association = async () => {
  try {
    User.hasMany(Experiment)
    // ... more associations
  } catch (error) {
    console.log(error)
  }
  await db.sync({ alter: true }) // synchronize models and relationships to the database
}

module.exports = association
In the code above we set up a one-to-many relationship between User and Experiment with the hasMany API: one user has many experiments. We can import all the model objects into this file, declare the relationships in one place, and then synchronize the models and relationships to the database. See the documentation for more association APIs; a few of the other types look like this:
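A sketch of the other association types; Profile and Course here are hypothetical models used only to illustrate the calls:

// belongsTo is the inverse side of hasMany: the foreign key lives on
// Experiment, enabling includes from either direction
Experiment.belongsTo(User)
// one-to-one: puts the foreign key on Profile
User.hasOne(Profile)
// many-to-many through a join table named 'UserCourse'
User.belongsToMany(Course, { through: 'UserCourse' })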
Finally, call association() in the entry file.
// app.js
const Koa = require("koa")
const app = new Koa()

// set up table associations
const association = require("./server/association")
association()

app.listen(config.port)
When a query needs data from an associated table, we can use the include option, as in the following example:
router.get("/getExpDetail".async (ctx) => {
if (ctx.empty(['expId']) {return
}
const params = ctx.query
const [err, exp] = await to(expModel.findByPk(params.expId, {
include: [{model: require('.. /.. /model/User'), attributes: ['name']}]}))if (err) {
ctx.err('Obtaining failed, please try again', err)
return
}
ctx.suc('Obtain success', exp)
})
Sequelize has many more APIs, and every project uses it a little differently, so consult the documentation according to your business needs.
Asynchronous programming
Koa is a framework that advocates async/await syntax for asynchronous programming. In the code examples above I often use the to method, as in:
const [err, user] = await to(userModel.findOne({
  where: { tel: params.tel }
}))
The to method comes from the await-to-js repository (GitHub link) and is a small wrapper for handling errors in asynchronous requests.
Its code is also very simple: it uses the promise's then/catch to pack the error and the result together into an array:
function to(promise, errorExt) {
return promise
.then(function (data) { return [null, data]; })
.catch(function (err) {
if (errorExt) {
Object.assign(err, errorExt);
}
return [err, undefined];
});
}
This way, when an asynchronous call fails, we can easily bail out of the current method and run other logic instead.
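The second parameter errorExt, visible in the code above, merges extra fields into the error, which is handy for tagging where a failure came from; a small usage sketch:

// label the error so a shared handler can tell which call failed
const [err, rows] = await to(userModel.findAll(), { scope: 'user-list' })
if (err) {
  console.error(err.scope, err.message)
}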
Global methods
Methods like to are used in almost every file, and importing them by hand everywhere is tedious. We can mount such methods on the Node.js global object so they can be used anywhere in the project without importing.
// app.js
const { to } = require('await-to-js') // the library exposes to as both a named and the default export
global.to = to
If you have many methods to mount, you can maintain a separate directory, as shown below:
index.js is the entry file; it reads all the methods in the same directory and mounts them on global:
// utils/index.js
const fs = require('fs')

// read every file in this directory except index.js itself
const utils = fs.readdirSync(__dirname)
utils.splice(utils.indexOf('index.js'), 1)

const importGlobal = () => {
  for (let item of utils) {
    let libName = item.replace('.js', '')
    global[libName] = require('./' + item)
  }
}

module.exports = importGlobal
Then import and execute in app.js:
// Import global configuration
const importGlobal = require("./utils")
importGlobal()
This is convenient, but it is not a good choice in a large team, because it is hard to tell where a global variable or method comes from. So only mount a method globally if it really is used in many places, and add comments to keep the project readable.
Middleware
Because Koa adopts the onion model, the context is processed layer by layer, so fixed handling of the context can be implemented as middleware. For example, to check required parameters on every route and standardize the status codes and formats of success and failure responses, we can implement a middleware like the following:
// response middleware
const response = () => {
  return async (ctx, next) => {
    // ctx.empty checks that every parameter in arr is present;
    // if any are missing it writes the error response and returns true
    ctx.empty = (arr) => {
      var isnull = []
      const req =
        ctx.request.method == "POST"
          ? ctx.request.body
          : ctx.query
      for (let item of arr) {
        if (!req[item]) {
          isnull.push(item)
        }
      }
      if (isnull.length) {
        ctx.body = {
          code: -1,
          msg: "Missing parameters: " + isnull.join("、"),
        }
        return true
      }
      return false
    }
    ctx.suc = (msg, data) => {
      ctx.body = { code: 1, msg, data }
    }
    ctx.err = (msg, err) => {
      ctx.body = {
        code: -1,
        msg,
        err: err ? err.toString() : "",
      }
    }
    await next()
  }
}

module.exports = response
We mounted the empty, suc, and err methods on ctx; now import the middleware in the entry file and use it:
const Koa = require("koa")
const app = new Koa()

// register the response helpers
const response = require("./server/middleware/response")
app.use(response())
Now we can use these three methods on ctx directly when handling requests, as in the following example:
router.get("/getChapterDetail".async (ctx) => {
if (ctx.empty(['chapterId']) {return
}
const params = ctx.query
var [err, chapter] = await to(chapterModel.findByPk(params.chapterId))
if (err) {
ctx.err('Obtaining failed, please try again', err)
return
}
ctx.suc('Obtain success', { chapter })
})
More
There is actually a lot more practical development experience I'd like to share, such as overall project structure, choosing a logging library, route configuration, and project deployment, but space is limited. I'll cover Node.js development experience more comprehensively in the next article; likes are welcome!
Conclusion
As front-end technology keeps developing, Node.js has carried JS onto a much broader stage: a front-end developer can independently build a complete full-stack web project in JS, and with a few good libraries the project's overall stability and performance hold up well. Node.js has become a must-have skill for front-end developers.
Even though the back end and front end can now share one language, the mindsets still differ greatly; still, it is exciting that language differences can be erased while exploring a new field. Future technologies will surely lower the barriers further and attract more people into the fun of programming!