Vercel is a leading platform for developing and hosting Jamstack applications. Unlike traditional web applications, which generate the UI dynamically on the server at runtime, a Jamstack application consists of a static UI (HTML and JavaScript) and a set of serverless functions that support dynamic UI elements via JavaScript.
The Jamstack approach has many benefits, and one of the most important is performance. Since the UI is no longer generated at runtime by a central server, there is much less load on the server, and the UI can be deployed over an edge network such as a CDN.
However, an edge CDN only solves the problem of distributing static UI files. The back-end serverless functions may still be slow. In fact, today's popular serverless platforms have well-known performance problems, such as slow cold starts, especially for interactive applications. This is where WebAssembly has a lot to offer.
Using WasmEdge, a cloud-native WebAssembly runtime hosted by the CNCF, developers can write high-performance serverless functions and deploy them on public clouds or edge computing nodes. In this article, we'll explore how to use WasmEdge functions written in Rust to power the back end of a Vercel application.
Why use WebAssembly in Vercel Serverless?
The Vercel platform already provides an easy-to-use serverless framework for deploying functions hosted on Vercel. As discussed above, WebAssembly and WasmEdge are a way to further improve performance. High-performance functions written in C/C++, Rust, and Swift can be easily compiled into WebAssembly, and these WebAssembly functions are much faster than the JavaScript or Python typically used for serverless functions.
So the question is: if raw performance is the only goal, why not just compile these functions into machine-native executables? Because the WebAssembly "container" still provides many valuable services.
First, WebAssembly isolates functions at the runtime level. Errors or memory-safety problems in the code do not propagate outside the WebAssembly runtime. As software supply chains become increasingly complex, it is important to run code in containers to prevent unauthorized access to your data through dependency libraries.
Second, WebAssembly bytecode is portable. Developers only need to build once and don't have to worry about future changes or updates to the underlying Vercel serverless container (operating system and hardware). It also lets developers reuse the same WebAssembly functions in other hosting environments, such as Tencent Serverless Cloud Functions on the public cloud, or in data-streaming frameworks like YoMo.
Finally, the WasmEdge TensorFlow API provides the most idiomatic way to run TensorFlow models from Rust. WasmEdge installs the correct combination of TensorFlow dependency libraries and provides a unified API for developers.
Enough concepts and explanations. Let's strike while the iron is hot and take a look at the sample applications!
Prerequisites
Since our demo WebAssembly functions are written in Rust, you need the Rust compiler installed. Be sure to add the wasm32-wasi compiler target, as shown below, so you can generate WebAssembly bytecode.
$ rustup target add wasm32-wasi
The front end of the demo application is written in Next.js and deployed on Vercel. We assume you already have a basic knowledge of Vercel.
Example 1: Image processing
Our first demo application asks the user to upload an image, then calls a serverless function to turn it into a grayscale image. You can try out this demo deployed on Vercel before you get started.
First, fork the demo application's GitHub repo. To deploy the application on Vercel, just import the GitHub repo from the Vercel for GitHub page.
This GitHub repo is a standard Next.js application for the Vercel platform. The back-end serverless function is in the api/functions/image_grayscale folder. The src/main.rs file contains the Rust program's source code. The Rust program reads image data from STDIN and writes the grayscale image to STDOUT.
use std::io::{self, Read, Write};
use image::{ImageFormat, ImageOutputFormat};

fn main() {
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    let image_format_detected: ImageFormat = image::guess_format(&buf).unwrap();
    let img = image::load_from_memory(&buf).unwrap();
    let filtered = img.grayscale();
    let mut buf = vec![];
    match image_format_detected {
        ImageFormat::Gif => {
            filtered.write_to(&mut buf, ImageOutputFormat::Gif).unwrap();
        },
        _ => {
            filtered.write_to(&mut buf, ImageOutputFormat::Png).unwrap();
        },
    };
    io::stdout().write_all(&buf).unwrap();
    io::stdout().flush().unwrap();
}
Use Rust's cargo tool to build the Rust program into WebAssembly bytecode or native code.
$ cd api/functions/image-grayscale/
$ cargo build --release --target wasm32-wasi
Copy the build artifact to the api folder.
$ cp target/wasm32-wasi/release/grayscale.wasm ../../
Vercel runs api/pre.sh when setting up the serverless environment. This script installs the WasmEdge runtime and compiles the WebAssembly bytecode program into a native .so library for faster execution.
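The repo contains the actual api/pre.sh; as a rough illustration of what such a setup script does, here is a minimal sketch. The WasmEdge version, download URL, and extracted directory name are assumptions based on the 0.8.1 release referenced later in this article; check the repo and the WasmEdge releases page for the real script.

```shell
#!/bin/bash
# Hypothetical sketch of api/pre.sh, not the repo's actual script.
# 1) Download and unpack the WasmEdge runtime into the api folder.
#    (URL and archive name are assumptions for WasmEdge 0.8.1.)
curl -sL https://github.com/WasmEdge/WasmEdge/releases/download/0.8.1/WasmEdge-0.8.1-manylinux2014_x86_64.tar.gz \
  | tar -xzf - -C api

# 2) AOT-compile the portable bytecode into a native shared library
#    so the function starts and runs faster.
api/WasmEdge-0.8.1-Linux/bin/wasmedgec api/grayscale.wasm api/grayscale.so
```

The key idea is the second step: wasmedgec ahead-of-time compiles the .wasm file once at deployment time, so each request runs native code instead of interpreting bytecode.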
The api/hello.js file conforms to Vercel's serverless function specification. It loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Note that api/hello.js runs the compiled grayscale.so file generated by api/pre.sh for better performance.
const fs = require('fs');
const { spawn } = require('child_process');
const path = require('path');
module.exports = (req, res) => {
  const wasmedge = spawn(
    path.join(__dirname, 'WasmEdge-0.8.1-Linux/bin/wasmedge'),
    [path.join(__dirname, 'grayscale.so')]);

  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  wasmedge.on('close', (code) => {
    let buf = Buffer.concat(d);
    res.setHeader('Content-Type', req.headers['image-type']);
    res.send(buf);
  });

  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}
And we're done. Deploy the repo to Vercel and you have a Jamstack application with a high-performance serverless back end written in Rust and WebAssembly.
Example 2: AI inference
The second demo application lets the user upload an image and then calls the serverless function to identify the main objects in the image.
It is in the same GitHub repo as the previous example, but on the tensorflow branch. Note: when you import this GitHub repo on the Vercel website, Vercel creates a preview URL for each branch, so the tensorflow branch will have its own deployment URL.
The back-end serverless function for image classification is in the api/functions/image-classification folder on the tensorflow branch. The src/main.rs file contains the Rust program's source code. The Rust program reads image data from STDIN and prints its text output to STDOUT. It uses the WasmEdge TensorFlow API to run AI inference.
pub fn main() {
    // Step 1: Load the TFLite model
    let model_data: &[u8] = include_bytes!("models/mobilenet_v1_1.0_224/mobilenet_v1_1.0_224_quant.tflite");
    let labels = include_str!("models/mobilenet_v1_1.0_224/labels_mobilenet_quant_v1_224.txt");

    // Step 2: Read image from STDIN
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Step 3: Resize the input image for the tensorflow model
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(&buf, 224, 224);

    // Step 4: AI inference
    let mut session = wasmedge_tensorflow_interface::Session::new(&model_data, wasmedge_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &flat_img, &[1, 224, 224, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Reshape_1");

    // Step 5: Find the food label that corresponds to the highest probability in res_vec
    // ...
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
        label_lines.next();
    }

    // Step 6: Generate the output text
    let class_name = label_lines.next().unwrap().to_string();
    if max_value > 50 {
        println!("It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture", confidence.to_string(), class_name, class_name);
    } else {
        println!("It does not appear to be any food item in the picture.");
    }
}
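The elided Step 5 above scans res_vec for the entry with the highest score, producing the max_index and max_value used in Steps 5 and 6. A minimal sketch of such a helper follows; the name arg_max is our own, not from the demo's source, and it assumes the quantized u8 scores that res_vec holds.

```rust
// Hypothetical helper for Step 5: return the (index, value) of the
// largest score in a slice of quantized u8 probabilities.
fn arg_max(scores: &[u8]) -> (usize, u8) {
    let mut max_index = 0;
    let mut max_value = 0u8;
    for (i, &v) in scores.iter().enumerate() {
        if v > max_value {
            max_value = v;
            max_index = i;
        }
    }
    (max_index, max_value)
}
```

With it, Step 5 reduces to something like `let (max_index, max_value) = arg_max(&res_vec);` before skipping ahead max_index lines in the labels file.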
Use the cargo tool to build the Rust program into WebAssembly bytecode or native code.
$ cd api/functions/image-classification/
$ cargo build --release --target wasm32-wasi
Copy the build artifact to the api folder.
$ cp target/wasm32-wasi/release/classify.wasm ../../
Again, the api/pre.sh script installs the WasmEdge runtime and its TensorFlow dependencies in this application. It also compiles the classify.wasm bytecode program into the classify.so native shared library at deployment time.
The api/hello.js file conforms to Vercel's serverless function specification. It loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Note that api/hello.js runs the compiled classify.so file generated by api/pre.sh for better performance.
const fs = require('fs');
const { spawn } = require('child_process');
const path = require('path');
module.exports = (req, res) => {
  const wasmedge = spawn(
    path.join(__dirname, 'wasmedge-tensorflow-lite'),
    [path.join(__dirname, 'classify.so')],
    {env: {'LD_LIBRARY_PATH': __dirname}}
  );

  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  wasmedge.on('close', (code) => {
    res.setHeader('Content-Type', `text/plain`);
    res.send(d.join(''));
  });

  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}
Now you can deploy your forked repo to Vercel, and you get a Jamstack application that recognizes objects in images.
Just change the Rust function in the template, and you can deploy your own high-performance Jamstack application!
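Any Rust program that reads STDIN and writes STDOUT fits the template's pattern. As an illustration, here is a trivial replacement for src/main.rs that reverses the uploaded text; the reversal logic is our own example, not part of the demo.

```rust
use std::io::{self, Read, Write};

// The transformation itself: reverse the input text.
// Swap this function for your own logic to build a new serverless back end.
fn transform(input: &str) -> String {
    input.chars().rev().collect()
}

fn main() {
    // Read the whole request body from STDIN, as the demo functions do.
    let mut buf = String::new();
    io::stdin().read_to_string(&mut buf).unwrap();
    // Write the result to STDOUT for api/hello.js to pick up.
    io::stdout().write_all(transform(&buf).as_bytes()).unwrap();
}
```

Build it with `cargo build --release --target wasm32-wasi` exactly as in the examples above, copy the .wasm file into the api folder, and redeploy.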
Looking ahead
Running WasmEdge from Vercel's current serverless container is an easy way to add high-performance functions to a Vercel application.
If you have developed an interesting Vercel function or application with WasmEdge, add WeChat H0923XW to receive a small gift.
Looking ahead, it would be better to use WasmEdge itself as the container, rather than starting it from Docker and Node.js as we do today. That way, we could run serverless functions much more efficiently. WasmEdge is already compatible with Docker tools. If you are interested in joining WasmEdge and the CNCF in this exciting work, feel free to join our channel.