Netlify is a platform for developing and hosting Jamstack applications. Jamstack is a term coined by Netlify founder Mathias Biilmann in 2015. Netlify is also the main organizer of JamstackConf.

A Jamstack application consists of a static UI (HTML and JavaScript) and a set of serverless functions. Dynamic UI elements are rendered by JavaScript that retrieves data from the serverless functions. Jamstack has many benefits, but one of the most important is excellent performance. Since the UI is no longer generated at runtime by a central server, there is much less load on the server, and the UI can be deployed over an edge network (such as a CDN).

But an edge CDN only solves the problem of distributing static UI files. The back-end serverless functions may still be slow. In fact, popular serverless platforms have well-known performance problems, such as slow cold starts, especially for interactive applications. This is where WebAssembly has a lot to offer.

Using WasmEdge, a cloud-native WebAssembly runtime hosted by the CNCF, developers can write high-performance serverless functions and deploy them on public clouds or edge computing nodes. In this article, we explore how to use WasmEdge functions written in Rust to power the back end of a Netlify application.

Why use WebAssembly to implement Netlify functions

The Netlify platform already has a very easy-to-use serverless framework for deploying functions. As discussed above, WebAssembly and WasmEdge are used to further improve performance. High-performance functions written in C/C++, Rust, and Swift can easily be compiled to WebAssembly, and these WebAssembly functions are much faster than the JavaScript or Python commonly used in serverless functions.

However, if raw performance were the only goal, why not compile these functions directly to machine-native executables (native client, or NaCl)? The reason is that Netlify already runs these functions securely in AWS Lambda using Firecracker microVMs.

Our vision for the future is to run WebAssembly as a lightweight runtime alongside Docker and microVMs in cloud-native infrastructure. WebAssembly provides higher performance and consumes fewer resources than containers like Docker or microVMs. But for now, Netlify only supports running WebAssembly inside a microVM.

Still, running WebAssembly functions in a microVM has many advantages over running containerized NaCl programs.

First, WebAssembly provides fine-grained runtime isolation for individual functions. A microservice can have multiple functions and supporting services running inside a single microVM. WebAssembly can make the microservice more secure and stable.

Second, WebAssembly bytecode is portable. Developers only need to build once, without worrying about future changes or updates to the underlying Netlify serverless runtime. It also allows developers to reuse the same WebAssembly functions in other cloud environments.

Third, WebAssembly applications are easy to deploy and manage. They have less platform dependence and complexity than NaCl dynamic libraries and executables.

Finally, the WasmEdge TensorFlow API provides the most ergonomic way to run TensorFlow models from Rust. WasmEdge installs the right combination of TensorFlow dependency libraries and provides a unified API for developers.

That is a lot of concepts and explanations. Let's strike while the iron is hot and take a look at the sample applications!

Prerequisites

Because our demo WebAssembly functions are written in Rust, you need the Rust compiler. Be sure to install the wasm32-wasi compiler target, as shown below, to generate WebAssembly bytecode.

$ rustup target add wasm32-wasi

The demo application's front end is written in Next.js and deployed on Netlify. We assume that you already have basic knowledge of working with Next.js and Netlify.

Example 1: Image manipulation

Our first demo application lets the user upload an image and then calls the serverless function to turn it into a black-and-white image. Before you start, you can try out this demo deployed on Netlify.

First, fork the GitHub repo of the demo application. To deploy the application to Netlify, simply add your GitHub repo to Netlify.

This repo is a standard Next.js application for the Netlify platform. The back-end serverless function is in the api/functions/image_grayscale folder. The src/main.rs file contains the Rust program's source code. The Rust program reads image data from STDIN and writes the black-and-white image to STDOUT.

use std::io::{self, Read, Write};
use image::{ImageFormat, ImageOutputFormat};

fn main() {
    // Read the raw image bytes from STDIN.
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Detect the input format and decode the image.
    let image_format_detected: ImageFormat = image::guess_format(&buf).unwrap();
    let img = image::load_from_memory(&buf).unwrap();

    // Convert to grayscale.
    let filtered = img.grayscale();

    // Re-encode: keep GIFs as GIF, everything else as PNG.
    let mut buf = vec![];
    match image_format_detected {
        ImageFormat::Gif => {
            filtered.write_to(&mut buf, ImageOutputFormat::Gif).unwrap();
        },
        _ => {
            filtered.write_to(&mut buf, ImageOutputFormat::Png).unwrap();
        },
    };

    // Write the encoded image to STDOUT.
    io::stdout().write_all(&buf).unwrap();
    io::stdout().flush().unwrap();
}

Use Rust's cargo tool to build the Rust program into WebAssembly bytecode or native code.

$ cd api/functions/image-grayscale/
$ cargo build --release --target wasm32-wasi 

Copy the build artifact to the api folder.

$ cp target/wasm32-wasi/release/grayscale.wasm ../../

Netlify runs api/pre.sh when setting up the serverless function. The script installs the WasmEdge runtime and then compiles the WebAssembly bytecode program into a native .so library for faster execution.
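For reference, here is a minimal sketch of what such a script can look like. It is not a copy of the repo's actual pre.sh: the install URL and paths are assumptions based on WasmEdge's standard install script, and wasmedgec is WasmEdge's AOT compiler.

#!/bin/bash
# Install the WasmEdge runtime (the installer defaults to $HOME/.wasmedge).
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash
# Copy the wasmedge CLI next to the function so hello.js can spawn it.
cp $HOME/.wasmedge/bin/wasmedge .
# AOT-compile the WebAssembly bytecode into a native shared library.
wasmedgec grayscale.wasm grayscale.so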

The api/hello.js script loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes it the uploaded image data via STDIN. Note that api/hello.js runs the compiled grayscale.so file generated by api/pre.sh for better performance.

const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Launch the WasmEdge runtime with the AOT-compiled grayscale program.
  const wasmedge = spawn(
      path.join(__dirname, 'wasmedge'),
      [path.join(__dirname, 'grayscale.so')]);

  // Collect the converted image bytes from the program's STDOUT.
  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  // When the program exits, return the image with the original MIME type.
  wasmedge.on('close', (code) => {
    let buf = Buffer.concat(d);
    res.setHeader('Content-Type', req.headers['image-type']);
    res.send(buf);
  });

  // Pipe the uploaded image into the program's STDIN.
  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}

And you're done! Deploy the repo to Netlify, and you get a Jamstack application with a high-performance serverless back end based on Rust and WebAssembly.
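You can also test the function without the browser UI by POSTing an image to it directly. Here is a hedged curl example; the site name and the /api/hello path are placeholders, so adjust them to match your own deployment:

# POST a PNG to the function and save the black-and-white result.
curl -X POST https://your-site.netlify.app/api/hello \
     -H 'image-type: image/png' \
     --data-binary '@input.png' \
     --output grayscale.png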

Example 2: AI inference

The second demo application lets the user upload an image and then calls the serverless function to identify the main objects in the image.

It is in the same GitHub repo as the previous example, but in the tensorflow branch. The back-end serverless function for image classification is in the api/functions/image-classification folder of that branch. The src/main.rs file contains the Rust program's source code. The Rust program reads image data from STDIN and then prints its text output to STDOUT. It uses the WasmEdge TensorFlow API to run the AI inference.

use std::io::{self, Read};

pub fn main() {
    // Step 1: Load the TFLite model and its label file.
    let model_data: &[u8] = include_bytes!("models/mobilenet_v1_1.0_224/mobilenet_v1_1.0_224_quant.tflite");
    let labels = include_str!("models/mobilenet_v1_1.0_224/labels_mobilenet_quant_v1_224.txt");

    // Step 2: Read the image from STDIN.
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Step 3: Resize the input image to the dimensions the model expects.
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(&buf, 224, 224);

    // Step 4: AI inference.
    let mut session = wasmedge_tensorflow_interface::Session::new(&model_data, wasmedge_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &flat_img, &[1, 224, 224, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Reshape_1");

    // Step 5: Find the label with the highest probability in res_vec.
    // ... (the elided code determines max_index, max_value, and confidence)
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
        label_lines.next();
    }

    // Step 6: Generate the output text.
    let class_name = label_lines.next().unwrap().to_string();
    if max_value > 50 {
        println!("It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture", confidence.to_string(), class_name, class_name);
    } else {
        println!("It does not appear to be any food item in the picture.");
    }
}

Use the cargo tool to build the Rust program into WebAssembly bytecode or native code.

$ cd api/functions/image-classification/
$ cargo build --release --target wasm32-wasi

Copy the build artifact to the api folder.

$ cp target/wasm32-wasi/release/classify.wasm ../../

Again, the api/pre.sh script installs the WasmEdge runtime and its TensorFlow dependencies in this application. It also compiles the classify.wasm bytecode program into a native shared library, classify.so.

The api/hello.js script loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Note that api/hello.js runs the compiled classify.so file generated by api/pre.sh for better performance.

const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Launch the TensorFlow-enabled WasmEdge runtime with the AOT-compiled
  // classifier. LD_LIBRARY_PATH lets it find the TensorFlow shared
  // libraries that pre.sh installed next to the function.
  const wasmedge = spawn(
    path.join(__dirname, 'wasmedge-tensorflow-lite'),
    [path.join(__dirname, 'classify.so')],
    {env: {'LD_LIBRARY_PATH': __dirname}}
  );

  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  wasmedge.on('close', (code) => {
    res.setHeader('Content-Type', `text/plain`);
    res.send(d.join(''));
  });

  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}

Now you can deploy your forked repo to Netlify and get a web application that does object recognition.
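As before, you can exercise the function directly with curl. Again, the site name and path are placeholders; note that the Rust program decodes the upload as a JPEG:

# POST a JPEG and read back the recognized object as a text/plain response.
curl -X POST https://your-site.netlify.app/api/hello \
     --data-binary '@food.jpg'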

What’s next?

Running WasmEdge from Netlify's current serverless container is an easy way to add high-performance functions to Netlify applications today. A better approach in the future is to use WasmEdge itself as the container, so that we can run serverless functions more efficiently, without Docker and Node.js. WasmEdge is already compatible with Docker tools. If you are interested in joining WasmEdge and CNCF for this exciting work, you are welcome to join our channel!