YoMo is a programming framework that helps developers build geo-distributed cloud systems. Its communication layer is built on the QUIC protocol, which brings high-speed data transmission, and its built-in Streaming Serverless “stream function” greatly improves the experience of developing distributed cloud systems. Systems built with YoMo provide an ultra-high-speed communication mechanism between near-field computing power and terminals, with a wide range of application scenarios in the Metaverse, VR/AR, IoT, and other fields.

YoMo is written in Go, and its Streaming Serverless part uses Golang plugins and shared libraries to dynamically load user code. This approach, however, imposes limitations on developers, especially those on Windows. Coupled with the Serverless architecture’s hard requirement for isolation, this makes WebAssembly an excellent choice for running user-defined functions.

For example, in real-time AI inference for AR/VR or smart factories, a camera can send real-time unstructured data through YoMo to a computing node in a near-field MEC (multi-access edge computing) device, which automatically executes the hosted AI inference function. When inference is complete, YoMo sends the result back to the terminal device in real time.

The challenge for YoMo, however, is to merge and manage handler functions written by multiple outside developers in edge computing nodes. This requires runtime isolation of these functions without sacrificing performance. Traditional software container solutions, such as Docker, are not up to the task because they are too heavy and too slow to handle real-time tasks.

WebAssembly provides a lightweight, high-performance software container. It is well suited as a runtime for YoMo data processing handler functions.

In this article, we’ll show you how to create a Rust function for TensorFlow-based image recognition, compile it to WebAssembly, and then run it as a stream data handler in YoMo. We use WasmEdge as the WebAssembly runtime because it offers the best performance and greatest flexibility compared with other WebAssembly runtimes, and it is the only WebAssembly virtual machine with stable TensorFlow support. YoMo manages WasmEdge VM instances and the WebAssembly bytecode applications inside them through WasmEdge’s Golang API.

GitHub source: github.com/yomorun/yom…

The preparatory work

Obviously, you need Golang. We assume you have it installed already.

Our example requires Go 1.15 or newer to run.

In addition, you need to install the YoMo CLI application. It arranges and coordinates data flows and handler function calls.

$ go install github.com/yomorun/cli/yomo@latest
$ yomo version
YoMo CLI version: v0.0.5

Next, install WasmEdge and the Tensorflow shared library. WasmEdge is the leading WebAssembly Runtime hosted by CNCF. We’ll use it to embed and run WebAssembly programs from YoMo.

# Install WasmEdge
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge.sh
$ chmod +x ./install_wasmedge.sh
$ sudo ./install_wasmedge.sh /usr/local

# Install WasmEdge Tensorflow extension
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_tensorflow_deps.sh
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_tensorflow.sh
$ chmod +x ./install_wasmedge_tensorflow_deps.sh
$ chmod +x ./install_wasmedge_tensorflow.sh
$ sudo ./install_wasmedge_tensorflow_deps.sh /usr/local
$ sudo ./install_wasmedge_tensorflow.sh /usr/local

# Install WasmEdge Images extension
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_image_deps.sh
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_image.sh
$ chmod +x ./install_wasmedge_image_deps.sh
$ chmod +x ./install_wasmedge_image.sh
$ sudo ./install_wasmedge_image_deps.sh /usr/local
$ sudo ./install_wasmedge_image.sh /usr/local

Finally, because our demo WebAssembly functions are written in Rust, you’ll also need to install the Rust compiler and the rustwasmc toolchain.

For the rest of the demo, fork and clone the source code repo.

$ git clone https://github.com/yomorun/yomo-wasmedge-tensorflow.git

Image classification function

The image recognition function that processes YoMo’s image stream is written in Rust. It uses the WasmEdge TensorFlow API to process input images.

#[wasm_bindgen]
pub fn infer(image_data: &[u8]) -> String {
    // Load the TFLite model and its meta data (the text label for each recognized object number)
    let model_data: &[u8] = include_bytes!("lite-model_aiy_vision_classifier_food_V1_1.tflite");
    let labels = include_str!("aiy_food_V1_labelmap.txt");

    // Pre-process the image into the format this model expects
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(image_data, 192, 192);

    // Run the TFLite model using the WasmEdge Tensorflow API
    let mut session = wasmedge_tensorflow_interface::Session::new(
        &model_data,
        wasmedge_tensorflow_interface::ModelType::TensorFlowLite,
    );
    session
        .add_input("input", &flat_img, &[1, 192, 192, 3])
        .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Softmax");

    // Find the object index in res_vec that has the greatest probability
    // Translate the probability into a confidence level
    // Translate the object index into a label (food_name) from the model meta data
    let ret_str = format!(
        "It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture",
        confidence, food_name, food_name
    );
    return ret_str;
}
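The comments in the function elide the post-processing steps. As a rough sketch of what they could look like (the `top_label` helper below is hypothetical, not part of the repo, and the label-file format and confidence thresholds are assumptions for illustration), the top index of the softmax output can be mapped to a label and a confidence phrase:

```rust
// Hypothetical post-processing sketch: find the index with the highest softmax
// score and map the raw u8 score to a confidence phrase. `labels` stands in
// for the contents of aiy_food_V1_labelmap.txt (one label per line).
fn top_label<'a>(res_vec: &[u8], labels: &'a str) -> (&'a str, &'static str) {
    let (max_index, max_value) = res_vec
        .iter()
        .enumerate()
        .max_by_key(|&(_, v)| *v)
        .unwrap_or((0, &0u8));
    // Map the raw byte score (0-255) to a rough confidence wording
    let confidence = if *max_value > 200 {
        "is very likely"
    } else if *max_value > 125 {
        "is likely"
    } else {
        "could be"
    };
    let food_name = labels.lines().nth(max_index).unwrap_or("unknown");
    (food_name, confidence)
}

fn main() {
    let labels = "background\nhot dog\npizza";
    let scores = [3u8, 230, 10];
    let (name, conf) = top_label(&scores, labels);
    // prints: It is very likely a hot dog in the picture
    println!("It {} a {} in the picture", conf, name);
}
```

The thresholds here are arbitrary; the real function would tune them against the model’s score distribution.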

You can compile this function to WebAssembly bytecode using the Rustwasmc tool.

Here, we require Rust compiler version 1.50 or earlier for WebAssembly functions to be used with WasmEdge’s Golang API. We will catch up with the latest Rust compiler version once the Interface Type specification is finalized and supported.

$ rustup default 1.50.0
$ cd flow/rust_mobilenet_food
$ rustwasmc build --enable-ext
# The output WASM will be pkg/rust_mobilenet_food_lib_bg.wasm.

# Copy the wasm bytecode file to the flow/ directory
$ cp pkg/rust_mobilenet_food_lib_bg.wasm ../

Integrated with YoMo

On the YoMo side, we use the WasmEdge Golang API to start and run the WasmEdge virtual machine for the image recognition function. The app.go file in the source code project looks like this:

package main

...

var (
    vm      *wasmedge.VM
    vmConf  *wasmedge.Configure
    counter uint64
)

func main() {
    // Initialize WasmEdge's VM
    initVM()
    defer vm.Delete()
    defer vmConf.Delete()

    // Connect to the Zipper service
    cli, err := client.NewServerless("image-recognition").Connect("localhost", 9000)
    if err != nil {
        log.Print("❌ Connect to zipper failure: ", err)
        return
    }
    defer cli.Close()

    cli.Pipe(Handler)
}

// Handler processes the data in the stream
func Handler(rxStream rx.RxStream) rx.RxStream {
    stream := rxStream.
        Subscribe(ImageDataKey).
        OnObserve(decode).
        Encode(0x11)
    return stream
}

// decode decodes the packet and performs image recognition
var decode = func(v []byte) (interface{}, error) {
    // get the image binary
    p, _, _, err := y3.DecodePrimitivePacket(v)
    if err != nil {
        return nil, err
    }
    img := p.ToBytes()

    // recognize the image
    res, err := vm.ExecuteBindgen("infer", wasmedge.Bindgen_return_array, img)
    ...
    return hash, nil
}

...

// initVM initializes WasmEdge's VM
func initVM() {
    wasmedge.SetLogErrorLevel()
    vmConf = wasmedge.NewConfigure(wasmedge.WASI)
    vm = wasmedge.NewVMWithConfig(vmConf)

    var wasi = vm.GetImportObject(wasmedge.WASI)
    wasi.InitWasi(
        os.Args[1:],     /// The args
        os.Environ(),    /// The envs
        []string{".:."}, /// The mapping directories
        []string{},      /// The preopens will be empty
    )

    /// Register the WasmEdge-tensorflow and WasmEdge-image extensions
    var tfobj = wasmedge.NewTensorflowImportObject()
    var tfliteobj = wasmedge.NewTensorflowLiteImportObject()
    vm.RegisterImport(tfobj)
    vm.RegisterImport(tfliteobj)
    var imgobj = wasmedge.NewImageImportObject()
    vm.RegisterImport(imgobj)

    /// Instantiate the wasm file
    vm.LoadWasmFile("rust_mobilenet_food_lib_bg.wasm")
    vm.Validate()
    vm.Instantiate()
}

Run

Finally, let’s start YoMo and watch the entire data processing pipeline in action. Start the YoMo CLI application from the project folder. The YAML file defines the port YoMo should listen on and the workflow handler to trigger for incoming data. Note that the flow name image-recognition matches the data handler in app.go mentioned above.
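For reference, a minimal zipper/workflow.yaml could take roughly this shape. This is an illustrative sketch based on the v0.0.5-era CLI; the field names are assumptions, so consult the file in the repo for the authoritative version:

```yaml
# Hypothetical sketch of zipper/workflow.yaml -- check the repo for the real file
name: Service
host: localhost
port: 9000
flows:
  - name: image-recognition   # must match the stream name used in app.go
```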

$ yomo serve -c ./zipper/workflow.yaml

Start the handler by running the app.go program mentioned above.

$ cd flow
$ go run --tags "tensorflow image" app.go

Now start the simulated data source by sending data to YoMo. A video is a series of image frames, and the WasmEdge function in app.go is invoked for every image frame in the video.

# Download a video file
$ wget -P source 'https://github.com/yomorun/yomo-wasmedge-tensorflow/releases/download/v0.1.0/hot-dog.mp4'

# Stream the video to YoMo
$ go run ./source/main.go ./source/hot-dog.mp4

You can see the output of the WasmEdge handler function in the console. It prints the name of the object it detects in each image frame of the video.

Looking to the future

This article discussed how to use the WasmEdge Tensorflow API and Golang SDK in the YoMo framework to process image streams in near real time.

In collaboration with YoMo, we will soon deploy WasmEdge in actual production in smart factories for a variety of assembly-line tasks. WasmEdge is the software runtime for edge computing!