After reading this tutorial, you will be able to independently build a Serverless application for image classification, such as this web page that recognizes food. You can also try out more TensorFlow functions on Tencent Cloud.

Artificial intelligence (AI) is changing our lives. But AI applications require much more than algorithms, data science, and big-data model training. It is estimated that 95% of AI computation in production environments is used for inference. The best place to run AI inference services is the public or edge cloud, because the cloud offers rich computing power, efficient and secure model management, and fast 5G network connections.

There are several ways to put an AI model into a production environment on a cloud such as Tencent Cloud.

  • You can start a virtual machine server and run the AI model using tools like TensorFlow Serving. This DIY approach requires deep operational knowledge of both artificial intelligence and operating systems, and is often quite expensive because you pay for idle resources.

  • You can also use a public cloud's AI SaaS service to upload your own model and then use a web UI or API to upload data for inference. It's easy, but not very flexible: you are limited to the models, configurations, and types of data pre-processing/post-processing that the SaaS supports.

But most developers who want to use AI inference in their own applications need both flexibility and ease of use; their needs fall somewhere between DIY and AI SaaS. This is why deploying AI models in a production environment can be challenging.

Most developers just want to write a few lines of code to load their own AI model and then prepare data inputs and outputs according to the model's requirements. This is where TensorFlow Serverless functions come in. They make it easy and fast to use an AI model in a web application. Now let's get started!

Second State TensorFlow functions on Tencent Cloud Serverless are written in very simple Rust syntax. If you're interested in learning Rust (it has been Stack Overflow's most loved programming language for the past four years), now's your chance. Keep reading and you can write and publish a Rust function that does AI inference in minutes!

Quick start

Make sure the Serverless Framework is installed, then clone or fork the repo below on GitHub or Gitee:

International access to GitHub: Github.com/second-stat…

Visit Gitee at gitee.com/secondstate…

From the root directory of the repo, run the sls deploy command to build and deploy the entire application.

$ sls deploy
...
website: https://sls-website-ap-hongkong-kfdilz-1302315972.cos-website.ap-hongkong.myqcloud.com
vendorMessage:

0700-tensorflow - SCF › "deploy" ran for 3 apps successfully

Load the deployed website URL in your browser, and you can use this function to identify the food in an uploaded image.

Next, we’ll show you how to change the source code so that you can create TensorFlow functions for your AI model.

Install the tools

Follow these simple instructions to install Rust and ssvmup.
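For reference, a typical install sequence looks like the commands below. Treat them as a sketch and prefer the instructions linked above: the first command is the standard rustup installer, while the ssvmup installer URL is an assumption based on the project's repository and may change.

$ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
$ curl https://raw.githubusercontent.com/second-state/ssvmup/master/installer/init.sh -sSf | sh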

The Serverless TensorFlow function

Our Serverless function is written in Rust and compiled to WebAssembly. The Rust function takes care of the heavy lifting of preparing the data and the model; both of these tasks depend heavily on how the function is actually used. It then calls an API to execute the TensorFlow model and analyzes the model's return values.

The following is an annotated version of the function's source code. The comments explain the seven steps the function performs. In steps #1 and #2, we load a MobileNet model trained on a food photo dataset. You can load your own retrained (or fine-tuned) MobileNet model file and its corresponding classification label file.

fn main() {
    // 1. Load the trained TensorFlow Lite model.
    let model_data: &[u8] = include_bytes!("lite-model_aiy_vision_classifier_food_V1_1.tflite");

    // 2. Load the classification label file corresponding to the model.
    //    Note: the model output is a series of numbers. The label file maps these numbers
    //    (i.e. line numbers) to the actual text description of the food category.
    let labels = include_str!("aiy_food_V1_labelmap.txt");

    // 3. The uploaded image arrives Base64 encoded, wrapped in a JSON object by the
    //    Tencent Cloud API gateway.
    let mut buffer = String::new();
    io::stdin().read_to_string(&mut buffer).expect("Error reading from STDIN");
    let obj: FaasInput = serde_json::from_str(&buffer).unwrap();
    let img_buf = base64::decode_config(&(obj.body), base64::STANDARD).unwrap();

    // 4. Load the uploaded image and resize it to 192x192, the input size required by
    //    this MobileNet model.
    let flat_img = ssvm_tensorflow_interface::load_jpg_image_to_rgb8(&img_buf, 192, 192);

    // 5. Run the model with the image as the input tensor and get the model's output tensor.
    // 5.1 Initialize the session and specify the model type as TensorFlow Lite.
    let mut session = ssvm_tensorflow_interface::Session::new(&model_data, ssvm_tensorflow_interface::ModelType::TensorFlowLite);
    // 5.2 Specify the model's input tensor name, data, and shape, as well as the output
    //     tensor name. Multiple input and output tensors are supported. Run the model.
    session.add_input("input", &flat_img, &[1, 192, 192, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Softmax");

    // 6. Each number in the res_vec vector is the probability for the corresponding label
    //    line in the label file.
    // 6.1 Find the highest probability ...
    // 6.2 Translate the probability into text ...
    // 6.3 Find the corresponding label text ...

    // 7. Return the text label and the probability to the function caller via STDOUT.
    let class_name = label_lines.next().unwrap().to_string();
    println!("The uploaded image {} <a href='https://www.google.com/search?q={}'>{}</a>", confidence.to_string(), class_name, class_name);
}
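The listing above omits the crate imports and the FaasInput type that the gateway request is deserialized into. Below is a minimal sketch of what those declarations could look like; the field name and derive are assumptions, so check the template repo for the authoritative definitions.

// Assumed imports and request type for the function above (a sketch, not the repo's exact code).
use std::io::{self, Read};
use serde::Deserialize;

// The Tencent Cloud API gateway wraps the request in a JSON object;
// the Base64-encoded image is assumed to arrive in the "body" field.
#[derive(Deserialize)]
struct FaasInput {
    body: String,
}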

Steps #3 and #4 prepare the input image data. Step #5 calls the API to execute the TensorFlow model and processes the returned tensor, converting it into an array of probabilities. In steps #6 and #7, the Serverless function finds the label that matches the image from the probability array and outputs the result.
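The bodies of steps 6.1 through 6.3 are elided in the listing above. The following is a minimal sketch of one way to implement them (an assumption for illustration, not the repo's exact code): scan the output vector for the highest score, turn that score into a confidence phrase, and walk the label file to the matching line.

// Sketch of steps 6.1-6.3 (assumed implementation).
// 6.1 Find the index with the highest probability in the output tensor.
let mut max_index: usize = 0;
let mut max_value: u8 = 0;
for (i, v) in res_vec.iter().enumerate() {
    if *v > max_value {
        max_value = *v;
        max_index = i;
    }
}
// 6.2 Translate the quantized score (0-255) into a rough confidence phrase.
let confidence = if max_value > 200 {
    "is very likely to be"
} else if max_value > 125 {
    "is likely to be"
} else {
    "could be"
};
// 6.3 Walk the label file to the matching line; label_lines is then used in step #7.
let mut label_lines = labels.lines();
for _ in 0..max_index {
    label_lines.next();
}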

Web UI

The developer template for this tutorial includes a static web page that shows how to call the Serverless function from JavaScript. The web page uploads an image file using JavaScript and AJAX. The image data is Base64 encoded before being submitted to Tencent Cloud's API gateway. The AJAX response is the output of the Serverless function: the classification label and confidence level that MobileNet inferred from the image.

function callServerlessFunction() {
  var reader = new FileReader();
  reader.readAsDataURL(document.querySelector('#select_file').files[0]);
  reader.onloadend = function () {
    $.ajax({
      url: window.env.API_URL,
      type: "post",
      data: reader.result.split("base64,")[1],
      dataType: "text",
      success: function (data) {
        document.querySelector('#msg').innerHTML = data;
      },
      error: function (jqXHR, exception) {
        document.querySelector('#msg').innerHTML = 'The server is busy.';
      }
    });
  };
  return false;
}
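If you want to test the deployed function without the web page, you could also send the Base64 string from the command line. The example below is a sketch under the assumption that the API gateway accepts the raw Base64 text as the POST body, exactly as the JavaScript above sends it; food.jpg and the URL are placeholders, and base64 -w 0 is the GNU coreutils flag for unwrapped output.

$ base64 -w 0 food.jpg | curl -X POST --data-binary @- https://<your-api-gateway-url>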

In our example, the Web UI is completely separated from the Serverless function on the back end. In fact, we could distribute this static web page through a CDN, decentralized storage, or even as a local file, and it would still work. This design pattern is called Jamstack.

Deployment

At this point, you have seen how to update the Serverless function to use your own TensorFlow model and how to update the UI in index.html to reflect the new model's functionality. It's time to deploy your application.

First, use the ssvmup tool to compile the Rust function.

$ ssvmup build --enable-aot

Then copy the resulting scf.so file to the template directory.

$ cp pkg/scf.so scf/

Finally, use the Serverless Framework to deploy the entire application, including the Serverless function and the Web UI. You can edit the .env file to specify the region in which your application will be deployed.
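As a purely hypothetical example, the region setting in .env might look like the line below; the actual variable names come from the template's own .env file, so treat this as an assumption rather than the definitive format.

# Hypothetical .env sketch -- check the template's .env for the real variable names.
REGION=ap-hongkong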

$ sls deploy

Follow the on-screen instructions to log in to Tencent Cloud and grant permission. At the end, you will get a URL for your Web UI. Open this URL and try out your AI Serverless app!