This article walks through an end-to-end application of MindSpore, an all-scenario AI framework, via a simple hands-on case.

Environment requirements

This walkthrough uses a Windows environment on a budget-brand laptop from around 2016, in the hope of completing this hands-on exercise.

  • Windows 10 (64-bit)
  • CMake >= 3.18.3
  • MinGW GCC >= 7.3.0
  • Git 2.30.0

Note: On-device (end-side) training is not currently supported on Windows. To try it, follow the official MindSpore documentation: Build MindSpore Lite: Build for Linux.

Install CMake

CMake can be downloaded from its official website; it supports Windows, Linux, macOS and other platforms, and can be installed from source, as a portable ("green") package, or in other ways. I went with the portable zip. Even at my miserable connection speed, the zip file of less than 35 MB downloads in a few minutes. After unpacking, the executable .exe files are in the bin directory, so we need to add that bin directory to the PATH environment variable.

Once the environment variable is set correctly, press Win + R and type cmd to open a command-line window. The cmake command (and the other bundled tools) will now work, which indicates that CMake has been installed successfully.

Install MinGW GCC

If the download is slow and a download accelerator such as Thunder (Xunlei) does not help, you can try downloading over your phone's connection. After downloading, unpack the archive, find the bin directory and add it to the environment variables; then open a new command-line window and type gcc to verify that the installation succeeded.

I will not cover installing Git; as far as I remember, I last installed it back in 2019. The version requirement is probably not that strict: mine is still 2.24 and it does not seem to cause any problems.

Get the source code & compile

Since I have already cloned the MindSpore source code, that step is not demonstrated here. If you are downloading the MindSpore source for the first time, you can execute:

git clone https://gitee.com/mindspore/mindspore.git

To stay in sync with the MindSpore repository, I run git pull to fetch the latest code. I am not using Git Bash or PowerShell here, mainly because they do not support the call command, and I have not dug into that further.

Execute the compile command to start compiling:

call build.bat lite

Of course, you can adjust the number of threads to suit your machine (the default is 6), for example: call build.bat lite 8

As described in the documentation, Git is used to fetch some dependencies hosted on GitHub, so we may run into problems such as timeouts. These can be worked around in rather crude ways: retrying, switching to a phone hotspot, or going through a proxy. All we really need in the end are two files:

  • mindspore-lite-{version}-converter-win-x64.zip: contains the model conversion tool Converter.

  • mindspore-lite-{version}-inference-win-x64.zip: contains the model inference framework (Runtime) and the benchmark tool.

Here {version} is the version number of the output, which matches the version of the branch code that was compiled. Of course, we could also simply download the official toolkit directly: www.mindspore.cn/tutorial/li… .

If you just want to use the off-the-shelf tools, you can refer to my article "Copy Siege Lion: 1-Hour Introduction to AI Development Engineer".

Then comes the long wait. An hour passed, and the log output had turned a faint green, with a little pink mixed in, like the lovers' slope on a university campus in spring: green grass dotted with pink flowers.

In the end I simply went to bed… When I got up in the morning, the zip packages and their MD5 files had been produced just as I wanted, so I immediately tried to see whether I could convert a model.

Model conversion

We could use the official model conversion toolkit directly, as in my earlier article "Copy Siege Lion: 1-Hour Introduction to AI Development Engineer", which relies on the officially provided Converter. This time, to verify that our self-compiled MindSpore Lite toolkit works properly, we unpack mindspore-lite-1.0-converter-win-x64.zip into a local directory to obtain the Converter tool. In addition, we also need a .mindir file. For how to obtain a .mindir file online, please refer to "5-Minute Online Experience with MindSpore Layer IR – MindIR". Here we simply download and use the officially provided model mobilenetv2.mindir, so, with paths adjusted to my own setup, I execute:

call converter_lite --fmk=MINDIR --modelFile=../../mobilenetv2.mindir --outputFile=mobilenetv2

It simply prints CONVERT RESULT SUCCESS:0 and generates the desired .ms file that can be used on the device side.

Using MindSpore Lite inference in an Android project

As a matter of fact, I was lucky enough to learn about using MindSpore Lite inference in Android projects during the "MindSpore 21-Day Hands-On Camp" in 2020. If you are interested, you can check out the object-detection Android app based on MindSpore Lite. MindSpore Lite currently provides C++ and Java runtimes for inference. The official MindSpore documentation gives detailed, practical steps: read model — create session — graph compile — input data — graph execute — obtain output — release memory. The general process is shown as follows:

Next, let's walk through the whole inference exercise using the code of an actual Android project. If you choose the C++ runtime, the wrapper classes holding the main model-loading and prediction logic go into the cpp directory, and you also need to implement a JNI layer that calls the C++ API through the device-side library produced by the build. The Java runtime is used here. To use the Java runtime, we need to generate the AAR package in a Linux environment; it can also be downloaded from the official website. To build your own AAR package:

bash build.sh -A java

The result is the mindspore-lite-{version}.aar package.

  • Referencing the AAR package in the Android project

    Place the generated mindspore-lite-{version}.aar in the project's libs directory, then declare the dependency in build.gradle:

    repositories {
      flatDir {
          dirs 'libs'
      }
    }
    
    dependencies {
        implementation fileTree(dir: "libs", include: ['*.aar', '*.jar'])
    }
  • On-device inference with the model

    • Read the model

      When doing model inference with MindSpore Lite, the first step is to read the .ms model file.

      import android.util.Log;
      import com.mindspore.lite.Model;

      private Model model;

      // ...
      // Read the .ms model file.
      model = new Model();
      if (!model.loadModel(mContext, "some_model.ms")) {
          Log.e("MS_LITE", "failed to loadModel!");
          return false;
      }
    • Create a session (this and the remaining steps are shown together in the sketch after this list)

    • Compile the graph in the session

    • Fill in the input data

    • Run the graph (inference)

    • Read the output

    • Free the session
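To make these remaining steps concrete, here is a minimal sketch of the rest of the pipeline, written as if it continues inside the same method as the model-loading code above (so model and mContext are already available). It assumes the MindSpore Lite 1.x Java API names (MSConfig, LiteSession, MSTensor and their init/compileGraph/runGraph methods) and a hypothetical inputBytes buffer holding preprocessed input data; treat it as an illustration to check against the Java API reference of the version you actually built, not as the exact code of the project.

import com.mindspore.lite.LiteSession;
import com.mindspore.lite.MSTensor;
import com.mindspore.lite.config.CpuBindMode;
import com.mindspore.lite.config.DeviceType;
import com.mindspore.lite.config.MSConfig;

// Sketch only: class and method names assumed from the MindSpore Lite 1.x Java API.
// 1. Create a session configured for CPU inference with 2 threads.
MSConfig config = new MSConfig();
if (!config.init(DeviceType.DT_CPU, 2, CpuBindMode.MID_CPU)) {
    return false;
}
LiteSession session = new LiteSession();
if (!session.init(config)) {
    return false;
}
config.free();

// 2. Compile the graph of the model loaded in the "Read the model" step.
if (!session.compileGraph(model)) {
    return false;
}
model.freeBuffer(); // the raw model buffer is no longer needed after compilation

// 3. Fill the input tensor. inputBytes is a hypothetical byte[] holding
//    preprocessed data whose size matches the input tensor.
MSTensor input = session.getInputs().get(0);
input.setData(inputBytes);

// 4. Run the graph (inference).
if (!session.runGraph()) {
    return false;
}

// 5. Read the outputs by tensor name.
for (String name : session.getOutputTensorNames()) {
    MSTensor output = session.getOutputByTensorName(name);
    float[] result = output.getFloatData();
    // ... post-process result ...
}

// 6. Release native memory when everything is done.
session.free();
model.free();

Whatever the exact method names are in your build, the order of the calls is what matters: configure, create the session, compile the graph, set the inputs, run the graph, read the outputs, and free the native resources at the end.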
