Install Paddle-Lite on the Raspberry Pi; install PaddleHub on an x86 computer and convert a PaddleHub pre-trained model to the Paddle-Lite format so it can run on the Raspberry Pi.

Build and install Paddle-Lite on the Raspberry Pi

sudo apt install patchelf cmake
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3 150
git clone https://github.com/PaddlePaddle/Paddle-Lite.git
cd Paddle-Lite
sudo ./lite/tools/build.sh \
    --build_extra=ON \
    --arm_os=armlinux \
    --arm_abi=armv7hf \
    --arm_lang=gcc \
    --build_python=ON \
    full_publish
cd build.lite.armlinux.armv7hf.gcc/inference_lite_lib.armlinux.armv7hf/python/install
sudo python3 setup.py install

For the parameters available when compiling from source, refer to: paddlepaddle.github.io/Paddle-Lite…
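
Once the wheel is installed, a quick import check (my own addition, not part of the original steps) confirms that the Python bindings actually work; MobileConfig and create_paddle_predictor are the same names used in the inference script later on.

# Minimal sanity check that the freshly built Paddle-Lite Python
# bindings can be imported on the Raspberry Pi.
from paddlelite.lite import MobileConfig, create_paddle_predictor

print("Paddle-Lite Python API is importable")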

Convert a PaddleHub pre-trained model to the Paddle-Lite format on an x86 computer

Install PaddleHub and PaddlePaddle

python -m pip install paddlehub
python -m pip install paddlepaddle
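
A quick version check (again my own addition) is worth doing here, because the Paddle tooling changes quickly and the docs often assume a specific release.

# Print the installed versions, useful when comparing against the docs.
import paddle
import paddlehub as hub

print("paddlepaddle:", paddle.__version__)
print("paddlehub:", getattr(hub, "__version__", "unknown"))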

Test the PaddleHub pre-trained model

import paddlehub as hub

senta = hub.Module(name="senta_gru")
test_text = ["This restaurant is delicious.", "This movie really sucks."]

results = senta.sentiment_classify(texts=test_text, use_gpu=False, batch_size=1)

for result in results:
    print(result['text'])
    print(result['sentiment_label'])
    print(result['sentiment_key'])
    print(result['positive_probs'])
    print(result['negative_probs'])

Download the OPT conversion tool

paddlepaddle.github.io/Paddle-Lite…
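
As an alternative to the standalone opt binary, newer Paddle-Lite pip packages also expose the optimizer as a Python Opt class. The sketch below is an assumption based on the official docs, not something from the original post, and the method names have changed between releases, so check the docs for the version you installed.

# Hypothetical sketch: running the model optimizer through the Python API
# instead of the prebuilt opt binary (method names vary across versions).
from paddlelite.lite import Opt

opt = Opt()
opt.set_model_dir("senta_gru/infer_model")      # same input dir as the opt command below
opt.set_valid_places("arm")                     # target the Raspberry Pi's ARM CPU
opt.set_model_type("naive_buffer")              # matches --optimize_out_type=naive_buffer
opt.set_optimize_out("saved_models/senta_gru")  # output path prefix
opt.run()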

Download the PaddleHub pre-trained model and convert it

Download the pre-trained model package:

hub download senta_gru

Unzip it and place it under saved_models/senta_gru.

Convert the model:

./opt --model_dir=senta_gru/infer_model --valid_targets=arm --optimize_out_type=naive_buffer --optimize_out=saved_models/senta_gru

PS: View the model operators supported by the opt conversion tool:

./opt --print_model_ops=true --valid_targets=arm --model_dir=senta_gru/infer_model

Test the converted pre-trained model on the Raspberry Pi

from paddlelite.lite import *

# (1) Set config
config = MobileConfig()
config.set_model_dir("model")

# (2) Create Predictor
predictor = create_paddle_predictor(config)
"'... ' ' '

And then it crashes with a segmentation fault…
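
For reference, the complete light-API flow I was aiming for looks roughly like the sketch below, which follows the official Paddle-Lite Python demos rather than anything in this post. The file name saved_models/senta_gru.nb, the input shape, and the dummy input are placeholder assumptions (a real senta_gru run needs word ids built from the model's own vocabulary), and set_model_from_file / from_numpy / numpy only exist in newer Paddle-Lite releases.

import numpy as np
from paddlelite.lite import MobileConfig, create_paddle_predictor

# (1) Set config; a naive_buffer model converted by opt is loaded from a
#     single file (the .nb path is an assumption based on --optimize_out)
config = MobileConfig()
config.set_model_from_file("saved_models/senta_gru.nb")

# (2) Create predictor
predictor = create_paddle_predictor(config)

# (3) Feed a dummy input; shape and values are placeholders only, a real
#     senta_gru run needs word ids produced with the model's vocabulary
input_tensor = predictor.get_input(0)
input_tensor.from_numpy(np.zeros((1, 10), dtype="int64"))

# (4) Run inference and read the output
predictor.run()
output_tensor = predictor.get_output(0)
print(output_tensor.numpy())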


Afterword

I then started looking for all kinds of possible causes, and even wondered whether something had been missed during the conversion…

I also searched for material repeatedly, but Paddle is not stable yet; even the demos on the official site are outdated, and some interfaces have changed or simply disappeared…

So in the end I had to give up for now.

PaddlePaddle and PaddleHub, as the core parts of Baidu's Paddle framework, are reasonably usable thanks to their fast iteration and decent completeness. But the other components seem to get lower priority, and their documentation can be out of sync, which causes a lot of confusion.

After all, I have already spent a whole day learning the recommendation system and finishing my existing situation-prediction project, so the Paddle-Lite project will have to wait until Baidu Paddle stabilizes.