TensorFlow Lite is a slimmed-down version of TensorFlow that lets machine learning models run efficiently on a range of mobile devices. Google has released a developer preview of TensorFlow Lite, a lightweight solution for mobile and embedded devices, now available on both iOS and Android. TensorFlow Lite is a new design built around three goals: lightweight, cross-platform, and fast.

TensorFlow Lite consists of the following components:

  1. TensorFlow Model: a trained TensorFlow model saved on disk.
  2. TensorFlow Lite Converter: a program that converts a trained model to the TensorFlow Lite file format.
  3. TensorFlow Lite Model File: a model file format based on FlatBuffers, optimized for speed and size. This is the file you deploy inside a mobile app.
  4. Java API: a convenience wrapper around the C++ API for Android apps.
  5. C++ API: loads the TensorFlow Lite Model File and invokes the Interpreter. The same library is available on both Android and iOS.
  6. Interpreter: executes the model using a set of operators. Operators can be loaded selectively: with no operators the Interpreter is only about 70 KB, and about 300 KB with all operators loaded. This is far smaller than TensorFlow Mobile, which requires roughly 1.5 MB with its full operator set.
  7. On Android devices, the Interpreter supports the Android Neural Networks API, which can be used for hardware acceleration. If no accelerator is available, it falls back to the CPU. Developers can also define custom kernels through the C++ API.
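The converter-to-interpreter flow described above can be sketched in Python. This is a minimal illustration, not the preview-era workflow: the API names used here (`tf.lite.TFLiteConverter`, `tf.lite.Interpreter`) come from later TensorFlow releases, while the developer preview shipped a separate converter tool; the tiny Keras model is a placeholder for any trained model.

```python
# Sketch: convert a trained model to the FlatBuffers-based TensorFlow Lite
# format, then load and run it with the Interpreter. Requires TensorFlow 2.x.
import numpy as np
import tensorflow as tf

# Placeholder for a real trained model: a tiny one-layer network.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(10),
])

# TensorFlow Lite Converter: serialize to the .tflite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Interpreter: load the TensorFlow Lite model and execute it.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one dummy input and read back the output tensor.
interpreter.set_tensor(input_details[0]["index"],
                       np.zeros((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)
```

On a device, the same `.tflite` file produced by the converter would instead be bundled into the app and loaded through the Java or C++ API.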

Models TensorFlow Lite currently supports, trained and optimized for mobile:

  1. MobileNet: a family of vision models able to recognize 1,000 different object classes, designed for efficient execution on mobile and embedded devices.
  2. Inception V3: an image recognition model similar in function to MobileNet; it provides higher accuracy but is relatively larger.
  3. Smart Reply: an on-device conversational model that responds to chat messages in real time. This feature is available on Android Wear.

Inception V3 and MobileNet have been trained on the ImageNet dataset. Using transfer learning, you can easily retrain them on your own image data.
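The transfer-learning idea mentioned above can be sketched as follows: reuse MobileNet as a frozen feature extractor and train only a new classifier head on your own classes. The class count (5) is a placeholder assumption, and `weights=None` is used here so the snippet runs offline; in practice you would pass `weights="imagenet"` to load the pretrained ImageNet features.

```python
# Sketch of transfer learning with MobileNet (placeholder settings).
import tensorflow as tf

# Pretrained feature extractor; set weights="imagenet" in real use
# (downloads the ImageNet-trained weights).
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3),
    include_top=False,   # drop the 1000-class ImageNet classifier
    weights=None,
    pooling="avg",
)
base.trainable = False   # freeze the pretrained features

# New classifier head for your own classes (5 is a placeholder).
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# model.fit(your_images, your_labels, epochs=...)  # train only the head
```

Because only the small head is trained, retraining converges quickly even on modest datasets, and the result can then be converted to the TensorFlow Lite format for deployment.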

As you know, TensorFlow already supports mobile and embedded deployment of models through the TensorFlow Mobile API. Going forward, TensorFlow Lite should be seen as the evolution of TensorFlow Mobile: as it matures, it will become the recommended solution for deploying models on mobile and embedded devices. TensorFlow Lite is currently available as a developer preview, and TensorFlow Mobile remains fully usable in the meantime. The scope of TensorFlow Lite is still growing and it is under active development. In this release, we intentionally started with a constrained platform to ensure that performance on the most important common models is not affected, and we plan to prioritize future expansion based on user needs. Our goal is to simplify the developer experience and enable models to be deployed across a range of mobile and embedded devices. We are glad that developers are already helping to move the TensorFlow Lite project forward, and we will support and grow the TensorFlow Lite community with the same passion as the TensorFlow project. Welcome to try TensorFlow Lite. Part of the reference material comes from chiayuan.org.tw/