• Google’s ML Kit offers Easy Machine Learning APIs for Android and iOS
  • By RON AMADEO
  • The Nuggets Translation Project
  • Permanent link to this article: github.com/xitu/gold-m…
  • Translator: ALVINYEH
  • Proofreader: kezhenxu94

Ordinary people can also add machine learning capabilities to their applications with simple API calls.

MOUNTAIN VIEW, Calif. — Google is rolling out a new machine learning SDK called “ML Kit” for its Firebase development platform. The new SDK provides off-the-shelf APIs for some of the most common computer vision use cases, allowing developers who are not machine learning experts to add some machine learning magic to their applications. This is not just an Android SDK; it also works with iOS apps.

Generally speaking, setting up a machine learning environment is a tough job. You have to learn how to use a machine learning library like TensorFlow, gather a lot of training data to teach your neural network what to do, and in the end you need it to output a model that is lightweight enough to run on a mobile device. ML Kit simplifies this process by exposing some of the machine learning features on Google’s Firebase platform as simple API calls.

The new APIs support text recognition, face detection, barcode scanning, image labeling, and landmark recognition. Each API comes in two versions: a cloud-based version that offers greater accuracy in exchange for sending some data to Google, and an on-device version that works even when offline. For photos, the on-device API can identify that an image contains a dog, while the more accurate cloud-based API can determine the specific breed. The on-device APIs are free, while the cloud-based APIs are priced like the usual Firebase cloud APIs.
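The two-tier structure described above can be pictured with a small self-contained sketch. Note that `Labeler`, `OnDeviceLabeler`, and `CloudLabeler` are illustrative names invented for this example, not ML Kit classes, and the hardcoded labels merely stand in for real model output:

```kotlin
// Illustrative sketch of ML Kit's two tiers; these are NOT ML Kit classes.
interface Labeler {
    fun label(imageBytes: ByteArray): String
}

class OnDeviceLabeler : Labeler {
    // Coarse, free, and works offline (e.g. "dog").
    override fun label(imageBytes: ByteArray) = "dog"
}

class CloudLabeler : Labeler {
    // Finer-grained, metered, and needs a network round trip (e.g. the breed).
    override fun label(imageBytes: ByteArray) = "golden retriever"
}

// Use the more accurate cloud tier only when the device is online;
// otherwise fall back to the always-available on-device model.
fun pickLabeler(isOnline: Boolean): Labeler =
    if (isOnline) CloudLabeler() else OnDeviceLabeler()
```

An app could call `pickLabeler(...)` per request, so the same screen keeps working offline with coarser results.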

If a developer does use a cloud-based API, the data does not stay on Google’s cloud: once processing is complete, the data is deleted.

In the future, Google will add an API for Smart Reply. This machine learning feature, which first appeared in Google Inbox, scans an email and generates a few short replies that you can send with a single tap. The feature will arrive first as a preliminary preview, and its computation will always run locally on the device. A “high-density face contour” capability is also coming soon to the face detection API, which is a good fit for augmented reality apps that paste virtual objects onto your face.

  • YouTube video link: https://youtu.be/ejrn_JHksws

ML Kit also provides an option to decouple machine learning models from the application and host the models in the cloud. Since these models can be “tens of megabytes in size,” offloading them to the cloud should, according to Google, speed up app installation. Models are downloaded at first run, so they work offline afterward, and the application will download any future model updates automatically.
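The download-once, cache-thereafter behavior described above can be sketched as a tiny cache. Here `ModelStore` and its download callback are hypothetical helpers for illustration, not part of the Firebase SDK:

```kotlin
// Illustrative cache: fetch a model on first use, then serve the cached
// copy so later runs work offline. Not a Firebase API.
class ModelStore(private val download: (String) -> ByteArray) {
    private val cache = mutableMapOf<String, ByteArray>()
    var downloads = 0
        private set

    fun get(name: String): ByteArray =
        cache.getOrPut(name) {
            downloads++
            download(name) // the network fetch happens only once per model
        }
}
```

Because the model lives outside the APK, the install stays small and an updated model can ship without an app update.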

The sheer size of these machine learning models is a problem Google hopes to address in the future with cloud-based model compression. Google’s plan is to eventually take a full TensorFlow model and deliver a compressed TensorFlow Lite model with similar accuracy.

ML Kit also works well with other Firebase features, such as Remote Config, which allows A/B testing of machine learning models on a per-user basis. Firebase can also switch or update models dynamically without updating the application.
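The per-user split behind such an A/B test could look like the following sketch. `modelVariantFor` is an invented helper, not a Remote Config API; it only shows how a stable user id can map each user consistently to one model variant:

```kotlin
// Hypothetical per-user bucketing: hash a stable user id into a bucket
// so each user always sees the same model variant during the experiment.
fun modelVariantFor(userId: String, variants: List<String>): String {
    val bucket = Math.floorMod(userId.hashCode(), variants.size)
    return variants[bucket]
}
```

Consistent assignment matters: if a user bounced between variants across sessions, per-variant quality metrics would be meaningless.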

Developers who want to try ML Kit can find it on the Firebase Console.


The Nuggets Translation Project is a community that translates quality technical articles from around the internet and shares them on Juejin (Nuggets). The content covers Android, iOS, front end, back end, blockchain, product, design, artificial intelligence, and other fields. If you want to see more high-quality translations, please follow the Nuggets Translation Project and its official Weibo and Zhihu column.