By Andre Susano Pinto, Technical Lead, TensorFlow Hub, and Clemens Mewald, Product Manager

Source | TensorFlow official WeChat account

In a previous article, we announced TensorFlow Hub, a platform for publishing, discovering, and reusing machine learning modules in TensorFlow. A key part of the platform is its web experience, which lets developers discover modules for their use cases. Today, we are launching a new web experience for TensorFlow Hub that makes modules easier to search and discover, while laying the foundation for a multi-publisher platform.

Exploring and discovering modules

TensorFlow Hub is a platform for sharing reusable machine learning modules, and we want to give researchers and developers a convenient way to share their work with the wider community. One successful example is the Universal Sentence Encoder module, which has accelerated the path from basic machine learning research to applications in the wider developer community. The paper describing the encoder references the module's tfhub.dev URL. When you paste that URL into a browser, a module detail page appears with documentation shared by the publisher and a link to a Colab notebook, so you can try out the module directly. The Universal Sentence Encoder has become one of the most popular modules on TF Hub.
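For instance, the module can be loaded directly from its tfhub.dev URL and used to embed sentences. The snippet below is a minimal sketch using the TF 1.x hub.Module API; the module version in the URL is an assumption for illustration.

import tensorflow as tf
import tensorflow_hub as hub

# Load the Universal Sentence Encoder by its tfhub.dev URL
# (module version "2" assumed here for illustration).
embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/2")
embeddings = embed(["The quick brown fox jumps over the lazy dog."])

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    print(sess.run(embeddings))  # one 512-dimensional vector per sentence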

Search and filter

Needless to say, you can search and filter modules on TF Hub. How applicable a text module is to your problem depends on the data it was trained on. For example, it is easy to search for text embeddings, filter by language (language: Spanish), and find NNLM modules trained on Spanish data.
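Once you have found such a module, using it looks like any other text embedding. The sketch below assumes the Spanish NNLM module handle tfhub.dev/google/nnlm-es-dim128/1; check the module's page on tfhub.dev for the exact handle and version.

import tensorflow as tf
import tensorflow_hub as hub

# Spanish NNLM text-embedding module (handle/version assumed for illustration).
embed = hub.Module("https://tfhub.dev/google/nnlm-es-dim128/1")
embeddings = embed(["Hola mundo", "El gato duerme"])

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    print(sess.run(embeddings))  # one 128-dimensional vector per sentence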

Object detection made simple

We are constantly expanding the TensorFlow Hub inventory with new modules developed by the Google Brain team. One recent addition is a FasterRCNN module trained on Open Images V4. The module can be loaded with a single line of code and used to perform object detection:

detector = hub.Module("tfhub.dev/google/fast…")

Alongside this module, we are publishing a Colab notebook that lets you load the module and examine its output. The examples below show images from unsplash.com and the objects detected in them.

The Colab notebook guides you through downloading and applying the module, all in a matter of minutes. Note: Colab notebook link: colab.research.google.com/github/tens…
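As a rough sketch of what the notebook does: the detector is run on a batch of decoded images and returns boxes, class entities, and scores. The full module handle is truncated above, and the output keys below follow the Open Images detection modules, so check the module documentation for the exact signature.

import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

# The module handle is truncated in the text above; paste the full
# FasterRCNN / Open Images V4 handle from tfhub.dev here.
MODULE_HANDLE = "https://tfhub.dev/google/fast…"

detector = hub.Module(MODULE_HANDLE)

# The detector expects a batch of float32 images in [0, 1], shape [1, H, W, 3].
image = tf.placeholder(tf.float32, shape=[1, None, None, 3])
result = detector(image, as_dict=True)

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    # A random image just to show the call; use a real photo in practice.
    dummy = np.random.rand(1, 480, 640, 3).astype(np.float32)
    outputs = sess.run(result, feed_dict={image: dummy})
    print(outputs["detection_class_entities"][:5], outputs["detection_scores"][:5])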

Other recent additions to the TensorFlow Hub include:

The winners of the 2017 iNaturalist Kaggle challenge published a paper describing their approach and released their model on TensorFlow Hub, demonstrating the advantages of transfer learning.

Jeremiah Harmsen from the TensorFlow Hub team posted a Kaggle example demonstrating how TensorFlow Hub's pre-trained modules can be used to solve a sentiment analysis challenge on Kaggle. Note: model link: alpha.tfhub.dev/google/inat… Kaggle example link: www.kaggle.com/jeremiahhar…
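A sketch of the pattern used in that kind of example: TF Hub text-embedding modules plug into tf.estimator feature columns via hub.text_embedding_column, so a sentiment classifier can be assembled in a few lines. The module handle, feature key, and input function below are assumptions, not taken from the notebook.

import tensorflow as tf
import tensorflow_hub as hub

# Wrap a pre-trained text-embedding module as a feature column
# (module handle assumed for illustration).
embedded_text = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-en-dim128/1")

estimator = tf.estimator.DNNClassifier(
    hidden_units=[128, 32],
    feature_columns=[embedded_text],
    n_classes=2)

# train_input_fn should yield ({"sentence": batch_of_strings}, labels).
# estimator.train(input_fn=train_input_fn, steps=1000)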

TensorFlow Hub for product teams

In addition to the modules published on tfhub.dev, the TensorFlow Hub library lets you publish modules to private storage for internal use. This way, team members can share modules and benefit from each other's work.

Instead of referencing a module by its tfhub.dev URL, you can use a file system path:

m = hub.Module("/tmp/text-embedding")
embeddings = m(sentences)

To create these custom embedding modules, follow our "creating a module" tutorial. Note: "creating a module" tutorial link: www.tensorflow.org/hub/creatin…
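As a very rough sketch of what that tutorial covers (the names and computation below are illustrative, not taken from the tutorial): a module is defined by a module_fn that declares its inputs, computation, and signature, and is then exported to a path that your team's storage can serve.

import tensorflow as tf
import tensorflow_hub as hub

def module_fn():
    # Declare the module's input and computation; a real module would
    # compute a learned embedding, a hash bucket is used as a stand-in.
    text = tf.placeholder(dtype=tf.string, shape=[None])
    embeddings = tf.cast(tf.string_to_hash_bucket_fast(text, 1 << 20), tf.float32)
    embeddings = tf.expand_dims(embeddings, -1)
    hub.add_signature(inputs=text, outputs=embeddings)

spec = hub.create_module_spec(module_fn)

with tf.Graph().as_default():
    m = hub.Module(spec)
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        m.export("/tmp/text-embedding", sess)  # matches the path used above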

Getting started

Check out tfhub.dev to try our new web experience, and visit www.tensorflow.org/hub/ to keep up to date with the … API documentation. If you encounter any bugs, you can file an issue on GitHub. To stay in touch, you can star the GitHub project. Note: file issues on GitHub at github.com/tensorflow/… GitHub project link: github.com/tensorflow/…

Thanks to Bo Fu, Andrew Gasparovic, Jiaqi Guo, Jeremiah Harmsen, Joshua Horowitz, Zicheng Huo, Elizabeth Kemp, Noe Lutz, Till Pieper, Graham Smith, Sijie Wang, and Sitong Zhou.