Background

For a long time, accurately turning a designer’s design draft into UI code has required front-end engineers to spend a great deal of time and effort. Interface revisions during development also consume a lot of energy from both designers and engineers, eating heavily into development time.

Alibaba Group’s Xianyu team has recently made a breakthrough based on machine-vision image understanding: we can let machines directly understand the images produced by designers and then generate the final code through a code translation engine.

Main content

Here’s a demo video. Let’s take a look:

“UI2CODE”: Let the machine understand the image elements

The framework, called UI2CODE, can directly convert a screenshot from any phone into code that runs on Android, iOS, and the Web. The core of UI2CODE is understanding the content of images through machine vision and deep learning.

By separating foreground from background, UI2CODE identifies the elements in the visual draft and determines what each element is: text, an image, an outline (shape), and so on. Beyond classifying elements, it also extracts their properties. For outlines, these include corner radius, stroke width, stroke color, and background color; for text, they include line height, line width, font color, font size, font format, per-line height, weight, font decoration, alignment, and text type; image attributes are comparatively simple, covering containment and extension properties.
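
To make this concrete, the sketch below shows one plausible way to represent the extracted element descriptions as data structures. The class and field names are illustrative assumptions, not UI2CODE’s actual output format.

```python
# Illustrative only: a plausible shape for the per-element attributes that the
# vision stage could emit. Class and field names are assumptions, not
# UI2CODE's real schema.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class OutlineElement:
    corner_radius: float
    stroke_width: float
    stroke_color: str        # e.g. "#FF5000"
    background_color: str

@dataclass
class TextElement:
    content: str
    font_size: float
    font_color: str
    font_weight: str         # e.g. "regular", "bold"
    line_height: float
    alignment: str           # e.g. "left", "center"

@dataclass
class ImageElement:
    region: Tuple[int, int, int, int]  # (x, y, width, height) crop in the screenshot
    contained: bool                    # whether it sits inside another element
```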

UI2CODE, developed by the Alibaba Xianyu technology team, uses an improved version of a traditional convolutional neural network that can simultaneously retain and extract image features at many different scales.
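
As a rough illustration of the multi-scale idea (and only that; this is not the team’s improved network), the following sketch builds a small convolutional extractor in PyTorch that returns feature maps at several resolutions, so later stages can use both fine detail and broader context.

```python
# A minimal multi-scale feature extractor, assuming PyTorch. It simply keeps
# the intermediate feature maps at 1/2, 1/4 and 1/8 resolution instead of
# discarding them, which is the basic idea behind multi-scale features.
import torch
import torch.nn as nn

class MultiScaleFeatures(nn.Module):
    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.block1 = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2))   # 1/2 resolution
        self.block2 = nn.Sequential(
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2))   # 1/4 resolution
        self.block3 = nn.Sequential(
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2))   # 1/8 resolution

    def forward(self, x):
        f1 = self.block1(x)
        f2 = self.block2(f1)
        f3 = self.block3(f2)
        return f1, f2, f3      # fine, medium and coarse features

# Example: a 3-channel 256x256 screenshot crop.
features = MultiScaleFeatures()(torch.randn(1, 3, 256, 256))
```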

Arguably, as the UI2CODE framework continues to evolve, this innovation will at least let some people work on cooler things: it frees developers to focus on the actual functionality they’re building rather than spending a lot of effort hand-crafting the user interface.

“UI2CODE”: Let the machine understand the layout of the image

UI2CODE understands not only image elements, but also image layout.

If elements are the “micro” part of an image, layout is its “macro” part. UI2CODE determines which business components the image consists of, the row and column layout of each element, and even which repeated business components it contains.
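
One way to picture this “macro” view is as a nested tree of rows and columns whose leaves are either raw elements or recognized business components. The sketch below is a hypothetical representation for illustration, not UI2CODE’s internal model.

```python
# Hypothetical layout tree: rows and columns nest, and leaves are either
# recognized business components or raw elements from the vision stage.
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class BusinessComponent:
    name: str                     # assumed component id, e.g. "price_tag"

@dataclass
class Row:
    children: List["Node"] = field(default_factory=list)

@dataclass
class Column:
    children: List["Node"] = field(default_factory=list)

# 'object' stands in here for the raw element types produced by the vision stage.
Node = Union[Row, Column, BusinessComponent, object]
```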

The team uses machine vision to slice images and machine learning to decide whether each slice belongs to a known business component. If a slice does, the corresponding image region is replaced with the original business component module, which saves a lot of the development time spent building business components.
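
A hypothetical outline of this “slice, then recognize” step is sketched below; the classifier interface, the confidence threshold, and the component library are assumptions used only to illustrate the flow.

```python
# Illustrative flow: ask a classifier whether each image slice matches a known
# business component, and reuse the existing component when the match is
# confident enough. classifier.predict is an assumed (label, score) API.
def reuse_known_components(slices, classifier, component_library, threshold=0.9):
    result = []
    for region in slices:
        label, score = classifier.predict(region)
        if score >= threshold and label in component_library:
            # Confident match: substitute the existing, hand-written component
            # module instead of regenerating its UI from pixels.
            result.append(component_library[label])
        else:
            # No known component: fall back to element-level reconstruction.
            result.append(region)
    return result
```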

In addition, Xianyu continuously trains on existing samples through deep learning, learning their internal row-layout rules from the samples, so that the inferred image layout becomes more and more accurate.

Finally, combining image elements with image layout makes the restored code much closer to the actual application and highly usable.
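
To show how element and layout information could come together at the end, here is a toy emitter that walks a nested row/column structure and prints Flutter-style widget code. The tree shape and widget choices are assumptions for illustration; the real code translation engine is far more involved.

```python
# Toy code emitter: walks a nested row/column layout (plain dicts here) and
# prints Flutter-style widget code. Purely illustrative.
def emit(node, indent=0):
    pad = "  " * indent
    kind = node["kind"]
    if kind in ("row", "column"):
        widget = "Row" if kind == "row" else "Column"
        lines = [f"{pad}{widget}(children: ["]
        for child in node["children"]:
            lines.append(emit(child, indent + 1) + ",")
        lines.append(f"{pad}])")
        return "\n".join(lines)
    if kind == "text":
        return f'{pad}Text("{node["content"]}")'
    if kind == "image":
        return f'{pad}Image.asset("{node["src"]}")'
    return f"{pad}// unrecognized element"

layout = {"kind": "column", "children": [
    {"kind": "text", "content": "Hello Xianyu"},
    {"kind": "row", "children": [{"kind": "image", "src": "item.png"}]},
]}
print(emit(layout))
```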

In the future, we hope to improve the system so that it can understand the image itself even more accurately, and we will combine this technology with Flutter development practices to provide even more powerful capabilities.

We will be announcing this technology at Google Developer Day 2018 Shanghai.

Contact us

If you have any questions or corrections about the content of the text, please let us know.

The Xianyu (Idle Fish) technology team is a small but capable engineering team. We focus not only on solving business problems effectively, but also on advancing the cutting-edge practice of computer vision technology on mobile by breaking down the divisions between technology stacks (unifying the programming model and language across Android/iOS/HTML5/Server). As a software engineer on the Xianyu technology team, you have the opportunity to demonstrate all your talent and courage in the evolution of the entire product and in solving user problems, and to prove that technical development is a life-changing force.

Resume: [email protected]

Scan the QR code to keep cutting-edge technology within your grasp.