What will be hot, and what should you learn, in 2022? This article is participating in the “Talk about 2022 Technology Trends” essay campaign.

Preface

According to the 2020-2021 China AI Computing Power Development Assessment Report, the growing demand for real-time services has made edge-side and end-side computing power increasingly important. IDC predicts that by 2023, nearly 20% of servers used for AI workloads will be deployed at the edge.

According to the 2021-2022 China AI Computing Power Development Assessment Report, AI chips can be deployed in cloud data centers, on the edge side, and on the end side. Cloud computing provides the infrastructure for artificial intelligence and is currently the main carrier for massive data processing and large-scale computing. Driven by new technologies such as 5G, which greatly speed up data transmission and processing, AI processing at the edge will become a key growth area for enterprises, both to share the computing burden of data centers and to improve real-time responsiveness. AI will find wide application on the edge side and the end side, from autonomous driving to industrial manufacturing to consumer smart homes and wearables, where AI chips need to strike a balance among power consumption, computing performance, form factor, and cost.

Xiaobao read the AI computing power development assessment reports of the past two years and found that, besides the core issue of computing power, the reports repeatedly mention the edge side and the end side. Xiaobao therefore believes that edge-side and end-side AI will be hot topics in artificial intelligence in 2022.

Xiaobao can’t help being curious: what exactly are the end side and the edge side, and what advantages do they have?

Edge computing

Cloud vs. edge computing

Cloud computing refers to breaking huge data-processing tasks down, through the network “cloud,” into countless small programs, which are processed and analyzed by a system of many servers, with the results returned to users. Cloud computing provides enormous computing capacity and massive data storage, and has greatly promoted the development of the Internet.

With the powerful computing capacity of cloud computing, many industries have achieved leapfrog development. In the Internet of Things industry, for example, both the number of connected devices and the data they generate are growing massively.

On the one hand, the traditional centralized cloud computing model can no longer handle the massive data generated by all kinds of access devices, which overwhelms the cloud and creates data bottlenecks. On the other hand, in the era of the Internet of Everything, application services need low latency, high reliability, and data security, and traditional cloud computing cannot meet those needs in terms of real-time response, privacy protection, and energy consumption.

To analyze the huge volumes of data from terminal devices closer to real time and to relieve the pressure on network transmission, AI computing power is gradually shifting to the edge. So what is edge computing?

What is edge computing

Edge computing means processing and analyzing data at edge nodes of the network. An edge node is any node with computing and network resources located between the data source and the cloud center. For example, a mobile phone can be the edge node between a person and the cloud center, and a gateway is the edge node between a smart home and the cloud center.

In general, edge computing moves more of the computation onto edge nodes and leaves less running in the cloud. Data can then be computed and analyzed close to where it is generated, minimizing the traffic between client and cloud and thereby reducing response time and the impact of network instability.
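To make this concrete, here is a minimal sketch in plain Python (with made-up sensor data, not any real edge framework): the edge node aggregates raw readings locally and forwards only a small summary, so almost nothing has to cross the network to the cloud.

```python
import json

def edge_summarize(readings):
    # Aggregate raw sensor readings locally on the edge node;
    # only this small summary is forwarded to the cloud.
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# 10,000 raw temperature samples collected near the data source
raw = [20.0 + (i % 50) * 0.1 for i in range(10_000)]

raw_bytes = len(json.dumps(raw))          # what a cloud-only design would upload
summary = edge_summarize(raw)
summary_bytes = len(json.dumps(summary))  # what the edge design uploads

print(summary_bytes, "bytes instead of", raw_bytes)
```

The cloud still receives enough information for global monitoring, while the raw data never leaves the edge.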

If edge computing still sounds abstract, here is an example. You have all heard of octopuses.

In case you don’t know much about octopuses: they have “conceptual thinking” and can solve complex problems on their own, which makes them the most intelligent invertebrates in nature. This has to do with their memory system. An octopus has two memory systems: one is the brain, with about 500 million neurons; the other is the neurons distributed in the suckers of its eight arms, which means the arms themselves can think and solve problems.

The octopus’s brain is like cloud computing, and its eight arms are like edge computing: each one is a tiny machine room. Cloud computing handles the global picture; edge computing handles the local one.

Because edge computing is closer to the data source, it has the following advantages:

  • Data localization: data does not need to be sent to the cloud, avoiding cloud storage and privacy problems;
  • Computation localization: part of the computation runs on edge nodes, relieving computing pressure on the cloud and avoiding cloud overload;
  • Low communication cost: less interaction with the cloud enables low-latency, strongly real-time scenarios and also saves communication costs;
  • Decentralized computation: fault isolation and deep personalization.

A few typical cases give a feel for these advantages:

  • In face recognition, edge computing can reduce response time from 900 ms to 169 ms;
  • By offloading some computing tasks from the cloud to the edge, the energy consumption of the whole system can be reduced by 30-40%;
  • Time spent on system data integration, migration, and the like can be cut to 1/20 of the original.

It’s important to note, however, that as good as edge computing is, it is not a “replacement” for cloud computing. (An octopus can think with its arms, but it can’t cut off its brain and live on the arms alone.) Where cloud computing focuses on global control, edge computing focuses on the local; in essence it complements and optimizes cloud computing. Being closer to the data source, edge computing can solve low-latency, near-real-time problems.

IDC predicts that by 2023, 20% of edge servers will be used to handle AI workloads, and 70% of enterprises will be running varying levels of data processing at the edge of the IoT.

The challenges of edge computing

Edge computing also presents many challenges:

  • Communication protocols are inconsistent between the edge computing center and the various data-acquisition devices;
  • Edge application scenarios are fragmented, making it hard to form large-scale scenarios;
  • Potential data loss: edge data is often not sent to cloud storage and is discarded once processed.

End intelligence

Cloud intelligence vs. end intelligence

Similar to the shift from cloud to edge computing, artificial intelligence is evolving from the cloud to the end side.

If we recall the development process of an artificial intelligence application, there are generally four steps:

  • Data acquisition and preprocessing
  • Model selection and training
  • Model evaluation
  • Model deployment and inference
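To make the four steps concrete, here is a toy end-to-end sketch in plain Python. The “model” is just a hypothetical one-feature threshold classifier, chosen so the whole pipeline fits in a few lines; a real application would use a neural network and far more data.

```python
# 1. Data acquisition and preprocessing: (feature, label) pairs
samples = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1), (0.2, 0), (0.8, 1)]
xs = [x for x, _ in samples]
ys = [y for _, y in samples]

# 2. Model selection and training: choose the threshold that
#    best separates the two classes on the training data
def train(xs, ys):
    best_t, best_acc = 0.0, 0.0
    for t in sorted(xs):
        acc = sum((x >= t) == bool(y) for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

threshold = train(xs, ys)

# 3. Model evaluation on held-out data
test_set = [(0.3, 0), (0.7, 1)]
accuracy = sum((x >= threshold) == bool(y) for x, y in test_set) / len(test_set)

# 4. Deployment and inference: the trained model becomes a
#    function that a service (cloud or device) can call
def predict(x):
    return int(x >= threshold)
```

It is step 4, deployment and inference, that the rest of this section is about: whether `predict` runs in the cloud or on the end side.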

Model inference is generally deployed in the cloud or on a server, with an API exposed to clients. This mode has many advantages: cloud servers have abundant storage for massive data, and their powerful computing capacity can handle complex model inference. But it shares the problems of cloud computing:

  • Response speed: data transmission and responses depend on the network, so stability and response speed cannot be guaranteed. For real-time or low-latency requirements there is almost no remedy.

    For example, if a self-driving car reaches an intersection just as the network fluctuates and the cloud cannot be reached, should it go left or right?

  • Data privacy: in 2019, the Ministry of Industry and Information Technology issued the Notice on the Special Rectification of Apps Infringing on Users’ Rights and Interests, a sign that China’s attention to personal data privacy is strengthening; it is becoming hard for apps to collect large amounts of data and send it to the cloud. A cloud without data is like a clever cook without rice.
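The self-driving example above is one reason real systems rarely rely on the cloud alone. A common pattern (sketched below with entirely hypothetical function names, not any real SDK) is to try the cloud model first and fall back to a small local model when the network is unavailable:

```python
def cloud_infer(image):
    # Stand-in for a remote inference API call; in a real system
    # this would be an HTTPS request to a model server. Here it
    # always raises, simulating a network fluctuation.
    raise TimeoutError("network fluctuation: cloud unreachable")

def local_infer(image):
    # Tiny on-device fallback "model" (a placeholder rule).
    return "turn_left" if sum(image) % 2 == 0 else "turn_right"

def decide(image):
    # Prefer the powerful cloud model, but never let the vehicle
    # block on an unreachable network.
    try:
        return cloud_infer(image)
    except TimeoutError:
        return local_infer(image)

print(decide([1, 2, 3]))  # cloud fails, so the local model answers
```

Running inference on the device in the first place, as described next, removes the dependency on the network entirely.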

What is end intelligence

In simple terms, end intelligence means completing the model inference process on the end (device) side.

Compared with cloud intelligence, end intelligence has the following significant advantages:

  • Low latency: inference runs on the device, with no network round trip to the cloud, which cuts response time. Low latency is critical for scenarios with strict real-time requirements.
  • Security: data does not need to be uploaded to the cloud, which better protects users’ private data.
  • Customization: training locally on a user’s habits and optimizing step by step enables a truly personalized experience for every user.
  • Resource savings: using on-device compute and storage saves a great deal of cloud computing and storage resources.

But end intelligence also has a fatal weakness: limited computing power. On-device compute and storage are far below those of cloud servers, so large-scale, high-intensity computation is out of reach. To squeeze out as much performance as possible, the engine must be adapted to the hardware platform with instruction-level optimization; at the same time, models must be compressed to cut their cost in time and space.
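One common compression technique is quantization: storing weights as 8-bit integers instead of 32-bit floats. Here is a minimal sketch of symmetric linear int8 quantization in plain Python (toy weights; real engines do this per layer with careful calibration):

```python
import struct

def quantize_int8(weights):
    # Map floats into [-127, 127] with a single linear scale.
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.03, 0.89, -0.44]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

fp32_bytes = len(weights) * struct.calcsize("f")  # 4 bytes per weight
int8_bytes = len(q)                               # 1 byte per weight
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(fp32_bytes, "->", int8_bytes, "bytes, max error", max_err)
```

The model shrinks to a quarter of its size at the cost of a small, bounded rounding error, which is exactly the time/space trade-off described above.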

Moreover, the end side holds only a single user’s data, so the data volume is small and algorithm optimization is difficult; devices are also unsuited to long-term mass storage, which further limits the available data.

End intelligence applications

In recent years, end intelligence has found more and more application scenarios: face recognition, gesture recognition, image search, interactive games, and so on. They fall roughly into two categories:

  1. Using AI to create new ways of interacting

    • Mobile Taobao AR beauty try-on: combining on-device AI face-detection capabilities with AR gives consumers a more realistic shopping experience.
    • Alipay’s “Scan for Fu” New Year campaign
    • Fliggy’s Double Eleven interactive game “Find It”
    • AR applications and games: beauty cameras, virtual makeup try-on
  2. Using new AI technologies for more personalized, more timely workflows

    • On-device rerank: by recognizing user intent in real time, the feeds generated by the server-side recommendation algorithm are re-ranked on the device, making content recommendation more accurate.
    • Personalization: the “Guess You Like” page in Mobile Taobao

End intelligence challenges

End intelligence is already broadly applied, but its development still faces several challenges.

  • Device fragmentation: end-side devices are severely fragmented, with a bewildering variety of operating systems and versions. Ensuring the model adapts to all kinds of devices and makes full use of hardware acceleration is one of the challenges of end intelligence.
  • Model and engine size: large models slow loading and consume a lot of memory at runtime; the inference engine must be bundled into the app, and an oversized engine bloats the app’s storage footprint.
  • Memory usage: heavy memory consumption at runtime degrades the user experience.

End intelligence trends

Through its research, Alibaba’s Taobao technology team has summarized several major trends:

On-device inference and training

On-device inference is further along than on-device training; on-device training capability still needs to catch up. In the 2020-2021 China AI Computing Power Development Assessment Report, IDC predicted that the market share of servers for inference workloads will surpass that of training in the near future and maintain that lead over the forecast period.

But that doesn’t mean on-device training isn’t important.

The post-Moore era and XPUs

In an interview in mid-December 2019, Google’s Jeff Dean noted that one thing that has been shown to be pretty effective is creating chips for the particular kinds of computation you want to do, rather than making them completely general purpose like a general-purpose CPU.

Moore’s Law has been faltering in recent years, and general-purpose CPU performance growth has leveled off, while artificial intelligence models demand ever more computing power. As a result, many companies have launched “XPUs” designed specifically for AI acceleration.

Future intelligent computing power will mainly come from a variety of XPUs. Adapting to these fragmented XPUs and making full use of hardware capabilities is the key battleground for future inference engines.

From mobile phones to AIoT devices

Over the next few years, global mobile phone shipments will no longer grow as sharply as before, but will plateau or even decline. AIoT products, by contrast, are diverse: in the security field alone there are fingerprint locks, surveillance cameras, drones, and more. Meanwhile, driven by the development of cloud computing, the Internet of Things is advancing by leaps and bounds, and the number of AIoT devices will keep growing.

Development of end intelligence frameworks

After several years of development of end intelligence, a number of mature end-side inference engines have emerged:

  • Google/TensorFlow Lite
  • Facebook/PyTorch
  • Tencent/NCNN
  • ByteDance/Pitaya
  • Baidu/Paddle Lite
  • Taobao/MNN
