“What will be hot and what should you learn in 2022?” This article is participating in the “Talk about 2022 Technology Trends” essay campaign.

Preface

AI is one of the strategic technologies leading social and economic development, and it sits at the core of many industries. With the rapid development of artificial intelligence, earth-shaking changes are taking place every day. In this article, I put forward six insights on the development of artificial intelligence in 2022 and the years beyond. Let's embrace the great era of artificial intelligence together.

Massive models are the general trend of AI development

Though no immortal banished from the Toad Hall, one still fears the biting cold of the icy palace; peeping past the curtain as the golden crow sinks low, how much splendor gathers in this small space.

The poem above is not the work of an ancient Chinese master, but of “Source 1.0”, Inspur's newly released model and the world's largest artificial intelligence language model. With 245.7 billion parameters, Source 1.0 was trained on a 5 TB high-quality Chinese dataset. As a language model, it performs well on tasks such as reading comprehension, reasoning, and logical judgment, especially in Chinese.
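To make "language model" concrete: a language model assigns probabilities to the next token given the preceding context, which is what lets it complete and continue text. The toy bigram model below (pure Python, obviously nothing like Source 1.0's actual architecture or scale; the corpus is made up) illustrates the core idea of next-word prediction:

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count word-to-next-word frequencies to build a toy bigram model."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(model, word):
    """Return the most frequent next word after `word`, or None if unseen."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

corpus = [
    "the model reads text",
    "the model writes text",
]
model = train_bigram(corpus)
next_word = predict_next(model, "the")  # "model"
```

Large models like Source 1.0 replace these raw counts with billions of learned parameters, but the underlying prediction task is the same.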

Why build such massive models? As AI application scenarios multiply across industries, AI models are becoming increasingly diverse and complex. Small AI models can handle practical applications in many fields, but their versatility is poor and their accuracy is limited: when the application scenario changes, a small model may no longer apply.

“How AI can develop human-like cognitive abilities such as logic, consciousness, and reasoning is a direction AI research has long been exploring. Currently, training very large models with huge numbers of parameters on large datasets is considered a very promising path toward general AI.” Academician Wang Endong believes that with the rise of large models, large scale has become a very important trend in the future development of artificial intelligence.

Massive models will be the basis for standardized innovation. Over the past decade, AI model parameter counts have grown from tens of millions to hundreds of billions. In 2020, OpenAI released GPT-3, a deep learning model with 175 billion parameters, officially bringing language models into the hundred-billion-parameter era. In 2021, several more super-large models appeared worldwide. In English, the MT-NLG model launched by Microsoft and Nvidia has 530 billion parameters. In Chinese, there are Inspur's Source 1.0 with 245.7 billion parameters and Baidu and PCL's ERNIE 3.0 Titan with 260 billion parameters. There are even 1.6-trillion-parameter models like Switch Transformer (which uses a Mixture of Experts, MoE, rather than a single monolithic model).
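A back-of-the-envelope calculation shows why these parameter counts strain hardware. Assuming fp16 storage (2 bytes per parameter) and counting only the weights, not activations, gradients, or optimizer state:

```python
def model_memory_gb(n_params, bytes_per_param=2):
    """Approximate memory needed just to hold the weights, in GB."""
    return n_params * bytes_per_param / 1e9

# GPT-3's 175 billion parameters need about 350 GB in fp16,
# far beyond any single accelerator's memory.
gpt3_gb = model_memory_gb(175e9)      # 350.0
source_gb = model_memory_gb(245.7e9)  # ~491.4
```

This is why hundred-billion-parameter models must be sharded across many GPUs even for inference, let alone training.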

Judging from the model research and development of 2021, the massive-model trend is just beginning, and 2022 may even see trillion-parameter monolithic models. At the same time, optimizing the resource usage of massive models will become a new research direction; expect future models with more parameters that nevertheless require fewer GPUs.

For AI development, once the desired generalization ability is achieved, the smaller the model, the more widely and flexibly it can be adapted to application scenarios. On this path, however, one must first train large models on massive amounts of valuable data until they reach optimal results, and then train small models from them to achieve flexible and broad adaptation.
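The large-to-small path sketched above is commonly realized with knowledge distillation: a small student model is trained to match the softened output distribution of a large teacher. A minimal sketch of the soft-label loss (pure Python; the logits and temperature are illustrative values, not from any real model):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature softens the targets."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student against the teacher's soft distribution."""
    teacher = softmax(teacher_logits, temperature)
    student = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher, student))

# The loss is lowest when the student reproduces the teacher's distribution,
# so minimizing it transfers the teacher's knowledge to the smaller model.
loss = distillation_loss([4.0, 1.0, 0.5], [3.5, 1.2, 0.4])
```

In practice this soft-label term is combined with the ordinary hard-label loss, but the sketch captures the mechanism by which a small model inherits a large model's behavior.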

AI keeps penetrating toward the edge

Edge computing refers to processing and analyzing data at edge nodes of the network. An edge node is any node with computing and network resources located between the data source and the cloud center. For example, a mobile phone is an edge node between a person and the cloud center, while a gateway is an edge node between a smart home and the cloud center.

With the rapid development of Internet of Things technology and growing demand for real-time services, edge-side and device-side computing power is becoming more and more important. In the industrial Internet, for example, requirements for accurate, real-time data collection keep rising, and the volume of collected data keeps growing. To analyze data in real time, process massive terminal data, and relieve the pressure on cloud network transmission, AI computing power will continue to penetrate toward the edge, whether the light edge closer to device-side data or the heavy edge closer to the core data center; 2022 will bring golden development opportunities for both.

IDC predicts that by 2023, nearly 20% of processors will be deployed at the edge to handle AI workloads, and 70% of enterprises will run some level of data processing at the IoT edge.

Device-side AI will also see a golden opportunity. On-device intelligence offers high real-time performance, low latency, and strong privacy. It has developed rapidly in recent years and is widely used in face recognition, gesture recognition, image search, interactive games, and so on. Although on-device intelligence still faces serious constraints, IDC predicts that the market share of servers for inference workloads will surpass that for training in the near future and remain ahead over the forecast period. And as major companies keep introducing higher-performance XPUs, on-device intelligence will be less and less limited by computing power.
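One standard way to fit models into the tight memory and compute budgets of on-device inference is post-training quantization. The sketch below shows symmetric int8 weight quantization in pure Python (illustrative only; real toolchains also calibrate activations and use per-channel scales):

```python
def quantize_int8(weights):
    """Symmetric quantization: map floats onto integers in [-127, 127]."""
    # Assumes at least one non-zero weight.
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.81, -0.42, 0.05, -1.27]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)  # close to the originals at 1/4 the storage
```

Each weight now fits in one byte instead of four, and the rounding error is bounded by half the scale, which is why int8 inference typically costs little accuracy while cutting memory and bandwidth sharply.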

AI drives the diversification of chips

In the field of artificial intelligence, algorithms, data, and computing power are the three key elements, and computing power is inseparable from chips. In recent years, as AI application scenarios have become richer and richer, AI chips have shown a trend of diversified development, providing a steady stream of power for the next generation of computing through continuous architectural evolution.

  • From the demand side: with the rapid development of smart cities, intelligent manufacturing, intelligent finance, autonomous driving, and other fields, application scenarios for speech recognition, computer vision, and natural language processing are increasingly widespread, and enterprise demand for AI chips keeps growing.
  • From the supply side: the differentiated use of AI across industries and scenarios also gives rise to AI chips with differentiated characteristics. The wide application of AI chips and the continuous enrichment of common applications have brought excellent development opportunities to manufacturers specializing in AI chips, whose products are becoming more segmented and diversified. For example, Cambricon, Horizon Robotics, and others have entered the chip industry and are accelerating chip research and development.

Changes on both the demand and supply sides have promoted diversified innovation and development in the AI chip industry and its technology:

Chip types

In terms of technical architecture, AI chips can be roughly divided into two types:

  • Algorithm acceleration chips: based on common chip architectures, these add acceleration units for AI algorithms, for example using a CPU, GPU, ASIC, or DSP to accelerate existing AI algorithms.
  • Adaptive smart chips: these chips are more flexible, able to adjust and change themselves to adapt to new workloads, and even possess some self-learning ability, for example neuromorphic chips and software-defined reconfigurable chips.

One approach is to mimic the workings of biological neurons more closely than mainstream artificial neural networks do, an approach known as “neuromorphic” computing.
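A minimal illustration of the neuromorphic idea is the leaky integrate-and-fire (LIF) neuron: instead of multiplying dense activation vectors, it accumulates input current in a leaky membrane potential and emits a discrete spike when a threshold is crossed (the leak and threshold values below are illustrative):

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire: integrate input, decay, spike at threshold."""
    v = 0.0            # membrane potential
    spikes = []
    for current in inputs:
        v = v * leak + current   # leak a little, then integrate the input
        if v >= threshold:
            spikes.append(1)     # fire a spike
            v = 0.0              # reset after firing
        else:
            spikes.append(0)
    return spikes

# A weak, steady input needs several steps to accumulate before firing.
spikes = lif_neuron([0.4, 0.4, 0.4, 0.4])  # [0, 0, 1, 0]
```

Because neurons are silent most of the time and communicate in sparse binary spikes, hardware built around this model can be very power-efficient, which is one reason neuromorphic chips target low-power workloads.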

The AI industry is changing rapidly, and chip manufacturers need to keep developing and upgrading new chip products to meet this challenge: primarily GPUs, but also FPGAs, ASICs, and NPUs.

IDC research found that in the first half of 2021, among AI chips in China, the GPU was still the first choice for data center acceleration, with more than 90% market share, while non-GPU chips such as ASICs, FPGAs, and NPUs were increasingly adopted across industries, with a combined market share approaching 10%, expected to exceed 20% by 2025.

Neuromorphic chips offer low power consumption, low latency, and high processing speed, and their industrialization and commercialization are still evolving. Advances in machine learning and in-depth brain research will bring more possibilities for their further development.

Deployment location

AI chips can be deployed in the cloud, at the edge, and on terminals. The cloud is the center of data and large-scale computing power, the carrier of massive data processing and large-scale computation; cloud AI chips need high storage capacity, high floating-point throughput, and high scalability. To offload cloud computing pressure and improve the real-time responsiveness of application scenarios, AI chips will in the future be deployed at the edge at large scale and in a distributed fashion, which requires chips adaptable to a variety of complex scenarios.

Industry development

With the wide application of artificial intelligence, various kinds of AI chips are in demand.

  • Computing power requirements: AI needs to run operations on unstructured data, so AI chips must provide powerful computing power to handle a variety of scenarios efficiently.
  • Efficient heat dissipation: as computing power improves, efficient heat dissipation becomes more and more important. High computing power with low energy consumption is the trend in AI chip development.
  • Algorithm flexibility: new algorithms beyond deep learning will emerge in the future, so AI chips must adapt not only to deep learning algorithms but also to different ones.

Accelerated integration of AI and the cloud

AI and cloud computing are a natural match, and the combination of AI and cloud computing is the general trend.

Leading companies at home and abroad have begun to lay out cloud intelligence. Baidu Cloud and Alibaba Cloud were upgraded to Baidu Intelligent Cloud and Alibaba Intelligent Cloud respectively. JD's Cloud and AI division officially unified the three brands of JD Cloud, JD Artificial Intelligence, and JD Internet of Things under the single brand "JD Zhilian Cloud". Meanwhile, Amazon AWS, Google Cloud, and Microsoft Azure, while not adding "intelligent" to their names, have all made AI a strategic priority.

  1. AI makes cloud computing stronger; the most direct way is by improving computing power.

With the rapid industrialization of AI and the fast development of new technologies such as IoT, 5G, and edge computing, the volume of data is increasing rapidly and data structures are increasingly complex; applications' demand for computing power is growing exponentially, and computation itself is becoming more and more complex. The integration of AI and cloud computing has spawned many new architectural approaches:

  • Edge computing: shaping the new cloud-edge computing architecture
  • Multimodal computing that integrates visual, speech, semantic, and other AI capabilities
  • Real-time AI applications with strict latency requirements, such as autonomous driving.
  2. Cloud computing makes AI stronger. Cloud computing can supply AI's three key elements: it collects data, provides large-scale cluster computing capacity, and has accumulated a large number of customers and application scenarios, offering AI great convenience.

The integration of AI and cloud is an inevitable trend. AI public cloud services let enterprises deploy AI applications efficiently, acquire AI capabilities on the cloud easily, and access and use artificial intelligence technologies effectively. With the rapid development of AI technology, AI public cloud services can iterate rapidly in their early stages at lower cost.

Industry leaders are also starting to deploy private clouds to support their emerging business applications, including artificial intelligence. The trend toward hybrid IT architectures with public, private, and traditional data centers is having a significant impact on enterprise technology and business innovation.

Artificial intelligence scenarios blossom everywhere

With the development of AI technology, artificial intelligence has gradually reached deployment and even large-scale application in production and business environments, helping all walks of life develop in a more intelligent, green, comprehensive, and diversified direction.

According to IDC's 2021 research on the adoption of artificial intelligence technology, computer vision is the most widely applied technology, followed by enterprise applications such as video surveillance, image recognition, smart cameras, and face recognition. Companies are expected to deepen their use of technologies such as speech recognition and natural language processing in the future. The figure below shows the share of AI scenarios already deployed versus those planned for deployment within three years.

From the perspective of industry applications, AI application scenarios have shifted from fragmented, single-point applications to deep integration and diversified scenarios. AI is now widely applied in finance, energy, manufacturing, transportation, and other fields. Specifically:

  • In finance, enterprises invest heavily in computing power with wide coverage, and applications such as anti-fraud, risk assessment, and intelligent recommendation have reached the mature stage.
  • Manufacturing scenarios such as intelligent supply chains, intelligent communications, and intelligent equipment operation and maintenance are developing rapidly.
  • Limited by development time, computing power, models, technology, and capital, application scenarios such as visual perception and intelligent oilfields are still at an early stage of development, with broad prospects ahead.
  • In line with the goals of low energy consumption, low emissions, recyclability, and sustainable development, artificial intelligence can be applied to green energy and environmental and ecological governance.

In addition, IDC comprehensively evaluated industries on their AI investment, the maturity of their application scenarios, and the maturity of their data platforms. In 2021, the top five industries by AI application penetration were Internet, finance, government, telecommunications, and manufacturing; transportation, healthcare, energy, and education ranked sixth to ninth. The chart below shows AI industry penetration in China in 2021.

Clearly, the implementation of AI technology brings more value to industry: it not only improves enterprises' operational and production efficiency but also boosts their capacity to innovate. In 2022, artificial intelligence will see broader and more comprehensive application, blossoming everywhere and promoting social progress in an all-round way.

With favorable policies, AI soars like the kunpeng

As one of the strategic technologies, the AI industry can effectively promote global economic recovery and enterprise innovation during the epidemic and the new normal. Artificial intelligence draws great attention from countries worldwide, which are accelerating their national strategic layouts for AI. Driven by favorable policies, AI develops like the mythical kunpeng bird rising ninety thousand li, deepening its application in every field.

From 2017 to 2021, China repeatedly issued policies encouraging AI development across the whole industrial chain, from basic theoretical research to industrial application. The 14th Five-Year Plan outline lists new-generation artificial intelligence as one of seven new frontier fields to pursue, encouraging breakthroughs in cutting-edge basic AI theory, the research and development of dedicated chips, and the construction of open-source algorithm platforms such as deep learning frameworks; promoting innovation in learning, reasoning, and decision-making, in images and graphics, and in speech, video, and natural language processing; and accelerating the integration of AI with digital information technologies such as big data, the Internet of Things, and edge computing to drive industrial upgrading and an overall improvement in productivity.

In August 2021, the Science and Technology Directorate (S&T) of the U.S. Department of Homeland Security (DHS) released its Artificial Intelligence and Machine Learning Strategic Plan. The plan sets out three goals: promote the use of next-generation AI and machine learning technologies within DHS, increase research and development investment, and use these technologies to build secure network infrastructure; facilitate the deployment of existing, proven AI and machine learning capabilities in DHS missions; and build and nurture an interdisciplinary AI/ML workforce.

The European Commission's 2020 White Paper on AI sets out a clear vision for AI in Europe: an ecosystem of excellence and trust. In April 2021, the Commission proposed a draft regulation aimed at strengthening oversight of AI technology. The draft seeks to define a list of "high-risk" AI application scenarios, with new standards and targeted regulation for the development and use of AI in critical infrastructure, university admissions, loan applications, and other areas identified as high-risk.

According to the Organisation for Economic Co-operation and Development, more than 60 countries and regions worldwide have made AI development a policy priority, formulating and publishing national AI strategies. They attach great importance to the ecological development of the AI industry, strengthen research capacity and related supporting industries, promote enterprise competitiveness and innovation, and actively explore AI development paths that match their own needs and advantages.

In 2022, building on these favorable national policies, AI is sure to see brilliant development.

This article draws on the 2021-2022 Assessment Report on the Development of China's Artificial Intelligence Computing Power.