Introduction: Starting from cloud computing, this article introduces the concept of Serverless, its typical application scenarios, and some representative products.

I. From cloud computing to Serverless

Since the world's first general-purpose computer, ENIAC (pictured left), was born, the development of computer science and technology has never stopped moving forward. From 2003 to 2006, Google published three classic papers (pictured right) on GFS (a distributed file system), MapReduce (parallel computing), and BigTable (a distributed database), the systems that HDFS and HBase were later modeled on. These papers pointed out the technical foundations and future opportunities of large-scale computing and laid a foundation for the development direction of cloud computing.

From ENIAC to Google's three classic papers, computer science and technology has kept advancing, and in the era of cloud computing that progress has only accelerated.

Cloud computing concepts

For the definition of cloud computing, academia and industry have different understandings. Let’s review the development history of cloud computing:

1. In 2006, Google's CEO first proposed the concept of cloud computing at the Search Engine Strategies conference. In the same year, Amazon began selling its elastic computing capacity as a cloud service, marking the formal birth of cloud computing as a new business model.

2. In 2008, Microsoft announced its cloud computing platform, Windows Azure, an attempt to move technology and services to a hosted, online model.

3. In 2009, the Berkeley white paper on cloud computing was published, which clearly stated a definition: cloud computing includes both the application services delivered over the Internet and the hardware and software in data centers that provide those services.

After giving this clear definition, Berkeley presented a vision for cloud computing and identified ten obstacles it faces, such as service availability, data loss, and data confidentiality and auditability.

II. The concept of Serverless

Serverless definition

Serverless does not mean that applications no longer rely on resources such as servers. It means that developers no longer have to worry about servers and can focus on production code, while computing resources are consumed as services rather than as the concept of servers. Serverless is a complete process for building and managing a microservices-based architecture, one that lets users manage their applications at the service deployment level rather than the server deployment level. It differs from traditional architectures in that the infrastructure is fully managed by a third party: functions are triggered by events and run in stateless, ephemeral compute containers (which may exist only for the duration of a single invocation). With Serverless, no additional infrastructure is needed to deploy an application; services can essentially be built, deployed, and started automatically.
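To make the "event-triggered, stateless" part of this definition concrete, below is a minimal sketch of a FaaS-style function in Python. The handler(event, context) signature follows the convention common to platforms such as AWS Lambda and Alibaba Cloud Function Compute, and the event payload shown here is hypothetical.

```python
import json

# A minimal, stateless FaaS-style handler. The platform creates the compute
# container, delivers the triggering event, and may destroy the container
# after the invocation, so no state is kept between calls.
def handler(event, context):
    payload = json.loads(event)           # the trigger delivers the event as JSON (assumed)
    name = payload.get("name", "world")   # no server-side state is read or written
    return json.dumps({"message": f"Hello, {name}!"})
```

Because instances are transient, any state that must survive an invocation has to live in external services such as object storage or a database.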

Serverless architecture

On the left is the architecture of a traditional Web application, which is composed of client, server, database and other elements.

In the past, when building such projects, developers had to do a lot of work on the server side: buying servers, deciding how many to buy, choosing bandwidth, operating systems, deployment regions, environments, software, and so on. After that, the health of these servers also had to be monitored and tracked in real time.

In the Serverless architecture, developers only need to care about their business code. Throughout the development, launch, and maintenance of a project, users do not need to pay attention to server-level maintenance, nor do they need to invest operations resources to handle traffic peaks and troughs; the cloud vendor is responsible for all of that. At the same time, under the Serverless architecture, users do not pay extra for idle resources.

Serverless advantages

The Serverless architecture is characterized by zero server operations and no compute cost when idle. Its delivery philosophy can be summed up as leaving the complexity to cloud vendors and the convenience to developers. To sum up, the advantages of Serverless are as follows:

1) Reduce cost and improve efficiency

Cloud vendors take over server management and operations and provide BaaS services such as databases and object storage, allowing users to focus on their own business logic. This improves development efficiency and shortens a project's innovation cycle. At the same time, Serverless users no longer need to worry about server operations or infrastructure maintenance, let alone pay extra for that work, so they bear far less operations cost. The Serverless architecture also provides a more complete, comprehensive pay-as-you-go model: users pay only for the resources they actually use, which gives the Serverless architecture a clear advantage in this area.

  • Reduce operation and maintenance costs
  • Reduce labor costs
  • Improve R&D efficiency
  • Shorten the innovation cycle
  • Pay as you go, reducing expenditure

2) Safe, convenient and reliable

Leave professional work to professional people: the Serverless architecture hands server operations, maintenance, and security-related work over to cloud vendors, which greatly improves the overall security of a project. At the same time, a Serverless architecture is significantly simpler than other architectures: because more BaaS services are provided by cloud vendors, users manage fewer components, which means Serverless users can manage their projects more easily and conveniently. In addition, the Serverless architecture scales elastically and automatically, so a project scales out when traffic increases and scales in when traffic decreases, ensuring the security and stability of the whole business. With professional teams guaranteeing security and performance, the Serverless architecture enables:

  • Lower security risk
  • Lower resource overhead
  • In line with “green” computing ideas
  • More convenient management
  • Elastic scaling, more reliable services

In general, handing this work to cloud vendors not only greatly improves the overall security and stability of a project; it also makes the Serverless architecture significantly simpler than other architectures.

Challenges

Although the Serverless architecture has been around for many years, it has only recently entered its breakout year and developed rapidly in a very short time. So although the Serverless architecture has many advantages, it also faces difficulties and challenges, including but not limited to serious cold start problems, immature development tools, and vendor lock-in. However, in recent years the popularity of the Serverless architecture has kept rising: people have high expectations for it, and vendors have increased their investment, believing that the current problems are temporary and that the Serverless architecture will evolve in a better and easier-to-use direction.

The Serverless architecture provides users with a new programming paradigm. At the same time, as users enjoy the first wave of technology dividends brought by Serverless, its disadvantages are gradually being exposed; the cold start of functions, for example, is now a serious and widely discussed problem. Because the Serverless architecture scales elastically, the Serverless provider increases or decreases instances based on the traffic fluctuations of the user's service, as shown in the diagram.

Take Alibaba Cloud Function Compute as an example. When the system receives the first event that triggers a function, it starts a container to run the code. If a new event arrives while the first container is still processing the previous event, the platform launches a second code instance to handle it, and this automatic, zero-management scaling of the Serverless architecture continues until there are enough code instances to handle the entire workload. Cold starts are not triggered only under concurrency, however; they also occur when the interval between two invocations exceeds the instance release threshold, as shown in the figure below.

This is where the issue arises. When a new request or event comes in, broadly speaking one of two situations occurs:

  1. There are idle instances that can be reused directly: hot start.
  2. There are no idle instances that can be reused directly: cold start.

When a function is executed locally, the environment is usually already prepared, and each call only needs to execute the function's corresponding method. This is not the case under a Serverless architecture; the difference between local and FaaS function calls is shown in the figure below.
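One practical consequence of the warm/cold distinction is that work done at initialization scope is paid only on a cold start, while work inside the handler is paid on every invocation. The sketch below is a simplified, vendor-neutral illustration of that split; the CONFIG value simply stands in for expensive setup such as loading configuration, connection pools, or models.

```python
import time

# Runs once per container, i.e. only on a cold start. Expensive setup
# (configuration, connection pools, model loading) belongs here so that
# warm invocations can reuse it.
START = time.time()
CONFIG = {"greeting": "hello"}  # stand-in for expensive initialization

def handler(event, context):
    # Runs on every invocation, warm or cold.
    container_age = time.time() - START
    # Near zero on a cold start; grows on warm starts, showing that the
    # initialization above was reused rather than repeated.
    return {"greeting": CONFIG["greeting"], "container_age_seconds": container_age}
```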

III. Typical application scenarios

The Serverless architecture has been developing for several years, and best practices have accumulated in many fields. The CNCF has summarized some scenarios that are well suited to the Serverless architecture.

Real-time file processing

In video applications, social applications, and similar scenarios, the images, audio, and video uploaded by users tend to arrive in large volumes and at high frequency, which places high demands on the real-time performance and concurrency of the processing system. For example, multiple functions can be used to process images uploaded by users, covering tasks such as image compression and format conversion, to meet the requirements of different scenarios.
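As a hedged sketch of this pattern, the function below produces a compressed JPEG thumbnail for an uploaded image. It assumes the Pillow library is packaged with the function; the trigger payload layout and the download_object/upload_object helpers are stand-ins for a real object-storage SDK (here they just use the local filesystem).

```python
import io
import json
import pathlib
from PIL import Image  # assumes Pillow is packaged with the function


def download_object(bucket: str, key: str) -> bytes:
    # Stand-in for the object-storage SDK call; here we just read a local file.
    return pathlib.Path(bucket, key).read_bytes()


def upload_object(bucket: str, key: str, data: bytes) -> None:
    # Stand-in for the object-storage SDK call; here we just write a local file.
    target = pathlib.Path(bucket, key)
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(data)


def handler(event, context):
    # Hypothetical trigger payload; real object-storage events differ by vendor.
    record = json.loads(event)["records"][0]
    bucket, key = record["bucket"], record["object_key"]

    image = Image.open(io.BytesIO(download_object(bucket, key)))
    image.thumbnail((1024, 1024))  # compress by bounding the resolution

    buffer = io.BytesIO()
    image.convert("RGB").save(buffer, format="JPEG", quality=80)  # convert format and compress
    upload_object(bucket, f"thumbnails/{key}", buffer.getvalue())
    return {"processed": key, "output": f"thumbnails/{key}"}
```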

ETL data processing

Traditionally, processing big data means setting up a framework such as Hadoop or Spark and operating a data processing cluster. With Serverless technology, you only need to continuously write the incoming data to object storage, use object-storage triggers to invoke a splitting function that divides the data or tasks, and then call the related processing functions. When processing is complete, the results are stored in a cloud database.

The nearly unlimited capacity of function compute makes it easy for users to process large amounts of data. Using the Serverless architecture, multiple mapper and reducer functions can be executed concurrently over the source data to complete the work in a short time; the whole process can be simplified as shown in the following figure. Compared with the traditional way of working, the Serverless architecture avoids wasting idle resources and thus saves costs.
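The sketch below imitates that fan-out locally: each mapper call stands in for one concurrently triggered function instance, and the reducer merges the partial results. It only illustrates the pattern; on a real FaaS platform the mappers would be separate function invocations reading their chunks from object storage.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor


def mapper(chunk: list[str]) -> Counter:
    # Stands in for one function invocation: count words in one slice of the data.
    counts = Counter()
    for line in chunk:
        counts.update(line.split())
    return counts


def reducer(partials: list[Counter]) -> Counter:
    # Merge the partial results produced by the mappers.
    total = Counter()
    for part in partials:
        total.update(part)
    return total


if __name__ == "__main__":
    lines = ["serverless is elastic", "serverless is pay as you go", "faas is serverless"]
    chunks = [lines[i::2] for i in range(2)]   # split the source data into two tasks
    with ThreadPoolExecutor() as pool:         # local stand-in for concurrent invocations
        partials = list(pool.map(mapper, chunks))
    print(reducer(partials).most_common(3))
```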

Real-time data processing

With the rich event sources and event-triggering mechanism supported by the Serverless architecture, data can be processed in real time with a few lines of code and some simple configuration, for example decompressing archives in object storage, cleaning data in logs or databases, or custom consumption of MNS messages.
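As a small illustration of event-driven cleaning, the handler below drops malformed records from a batch of log lines carried in the event. The event layout is hypothetical, and in a real function the cleaned records would be forwarded to downstream storage or a message topic rather than merely counted.

```python
import json


def handler(event, context):
    # Hypothetical event layout: a batch of raw log lines pushed by a trigger
    # (for example a log service or message queue); real payloads differ by vendor.
    lines = json.loads(event)["log_lines"]

    cleaned = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):  # drop blanks and comments
            continue
        try:
            cleaned.append(json.loads(line))  # keep only well-formed JSON records
        except json.JSONDecodeError:
            continue
    # In a real function the cleaned records would be written to a database,
    # object storage, or another message topic.
    return {"kept": len(cleaned), "dropped": len(lines) - len(cleaned)}
```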

Machine learning (AI inference prediction)

After an AI model has been trained, the Serverless architecture can be used to serve inference externally: the model is wrapped in a function, and the code runs when an actual user request arrives. Compared with traditional inference serving, the advantage of this approach is that the function module, the backend GPU servers, and any other machine learning services connected to the function can all be paid for by usage and scaled automatically, ensuring performance and service stability.
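A minimal sketch of the "wrap the model in a function" idea: a (hypothetical) pickled model is loaded once at initialization so that warm invocations pay only for the prediction itself. The model path, the pickle format, and the scikit-learn-style predict() call are all assumptions made for illustration.

```python
import json
import pathlib
import pickle

MODEL_PATH = pathlib.Path("/code/model.pkl")  # hypothetical path, packaged with the function

# Loaded once per container (i.e. only on a cold start), so that warm
# invocations reuse the already-deserialized model.
with MODEL_PATH.open("rb") as f:
    MODEL = pickle.load(f)


def handler(event, context):
    features = json.loads(event)["features"]  # hypothetical request layout
    prediction = MODEL.predict([features])    # assumes a scikit-learn-style predict()
    return json.dumps({"prediction": float(prediction[0])})  # assumes a numeric output
```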

Web application/mobile application backend

The Serverless architecture, combined with other cloud offerings from cloud vendors, lets developers build mobile or Web applications that scale flexibly. Rich Serverless backends can easily be created that run across multiple data centers with high availability, without any management work for scalability or backup redundancy.
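As a hedged sketch of such a backend, a single function can route HTTP-style requests itself. The event fields (path, httpMethod) and the statusCode/body response shape are a simplification of what API-gateway-style triggers typically deliver and expect; real payloads differ by vendor.

```python
import json


def handler(event, context):
    # Hypothetical API-gateway-style event; real trigger payloads differ by vendor.
    request = json.loads(event)
    path, method = request.get("path", "/"), request.get("httpMethod", "GET")

    if method == "GET" and path == "/hello":
        status, body = 200, {"message": "hello from a serverless backend"}
    else:
        status, body = 404, {"error": "not found"}

    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```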

Audio and video transcoding

In video and social application scenarios, the videos uploaded by users usually need to be transcoded into different definitions. By combining Serverless technology with object-storage products, object-storage triggers can be used: when a user uploads a video to object storage, the upload triggers the Serverless compute platform (the FaaS platform) to process the video, and the result is written back to object storage. Other users can then play the transcoded video and choose among the different definitions, as shown in the figure.
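A hedged sketch of the transcoding step itself: the function shells out to ffmpeg (assumed to be available in the runtime or packaged with the function) to produce a 720p rendition. The event layout is hypothetical, and downloading the source from and uploading the result back to object storage are elided.

```python
import json
import pathlib
import subprocess


def handler(event, context):
    # Hypothetical payload: the local path of the already-downloaded source video.
    source = pathlib.Path(json.loads(event)["local_path"])
    target = source.with_name(f"{source.stem}_720p.mp4")

    # Use ffmpeg (assumed available in the runtime) to produce a 720p rendition;
    # other definitions would simply change the scale filter.
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(source), "-vf", "scale=-2:720", "-c:a", "copy", str(target)],
        check=True,
    )
    # In a real function the result would be uploaded back to object storage.
    return {"transcoded": str(target)}
```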

To sum up, the typical application scenarios of the Serverless architecture are largely determined by its characteristics. Over time, the Serverless architecture will continue to evolve, with its strengths becoming more prominent and its weaknesses being offset.

IV. Introduction to Serverless Products

Public Cloud Products

In the years since the concept of Serverless was first proposed, it has gone from obscurity to prominence, and the Serverless architecture has grown rapidly. In the Serverless architecture, compute services are usually provided by FaaS platforms; AWS Lambda, Google Cloud Functions, and Alibaba Cloud Function Compute are all representative industrial products.

Alibaba Cloud Serverless

The Serverless architecture places high demands on the underlying technical foundation. As the figure shows, Alibaba Cloud's overall Serverless offering is quite complete, and it has kept improving along the road of self-built, self-developed infrastructure. Alibaba Cloud Serverless products have already been put into production across the Alibaba economy and are well reflected in Taobao, Alipay, Xianyu, Fliggy, DingTalk, and Yuque.

Function Compute is the Serverless product with the most complete ecosystem and richest functionality in China, making one-stop cloud migration and one-click Serverless a reality for developers. The figure above analyzes Alibaba Cloud Serverless from the product dimension, while the figure below shows the typical business capabilities of Serverless at the function and architecture, underlying infrastructure, and compute levels, for example elastic scaling, load balancing, flow control, highly available deployment, versioned canary releases, and fault recovery. Take container image support at runtime as an example: Alibaba Cloud launched it last year, and AWS, Tencent, and others have since launched container image support as well. The figure explains how Alibaba Cloud Serverless is built up layer by layer from the bottom so that developers can use Serverless more simply and conveniently.

Open source products

Not only are many vendors in industry working on the Serverless architecture; there are also many excellent Serverless projects in the open-source space. Many excellent open-source FaaS platforms, including OpenWhisk, Fission, Knative, and Kubeless, have been recognized by the CNCF.

Lecturer profile: Liu Yu (Jiang Yu), PhD candidate in electronic information at the National University of Defense Technology, Alibaba Cloud Serverless product manager, Alibaba Cloud Serverless evangelist, and distinguished lecturer at CIO College.

This article is compiled from the September 22 midday session of the Serverless Live series.

Live replay link: Serverless from zero to advanced master class – Alibaba Cloud Developer Community


This article is the original content of Aliyun and shall not be reproduced without permission.