Edge computing is a computing service located close to the device or data source. Applications run at the edge, producing faster network responses and meeting the industry's basic needs for real-time business, application intelligence, security, and privacy protection. In this article, the author introduces some methods for real-time data processing on CDN nodes that are worth architects' attention.

CDN: Content Delivery Network

So let's first define what a CDN is. A content delivery network (CDN) is a system of distributed servers that delivers web content to users based on their geographic location and the origin of the content (usually web pages). But in the Internet age, a CDN is no longer just for distributing web content. In the case of Cloudflare Workers [1], in addition to using its network to distribute content, you can even deploy and run your own code on its edge nodes: "You can deploy and run JavaScript code, decoupling it from the end-user device, programmatically enabling routing, filtering, and so on." In this era of explosive Internet growth, high scalability is a critical capability, and CDNs and edge computing will continue to converge.

Real-time data acquisition: push and pull

At present, many applications that emphasize real-time behavior need both to push and to pull data. Passive push and active pull are each common, simple engineering problems. For example, historical data can be pulled from the CDN during application initialization, and update data can then be pushed by other services. But let's see whether these two mechanisms can be combined.

Connecting Fastly and Fanout via a proxy

Fastly is an edge cloud platform that lets applications execute and serve at the edge of the network. In essence, it provides a highly scalable "pull and respond" service that listens and responds to user requests in real time. Compared with a traditional CDN, Fastly not only caches static content but can also deploy and run application logic. Fanout, on the other hand, is a highly scalable data push service: essentially a high-performance reverse proxy that pushes data to clients in real time over long-lived connections. Fastly and Fanout can be used together. Collectively, they act as a reverse proxy for the source server, with Fanout proxying traffic to Fastly, so that the client never has to request your source server directly. This brings several benefits:

  • High availability and scalability, of course

  • Caching of initial data

  • Caching of Fanout instructions. Note that some of Fanout's behavior is configured by instructions, such as the transport mode and the subscription channel. Typically, these instructions are carried in the source server's response (a special class of headers known as GRIP). Fastly can cache these instructions along with the response.

Mapping network flows

By using Fanout and Fastly together, we can reconstruct the network data flow in this push-pull model. Here's a closer look at how it works. Suppose we have an HTTP endpoint called /stream that returns some initial data and pushes new data to connected clients as it is generated. With Fanout, we can make this endpoint return a response carrying instructions (as response headers, for example):

HTTP/1.1 200 OK
Content-Type: text/plain
Content-Length: 29
Grip-Hold: stream
Grip-Channel: updates

{"data": "current value"}
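As a rough sketch, the origin side of this endpoint only has to attach the two GRIP headers to its normal response. The helper below is framework-agnostic and hypothetical (the demo's real backend is a Django app); any web framework that lets you set response headers works the same way:

```python
# Hypothetical origin-side helper for the /stream endpoint: it builds the
# body plus the GRIP instruction headers that tell Fanout to hold the
# client connection open and subscribe it to the "updates" channel.
import json


def stream_response(current_value):
    body = json.dumps({"data": current_value})
    headers = {
        "Content-Type": "text/plain",
        "Grip-Hold": "stream",      # keep the client connection open (streaming)
        "Grip-Channel": "updates",  # channel Fanout subscribes this client to
    }
    return headers, body
```

In a Django view, these headers would simply be set on the HttpResponse object before returning it.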

When Fanout receives such a response from the source server, it converts it into an HTTP streaming response:

HTTP/1.1 200 OK
Content-Type: text/plain
Transfer-Encoding: chunked
Connection: Transfer-Encoding

{"data": "current value"}

At this point, Fanout's request to the source server is complete, but the client's connection to Fanout remains open, as the following sequence diagram shows:
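From the client's point of view, that open connection delivers the initial data first and pushed updates later on the same stream. A minimal consumer sketch, assuming line-delimited JSON framing (the framing is my assumption, not specified in the article), with an in-memory buffer standing in for the real socket:

```python
# Sketch of a client reading the streaming /stream response: the first
# line is the initial (pulled) data, later lines are pushed updates.
import io
import json


def read_stream(stream):
    """Yield one JSON object per non-empty line as it arrives."""
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)


# A real client would iterate over e.g. urllib.request.urlopen(url);
# here an in-memory buffer simulates the long-lived connection.
simulated = io.BytesIO(
    b'{"data": "current value"}\n'  # initial data, returned immediately
    b'{"data": "new value"}\n'      # pushed later through Fanout
)
events = list(read_stream(simulated))
```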

Because Fanout only makes short-lived request/response calls to the source server, Fastly can answer those calls from its cache while Fanout maintains the long-lived client connections:

This way, when a client requests /stream, the source server does not participate at all:

In other words, Fastly returns to Fanout the same response, with the special headers and the initial data, and Fanout maintains the streaming connection to the client. So far we have only solved the "pull" half; we still need to push new data through Fanout to clients in real time.

Clearing Fastly's cache

When the data on the source server changes, we need to clear Fastly's cache so that it serves fresh data. Concretely, if the /stream endpoint's data changes, we clear Fastly's cache and broadcast the new data through Fanout. The sequence diagram below depicts a more complex scenario, in which existing clients are pushed the new data and then a new client connects:
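The two steps, purge then publish, can be sketched as follows. The URLs, realm, and payload shape here are hypothetical placeholders (Fastly can purge a URL when it receives an HTTP PURGE request, and Fanout accepts GRIP-format publish calls), so treat this as an outline rather than the demo's actual code:

```python
# Outline of the "data changed" path: purge the cached /stream response
# on Fastly, then broadcast the new value to all held-open clients via
# Fanout. URLs and the realm below are hypothetical placeholders.
import json
import urllib.request


def build_publish_item(channel, data):
    """Build a GRIP-style publish payload carrying an HTTP-stream chunk."""
    return {
        "items": [{
            "channel": channel,
            "formats": {
                "http-stream": {"content": json.dumps(data) + "\n"},
            },
        }]
    }


def on_data_changed(new_data):
    # 1. Purge Fastly's cached copy so the next pull sees fresh data.
    purge = urllib.request.Request("https://example.org/stream", method="PURGE")
    urllib.request.urlopen(purge)

    # 2. Publish the update to every client Fanout is holding open.
    payload = json.dumps(build_publish_item("updates", new_data)).encode()
    publish = urllib.request.Request(
        "https://api.fanout.io/realm/REALM/publish/",  # placeholder realm
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(publish)
```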

Efficient flow control

In this hybrid architecture, the ideal data read and write model for efficiency is:

  • Data access: several reads per second

  • Data updates: several writes per minute

  • Data distribution: millisecond delivery
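A purge throttle that holds writes to roughly this profile can coalesce bursts of updates, so the cache is purged at most once per interval while reads keep hitting the cache. A minimal sketch (the class and interval value are illustrative, not from the demo):

```python
# Sketch of the flow-control idea: allow at most one purge per interval;
# updates that arrive while throttled are coalesced into the next purge.
import time


class PurgeThrottle:
    """Permit one purge per `interval` seconds; extra updates are
    remembered as pending and flushed by the next allowed purge."""

    def __init__(self, interval=5.0, clock=time.monotonic):
        self.interval = interval
        self.clock = clock          # injectable for testing
        self._last = float("-inf")
        self.pending = False        # an update arrived while throttled

    def should_purge(self):
        now = self.clock()
        if now - self._last >= self.interval:
            self._last = now
            self.pending = False
            return True
        self.pending = True         # remember to purge later
        return False
```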

If your data changes every second, it is probably best not to purge the cache on every change. During peak times, for example, we can limit the frequency of purges; most read requests are still answered from cached data and pick up updates slightly later.

Demo

The demo application, hosted on GitHub, uses Fastly and Fanout to provide a live counter service. A request goes first to Fanout, then to Fastly, and finally to a backend server implemented in Django. The service implements simple counter logic: when a counter value is updated, Fastly's cache is purged and the new value is broadcast via Fanout. The purge and update processes are throttled by flow control to maximize cache efficiency.

Imagine

We can design a message content distribution network consisting of several geographically distributed server clusters that provide both dynamic and static content distribution in near real time. This new type of CDN extends data processing to the edge of the network, regardless of where the application's own source services are located. This opens up huge possibilities for mobile applications and IoT.

Original English article:

https://hackernoon.com/powering-your-app-with-a-realtime-messaging-cdn-13d92a6df5f3