0. Architecture Introduction

This article simulates an online real-time stream, such as a user operation log. The data will be processed after it is collected; for now, only the collection step is considered. The stack is HTML + jQuery + Nginx + ngx_kafka_module + Kafka, where ngx_kafka_module is an open-source module that connects Nginx directly to Kafka.

1. Requirement description

1.1 Use HTML and jQuery to simulate user request logging

These include the following items:

User ID: user_id; access time: act_time; action: action (one of click, job_collect, cv_send, cv_upload)

Enterprise code: job_code
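The four fields above make up the payload the page will later send to Nginx. A minimal sketch of building such a record in plain JavaScript (the field names come from the requirement; the sample values are placeholders, not from the original post):

```javascript
// Build one log event with the four required fields.
// userId and jobCode values are caller-supplied placeholders here.
function buildLogEvent(userId, action, jobCode) {
  return {
    user_id: userId,
    act_time: new Date().toISOString(), // the post later uses its own formatter
    action: action,                      // one of: click, job_collect, cv_send, cv_upload
    job_code: jobCode
  };
}

// Example: a "job_collect" event
var evt = buildLogEvent('u_donald', 'job_collect', 'donald');
console.log(JSON.stringify(evt));
```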

1.2 Accept requests in 1.1 using Nginx

1.3 After Nginx receives a request, use ngx_kafka_module to send the data to the tp_individual topic in Kafka.

1.4 Use a consumer to consume the tp_individual topic in Kafka and observe the incoming data.
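Since the page sends JSON, one easy sanity check on what the consumer prints is to validate each record against the schema from section 1.1. A sketch of such a check (the helper name and sample record are mine, not from the post):

```javascript
var ALLOWED_ACTIONS = ['click', 'job_collect', 'cv_send', 'cv_upload'];

// Returns true if a consumed record matches the schema from section 1.1.
function isValidLogRecord(raw) {
  var rec;
  try {
    rec = JSON.parse(raw);
  } catch (e) {
    return false; // not valid JSON
  }
  return typeof rec.user_id === 'string' &&
         typeof rec.act_time === 'string' &&
         ALLOWED_ACTIONS.indexOf(rec.action) !== -1 &&
         typeof rec.job_code === 'string';
}

var sample = '{"user_id":"u_donald","act_time":"2021-3-5 9:4:7","action":"cv_send","job_code":"donald"}';
console.log(isValidLogRecord(sample)); // true
```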

2. Setup procedure

2.1 Kafka installation

Since a pre-built Docker Kafka image is already installed, it can be started directly.

2.2 Installing and starting Nginx

```shell
# 1. Build and install librdkafka (required by ngx_kafka_module)
cd /usr/local/src
git clone [email protected]:edenhill/librdkafka.git
cd librdkafka
yum install -y gcc gcc-c++ pcre-devel zlib-devel
./configure
make && make install

# 2. Install Nginx build dependencies
yum -y install make zlib-devel gcc-c++ libtool openssl openssl-devel

# 3. Download and unpack Nginx
cd /opt/hoult/software
wget http://nginx.org/download/nginx-1.18.0.tar.gz
tar -zxf nginx-1.18.0.tar.gz -C /opt/hoult/servers

# 4. Compile Nginx with ngx_kafka_module
cd /opt/hoult/software
git clone [email protected]:brg-liuwei/ngx_kafka_module.git
cd /opt/hoult/servers/nginx-1.18.0
./configure --add-module=/opt/hoult/software/ngx_kafka_module/
make && make install

# 5. Delete the Nginx source archive
rm /opt/hoult/software/nginx-1.18.0.tar.gz

# 6. Start Nginx
cd /opt/hoult/servers/nginx-1.18.0
nginx
```

3. Configure related parameters

3.1 Nginx Configure nginx.conf

```nginx
#pid        logs/nginx.pid;

events {
    worker_connections  1024;
}

http {
    include       mime.types;
    default_type  application/octet-stream;

    #access_log  logs/access.log  main;

    sendfile           on;
    #tcp_nopush        on;
    keepalive_timeout  65;
    #gzip  on;

    # ------------ kafka-related configuration start ------------
    kafka;
    kafka_broker_list linux121:9092;

    server {
        listen       9090;
        server_name  localhost;

        #access_log  logs/host.access.log  main;

        location = /kafka/log {
            # CORS configuration so the page can POST across origins
            add_header 'Access-Control-Allow-Origin' $http_origin;
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';

            kafka_topic tp_individual;
        }

        #error_page  404  /404.html;
    }
}
```

3.2 Start Kafka producers and consumers

```shell
# Create the topic
kafka-topics.sh --zookeeper linux121:2181/myKafka --create \
  --topic tp_individual --partitions 1 --replication-factor 1

# Start a console consumer
kafka-console-consumer.sh --bootstrap-server linux121:9092 --topic tp_individual

# Start a console producer (for testing)
kafka-console-producer.sh --broker-list linux121:9092 --topic tp_individual
```

3.3 Writing Html + Jquery code

```html
<!DOCTYPE html>
<html>
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
    <title>index</title>
    <!-- jQuery CDN; any other source works too -->
    <script src="https://cdn.bootcdn.net/ajax/libs/jquery/3.5.1/jquery.js"></script>
</head>
<body>
    <input id="click" type="button" value="click" onclick="operate('click')" />
    <input id="collect" type="button" value="collect" onclick="operate('job_collect')" />
    <input id="send" type="button" value="send resume" onclick="operate('cv_send')" />
    <input id="upload" type="button" value="upload CV" onclick="operate('cv_upload')" />
</body>
<script>
    function operate(action) {
        var json = {
            'user_id': 'u_donald',
            'act_time': current().toString(),
            'action': action,
            'job_code': 'donald'
        };
        $.ajax({
            url: "http://linux121:8437/kafka/log",
            type: "POST",
            crossDomain: true,
            data: JSON.stringify(json),
            // allow cross-domain cookies to be sent
            xhrFields: { withCredentials: true },
            success: function (data, status, xhr) {
                // console.log("sent: " + action);
            },
            error: function (err) {
                // console.log(err.responseText);
            }
        });
    }

    function current() {
        var d = new Date(), str = '';
        str += d.getFullYear() + '-';
        str += d.getMonth() + 1 + '-';
        str += d.getDate() + ' ';
        str += d.getHours() + ':';
        str += d.getMinutes() + ':';
        str += d.getSeconds();
        return str;
    }
</script>
</html>
```
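Note that `current()` above does not zero-pad, so it can emit timestamps like `2021-3-5 9:4:7`. If consistent field widths matter downstream, a padded variant could look like this (my suggestion, not part of the original post):

```javascript
// Left-pad a number to two digits.
function pad2(n) {
  return (n < 10 ? '0' : '') + n;
}

// Zero-padded variant of current(): yyyy-MM-dd HH:mm:ss
function currentPadded(d) {
  d = d || new Date();
  return d.getFullYear() + '-' +
         pad2(d.getMonth() + 1) + '-' +
         pad2(d.getDate()) + ' ' +
         pad2(d.getHours()) + ':' +
         pad2(d.getMinutes()) + ':' +
         pad2(d.getSeconds());
}

console.log(currentPadded(new Date(2021, 2, 5, 9, 4, 7))); // "2021-03-05 09:04:07"
```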

Put a.html in the Nginx html directory and access 192.168.18.128:9090 in a browser.

4. Demonstrate

  • 4.1 Start the ZooKeeper cluster and the Kafka cluster
  • 4.2 Create the topic, then start a consumer and a producer to test the topic
  • 4.3 Start Nginx, open the page, click the buttons, and observe the consumer output

The whole process is shown below:


Author: Hoult, Wu Xie

Link: juejin.cn/post/690304… Source: Juejin