Installing and deploying ELK with Docker, and integrating Spring Boot for log collection


Download the Docker image

```shell
docker pull elasticsearch:7.6.2
docker pull logstash:7.6.2
docker pull kibana:7.6.2
```

Create the directories for the mounted configuration files

```shell
mkdir /root/data/elk && cd /root/data/elk
mkdir elasticsearch
mkdir logstash
```

Create the logstash-springboot.conf file and upload it to the logstash mount directory

Contents of logstash-springboot.conf:

```conf
input {
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 4560
    codec => json_lines
  }
}
output {
  #stdout {
  #  codec => rubydebug
  #}
  elasticsearch {
    hosts => ["es:9200"]
    index => "springboot-logstash-%{[appname]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
```
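To make the index pattern concrete, here is a small Java sketch (my own illustration, not part of the setup; `IndexNameDemo` and `indexName` are hypothetical names) of how `springboot-logstash-%{[appname]}-%{+YYYY.MM.dd}` expands: `%{[appname]}` is the custom field the application attaches to each event, and `%{+YYYY.MM.dd}` is the event date, so each application gets one index per day.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class IndexNameDemo {
    // Expands the index pattern the way Logstash does:
    // %{[appname]} -> the event's appname field,
    // %{+YYYY.MM.dd} -> the event date (java.time spells it "yyyy.MM.dd").
    static String indexName(String appname, LocalDate date) {
        return "springboot-logstash-" + appname + "-"
                + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        // prints springboot-logstash-admin-2024.01.15
        System.out.println(indexName("admin", LocalDate.of(2024, 1, 15)));
    }
}
```

Daily indices like these are what you will later select with an index pattern in Kibana.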

Start the ELK services with a docker-compose.yml script

Contents of docker-compose.yml:

```yaml
version: '3'
services:
  elasticsearch:
    image: elasticsearch:7.6.2
    container_name: elasticsearch
    environment:
      - "cluster.name=elasticsearch" # set the cluster name to elasticsearch
      - "discovery.type=single-node" # start in single-node mode
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m" # limit the JVM heap size
    volumes:
      - /root/data/elk/elasticsearch/plugins:/usr/share/elasticsearch/plugins # plugin file mount
      - /root/data/elk/elasticsearch/data:/usr/share/elasticsearch/data # data file mount
      # - F:\dev\elk\elasticsearch\x-pack-6.2.4.zip:/usr/share/elasticsearch/templ/x-pack-6.2.4.zip
    ports:
      - 9200:9200
  kibana:
    image: kibana:7.6.2
    container_name: kibana
    links:
      - elasticsearch:es # reach elasticsearch via the "es" hostname
    depends_on:
      - elasticsearch # kibana starts after elasticsearch
    environment:
      - "elasticsearch.hosts=http://es:9200" # address used to reach elasticsearch
    ports:
      - 5601:5601
  logstash:
    image: logstash:7.6.2
    container_name: logstash
    volumes:
      - /root/data/elk/logstash/logstash-springboot.conf:/usr/share/logstash/pipeline/logstash.conf # mount the logstash config file
      - /root/data/elk/logstash/logstash.yml:/usr/share/logstash/config/logstash.yml
    depends_on:
      - elasticsearch # logstash starts after elasticsearch
    links:
      - elasticsearch:es # reach elasticsearch via the "es" hostname
    ports:
      - 4560:4560
```
Upload docker-compose.yml to a temporary directory on the server and run, in that directory:

```shell
docker-compose up -d
```

Install plugins once the services are running

Install Elasticsearch Chinese word segmentation (optional)

To tokenize Chinese text, install the IK Analyzer plugin and restart Elasticsearch.

```shell
# enter the elasticsearch container
docker exec -it elasticsearch /bin/bash
# run inside the container
elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.6.2/elasticsearch-analysis-ik-7.6.2.zip
# exit the container and restart elasticsearch
exit
docker restart elasticsearch
```
Install the json_lines plugin in Logstash
```shell
# enter the logstash container
docker exec -it logstash /bin/bash
# go to the bin directory
cd /bin/
# install the logstash plugin
logstash-plugin install logstash-codec-json_lines
# exit the container
exit
# restart the logstash service
docker restart logstash
```

At this point ELK is installed and deployed; the next step is to integrate it with Spring Boot

Add a Logstash dependency to your project

```xml
<!-- logstash encoder for logback -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.3</version>
</dependency>
```

Note: the version used here is 5.3. With 4.8 my test reported no errors, but Logstash could not forward the logs to Elasticsearch, which cost me quite some time.

Create logback-spring.xml in the resources directory

Contents of logback-spring.xml:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE configuration>
<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>
    <include resource="org/springframework/boot/logging/logback/console-appender.xml"/>
    <!-- application name -->
    <property name="APP_NAME" value="yzcrm-admin"/>
    <!-- log file path -->
    <property name="LOG_FILE_PATH" value="${LOG_FILE:-${LOG_PATH:-${LOG_TEMP:-${java.io.tmpdir:-/tmp}}}/logs}"/>
    <contextName>${APP_NAME}</contextName>
    <!-- appender that writes one log file per day -->
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${LOG_FILE_PATH}/${APP_NAME}-%d{yyyy-MM-dd}.log</fileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>${FILE_LOG_PATTERN}</pattern>
        </encoder>
    </appender>
    <!-- appender that ships logs to logstash -->
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:4560</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder">
            <customFields>{"appname": "admin"}</customFields>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="FILE"/>
        <appender-ref ref="LOGSTASH"/>
    </root>
</configuration>
```
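For reference, a rough Java sketch of the wire format that a TCP appender like LogstashTcpSocketAppender produces: one JSON object per line, terminated by a newline, which is exactly what the json_lines codec on the Logstash side parses. The `JsonLineDemo` class, its `jsonLine` helper, and the reduced field set are my own simplification; the real encoder emits more fields and performs proper JSON escaping.

```java
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.time.OffsetDateTime;

public class JsonLineDemo {
    // Builds one newline-terminated JSON event, mimicking (in simplified form)
    // what the encoder sends over TCP. "appname" matches the customFields
    // entry in logback-spring.xml and drives the index name in Logstash.
    static String jsonLine(String appname, String level, String message) {
        return "{\"@timestamp\":\"" + OffsetDateTime.now() + "\","
                + "\"level\":\"" + level + "\","
                + "\"message\":\"" + message + "\","
                + "\"appname\":\"" + appname + "\"}\n"; // newline ends the event
    }

    public static void main(String[] args) {
        String line = jsonLine("admin", "INFO", "hello elk");
        System.out.print(line);
        // Optionally push the event to the Logstash TCP input directly
        // (skipped gracefully when Logstash is not running):
        try (Socket s = new Socket("localhost", 4560)) {
            OutputStream out = s.getOutputStream();
            out.write(line.getBytes(StandardCharsets.UTF_8));
            out.flush();
        } catch (Exception e) {
            System.out.println("Logstash not reachable: " + e.getMessage());
        }
    }
}
```

This is also a handy way to smoke-test the pipeline before wiring up the application: if a hand-sent line shows up in Kibana, the Logstash and Elasticsearch side is working.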

If the configuration is correct, you can now run your project.

Visit Kibana to view the logs

localhost:5601

If data shows up, ELK and Spring Boot are integrated successfully, and you can happily slack off.