1. Install ES
1.1 Downloading an ES Image
docker pull elasticsearch:7.6.1
1.2 Mount Directory
mkdir -p /usr/local/docker/elk/es/data
mkdir -p /usr/local/docker/elk/es/logs
mkdir -p /usr/local/docker/elk/es/config
chmod a+w /usr/local/docker/elk/es/data
chmod a+w /usr/local/docker/elk/es/logs
chmod a+w /usr/local/docker/elk/es/config
Create a new elasticsearch.yml file in the /usr/local/docker/elk/es/config directory:
cluster.name: my-application
network.host: 0.0.0.0
http.port: 9200
# Enable ES cross-origin (CORS) access
http.cors.enabled: true
http.cors.allow-origin: "*"
http.cors.allow-headers: Authorization
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
1.3 Running ES
docker run -d --name elasticsearch \
-p 9200:9200 -p 9300:9300 \
-v /usr/local/docker/elk/es/data:/usr/share/elasticsearch/data \
-v /usr/local/docker/elk/es/logs:/usr/share/elasticsearch/logs \
-v /usr/local/docker/elk/es/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml \
-e "discovery.type=single-node"\ elasticsearch: 7.6.1Copy the code
Run the following commands inside the Elasticsearch container.
Enter the container:
docker exec -it elasticsearch /bin/bash
Set passwords for the built-in users (press y to confirm, then enter a password for each user):
elasticsearch-setup-passwords interactive
Access ES and log in with the elastic user and the password you just set.
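A quick way to verify access (a minimal check; replace the password placeholder with the one you just set, and adjust the host if ES is not running locally):
curl -u elastic:<your-password> http://192.168.0.103:9200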
2. Install Kibana
2.1 Downloading a Kibana Image
docker pull kibana:7.6.1
2.2 Mount Directory
mkdir -p /usr/local/docker/elk/kibana/config
chmod a+w /usr/local/docker/elk/kibana/config
Create a new kibana.yml file in the /usr/local/docker/elk/kibana/config directory:
server.host: 0.0.0.0
server.port: 5601
elasticsearch.hosts: ["http://192.168.0.103:9200"]
elasticsearch.username: "kibana"
elasticsearch.password: "123456"
# The default locale is "en"; to display the Kibana UI in Chinese, uncomment the following line
#i18n.locale: "zh-CN"
2.3 Running Kibana
docker run -d --name kibana \
-p 5601:5601 \
-v /usr/local/docker/elk/kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml \
kibana:7.6.1
Access Kibana and log in with the elastic user.
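To quickly confirm that Kibana is serving before opening it in the browser, a simple check (assuming the same host as in the config above):
curl -I http://192.168.0.103:5601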
3. Install Logstash
3.1 Downloading a Logstash Image
docker pull logstash:7.6.1
3.2 Mount Directory
mkdir -p /usr/local/docker/elk/logstash/config
chmod a+w /usr/local/docker/elk/logstash/config
Create a new logstash.conf file in the /usr/local/docker/elk/logstash/config directory:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{JAVALOGMESSAGE:msg}" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601"]
    target => "@timestamp"
  }
}
output {
  if "ERROR" in [message] {
    elasticsearch {
      # Note: with X-Pack security enabled, the elasticsearch output may also need user/password settings
      hosts => ["192.168.0.103:9200"]
      index => "open-web-error"
      template_overwrite => true
    }
  } else {
    elasticsearch {
      hosts => ["192.168.0.103:9200"]
      index => "open-web"
      template_overwrite => true
    }
  }
}
Here Filebeat is used as the Logstash input, and logs are written to different ES indexes depending on the log level.
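For reference, log lines shaped like the following (hypothetical examples, not taken from the project) match the grok pattern above; the ERROR line would be routed to the open-web-error index and the INFO line to open-web:
2020-03-01 12:00:00,123 ERROR Failed to connect to database
2020-03-01 12:00:01,456 INFO Application started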
Create a new logstash.yml file in the /usr/local/docker/elk/logstash/config directory:
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: "123456"
xpack.monitoring.elasticsearch.hosts: ["http://192.168.0.103:9200"]
3.3 Running Logstash
docker run -it -d -p 4560:4560 -p 5044:5044 --name logstash \
-v /usr/local/docker/elk/logstash/config/logstash.conf:/usr/share/logstash/pipeline/logstash.conf \
-v /usr/local/docker/elk/logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml \
logstash:7.6.1
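To confirm that the pipeline loaded without errors, you can tail the container logs with the standard Docker CLI:
docker logs -f logstash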
4. Download Filebeat
I start Filebeat locally to monitor the Spring Boot project's log file and send it to Logstash, which parses it and stores it in ES.
Filebeat download address
Select the product and version to download
4.1 Configuring filebeat.yml
Configure the location of the log file to monitor and the Logstash output address, as sketched below.
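A minimal filebeat.yml sketch, assuming the Spring Boot project writes its logs to a local logs directory (the path below is a placeholder; point it at your own log files) and Logstash is reachable at the address used earlier:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - D:\project\open-web\logs\*.log   # placeholder: path to the Spring Boot log files
output.logstash:
  hosts: ["192.168.0.103:5044"]          # the beats input port from logstash.conf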
4.2 Starting Filebeat
.\filebeat.exe -e -c .\filebeat.yml
Start the Spring Boot project to generate log files.
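For the grok pattern in logstash.conf to match, the file log format should be "timestamp level message". One way to configure that in application.yml (a hedged sketch; the file name is a placeholder and must match the path Filebeat watches):
logging:
  file:
    name: logs/open-web.log
  pattern:
    file: "%d{yyyy-MM-dd HH:mm:ss,SSS} %level %msg%n"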
5. Enter Kibana
Viewing Log Information
Click Management
Click Index Patterns and then Create Index Pattern
Select the index you want to view and click Next Step
Click Create Index Pattern
To view logs, search for related fields in the search bar
If you are not used to the default time format, you can change it in Kibana's settings, as sketched below.
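For example, the date display format can be changed under Management → Advanced Settings by editing the dateFormat setting, which takes a Moment.js format string (an assumption based on standard Kibana behavior, not specific to this setup):
dateFormat: YYYY-MM-DD HH:mm:ss.SSS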