In my previous post “Elastic: Deploying Elastic Stacks with Docker”, I covered in detail how to deploy Elasticsearch and Kibana in Docker. In today’s article, we’ll take a closer look at how to deploy Logstash in Docker.

First, let’s create a directory called docker-logstash. It contains the following files:

$ pwd
/Users/liuxg/data/docker-logstash

$ ls -al
total 16
drwxr-xr-x    5 liuxg  staff   160 May  7 22:13 .
drwxr-xr-x  132 liuxg  staff  4224 May  7 21:58 ..
-rw-r--r--    1 liuxg  staff    29 May  7 21:59 .env
-rw-r--r--    1 liuxg  staff  1039 May  7 22:37 docker-compose.yml
drwxr-xr-x    3 liuxg  staff    96 May  7 22:18 logstash_pipeline

As shown above, the directory contains a .env file. It defines an environment variable that will be used in docker-compose.yml:

$ cat .env
ELASTIC_STACK_VERSION=7.12.1

You can define the version of the Elastic Stack you want by using ELASTIC_STACK_VERSION. Above, we use the latest release, 7.12.1.
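If you want to try a different release, simply edit the value in .env. One optional way to confirm that docker-compose picks the variable up is to render the fully resolved compose file; this verification step is just a suggestion and is not required for the deployment:

docker-compose config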

Next we create a file called docker-compose.yml:

docker-compose.yml

version: '3.7'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:${ELASTIC_STACK_VERSION}
    container_name: es01
    environment:
      - discovery.type=single-node
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata01:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
    networks:
      - elastic

  kibana:
    image: docker.elastic.co/kibana/kibana:${ELASTIC_STACK_VERSION}
    container_name: kibana
    ports: ['5601:5601']
    networks: ['elastic']
    environment:
      - SERVER_NAME=kibana.localhost
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
      - I18N_LOCALE=zh-CN
    depends_on: ['elasticsearch']

  logstash:
    image: logstash:${ELASTIC_STACK_VERSION}
    ports:
      - 5000:5000
    volumes:
      - type: bind
        source: ./logstash_pipeline/
        target: /usr/share/logstash/pipeline
        read_only: true
    networks:
      - elastic

volumes:
  esdata01:
    driver: local

networks:
  elastic:
    driver: bridge

As shown above, it uses Docker to install Elasticsearch, Kibana, and Logstash. For the sake of illustration, we run Elasticsearch as a single node.

For the Logstash installation, we expose port 5000 so that we can use TCP port 5000 in the Logstash configuration file to send data into the Logstash pipeline for processing. At the same time, through the volumes definition, we bind-mount the local logstash_pipeline directory to the /usr/share/logstash/pipeline directory inside the Logstash container. This lets the pipelines defined in logstash_pipeline run automatically after Logstash starts.
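Once we bring the stack up later on, one optional way to confirm that the bind mount worked is to list the pipeline directory inside the running Logstash container (the service name logstash comes from the docker-compose.yml above); it should show the ports.conf file described next:

docker-compose exec logstash ls /usr/share/logstash/pipeline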

Let’s look at the ports.conf file defined in the logstash_pipeline directory:

ports.conf

input {
  tcp {
    port => 5000
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "hello-logstash-docker"
  }
}

This configuration file is very simple. It accepts data on port 5000 and sends it directly to Elasticsearch. For the sake of illustration, I have not used any filter here.
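If you did want to transform the events, a filter block would sit between input and output in the same file. The following is only a sketch of what that might look like, using the standard mutate filter to tag each event and trim whitespace; it is not part of the configuration used in this article:

input {
  tcp {
    port => 5000
  }
}

filter {
  # Tag every event that arrives over TCP and strip surrounding whitespace
  mutate {
    add_tag => ["from_tcp"]
    strip => ["message"]
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "hello-logstash-docker"
  }
}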

Now everything is ready. Let's use the docker-compose.yml file above to bring up the whole stack with the following command:

docker-compose up

If you haven’t downloaded this version of the Elastic Stack before, docker-compose will automatically pull the required images. When all the containers are up and running, we can see:

Logstash is up and running.
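We can also confirm the state of the three services from another terminal; docker-compose ps lists each container and whether it is up:

docker-compose ps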

We can go to Kibana to see all the current indexes:

GET _cat/indices
green open .kibana_7.12.1_001              G4sOvCHwSqG0mgI5bHb3Yg 1 0 15   6   2.1mb   2.1mb
green open .apm-custom-link                fZ14FvQSQoGdgDRaiBL_Cw 1 0  0   0    208b    208b
green open .apm-agent-configuration        rjldKIhSTYqhoknTMUmO2w 1 0  0   0    208b    208b
green open .kibana_task_manager_7.12.1_001 BGSG-dSfSmOLyLk7FK9IGQ 1 0  9 383 237.4kb 237.4kb
green open .kibana-event-log-7.12.1-000001 5dUT5TJPSpOLBV60R9q8nA 1 0 15   0  34.5kb  34.5kb
green open .tasks                          BBEUxgA2QxCsD9LWeklppg 1 0 11   2  68.4kb  68.4kb

From the output above, we can’t see the hello-logstash-docker index yet.

Enter the following command in another terminal:

telnet localhost 5000

Then, we typed some random strings:
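A session might look roughly like the following; the three lines typed here are the same ones that show up in the search results further below:

$ telnet localhost 5000
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Hello, liuxg!
This is so cool and nice
I love to see Logstash working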

We went back to Kibana and looked at all the indexes:

GET _cat/indices
green  open .kibana_7.12.1_001              G4sOvCHwSqG0mgI5bHb3Yg 1 0 24   2   4.2mb   4.2mb
green  open .apm-custom-link                fZ14FvQSQoGdgDRaiBL_Cw 1 0  0   0    208b    208b
green  open .apm-agent-configuration        rjldKIhSTYqhoknTMUmO2w 1 0  0   0    208b    208b
green  open .kibana_task_manager_7.12.1_001 BGSG-dSfSmOLyLk7FK9IGQ 1 0  9 458 233.1kb 233.1kb
yellow open hello-logstash-docker           ENOQrJYZSM2JlhtV4yvJLg 1 1  3   0  16.3kb  16.3kb
green  open .kibana-event-log-7.12.1-000001 5dUT5TJPSpOLBV60R9q8nA 1 0 15   0  34.5kb  34.5kb
green  open .tasks                          BBEUxgA2QxCsD9LWeklppg 1 0 11   2  68.4kb  68.4kb

It shows that a new index, hello-logstash-docker, has been generated. We can use the following command to view its contents:

GET hello-logstash-docker/_search

The command above will return:

{ "took" : 1, "timed_out" : false, "_shards" : { "total" : 1, "successful" : 1, "skipped" : 0, "failed" : 0 }, "hits" : {" total ": {" value" : 3, "base" : "eq"}, "max_score" : 1.0, "hits" : [{" _index ": "Hello - logstash - the docker _type", "" :" _doc ", "_id" : "CSCvSXkBnY - NEmvD2mLa", "_score" : 1.0, "_source" : {" port ": 59430, "@ timestamp" : "the 2021-05-08 T01:55:31. 938 z", "message" : "Hello, liuxg!" "" "" ", "@ version" : "1", "the host" : "gateway" } }, { "_index" : "hello-logstash-docker", "_type" : "_doc", "_id" : "CCCvSXkBnY-NEmvD2mLZ", "_score" : 1.0, "_source" : {"port" : 59430, "@timestamp" : "2021-05-08t01:55:31.952z ", "message" : """This is so cool and nice """, "@version" : "1", "host" : "gateway" } }, { "_index" : "Hello - logstash - the docker _type", "" :" _doc ", "_id" : "CiCvSXkBnY - NEmvD_GJP", "_score" : 1.0, "_source" : {" port ": 59430, "@timestamp" : "2021-05-08T01:55:40.644z ", "message" : """I love to see Logstash working "", "@version" : "1", "host" : "gateway" } } ] } }Copy the code

We can see that the input has been successfully imported into Elasticsearch.
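As a final sanity check, we could also run a full-text query against the message field. The following is a sketch of a standard match query; any of the words typed above would work as the search term:

GET hello-logstash-docker/_search
{
  "query": {
    "match": {
      "message": "Logstash"
    }
  }
}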